
Unit tests failures after adding dependency on HiveWarehouseConnector

Open samratmitra-0812 opened this issue 6 years ago • 3 comments

In order to enable data-validator on Hadoop 3, a dependency on HiveWarehouseConnector was added. After this change, the unit tests started failing with the following exception:

java.lang.SecurityException: class "org.codehaus.janino.JaninoRuntimeException"'s signer information does not match signer information of other classes in the same package
at java.lang.ClassLoader.checkCerts(ClassLoader.java:898)
at java.lang.ClassLoader.preDefineClass(ClassLoader.java:668)
at java.lang.ClassLoader.defineClass(ClassLoader.java:761)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$.create(GenerateSafeProjection.scala:197)
at org.apache.spark.sql.catalyst.expressions.codegen.GenerateSafeProjection$.create(GenerateSafeProjection.scala:36)
at org.apache.spark.sql.catalyst.expressions.codegen.CodeGenerator.generate(CodeGenerator.scala:1321)
at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$collectFromPlan(Dataset.scala:3272)
at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2484)
at org.apache.spark.sql.Dataset$$anonfun$head$1.apply(Dataset.scala:2484)
at org.apache.spark.sql.Dataset$$anonfun$52.apply(Dataset.scala:3254)
at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3253)
at org.apache.spark.sql.Dataset.head(Dataset.scala:2484)
at org.apache.spark.sql.Dataset.take(Dataset.scala:2698)
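The signer mismatch means two copies of the `org.codehaus.janino` package ended up on the test classpath, at least one of them inside a signed jar. One common workaround is to exclude janino from the connector dependency so that only Spark's own (unsigned) copy is loaded. This is only a sketch: the HWC coordinates and version below are placeholders, not taken from this repository's build.

```scala
// build.sbt -- hypothetical HWC coordinates and version, for illustration only.
libraryDependencies += ("com.hortonworks.hive" % "hive-warehouse-connector-assembly" % "<hwc-version>")
  .exclude("org.codehaus.janino", "janino")
  .exclude("org.codehaus.janino", "commons-compiler")
```

After changing exclusions, `sbt evicted` and `sbt dependencyTree` (with the sbt-dependency-graph plugin) help confirm which janino copy actually remains.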

As suggested in this SO thread, the HiveWarehouseConnector jar was moved to the end of the classpath. After that change, a NoClassDefFoundError showed up instead:

java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.rdd.RDDOperationScope$
[error] sbt.ForkMain$ForkError: java.lang.NoClassDefFoundError: Could not initialize class org.apache.spark.rdd.RDDOperationScope$
[error]     at org.apache.spark.SparkContext.withScope(SparkContext.scala:693)
[error]     at org.apache.spark.SparkContext.parallelize(SparkContext.scala:710)
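The companion object of `RDDOperationScope` builds a Jackson `ObjectMapper` in its static initializer, so `Could not initialize class ... RDDOperationScope$` is often a symptom of an incompatible `jackson-databind` arriving transitively (here, presumably via the HWC assembly). A hedged sketch of pinning Jackson in sbt; the version numbers below match Spark 2.3/2.4-era builds and are assumptions that must be checked against the Spark version actually in use:

```scala
// build.sbt -- pin Jackson to the version your Spark release was built against.
// The 2.6.7.x versions match Spark 2.3/2.4; verify against your Spark POM.
dependencyOverrides ++= Seq(
  "com.fasterxml.jackson.core"    % "jackson-core"         % "2.6.7",
  "com.fasterxml.jackson.core"    % "jackson-databind"     % "2.6.7.1",
  "com.fasterxml.jackson.module" %% "jackson-module-scala" % "2.6.7.1"
)
```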

This looks like a typical jar-hell issue, and it affects only the unit tests: with the test runs skipped, data-validator deployed successfully and ran fine on both Hadoop 2 and Hadoop 3.
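To pin down which artifacts actually clash, a small helper (hypothetical, not part of data-validator) can list class files that appear in more than one jar on the test classpath; a class duplicated across a signed and an unsigned jar is exactly what triggers the `SecurityException` above.

```scala
import java.util.zip.ZipFile
import scala.collection.JavaConverters._

object JarClashCheck {
  /** Class-file entries present in both jars. Duplicates across a signed
    * and an unsigned jar cause the signer-information mismatch. */
  def duplicateClasses(jarA: String, jarB: String): Set[String] = {
    def classes(path: String): Set[String] = {
      val zf = new ZipFile(path)
      try zf.entries.asScala.map(_.getName).filter(_.endsWith(".class")).toSet
      finally zf.close()
    }
    classes(jarA) intersect classes(jarB)
  }
}
```

Running this against the HWC assembly and the janino jar in the local ivy cache should show whether `org.codehaus.janino` classes are bundled twice.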

samratmitra-0812 avatar Nov 18 '19 17:11 samratmitra-0812

@samratmitra-0812 This is for the current HEAD of Hive3_Support branch correct?

If so, I will take a closer look tomorrow.

phpisciuneri avatar Nov 19 '19 21:11 phpisciuneri

@phpisciuneri Yes. That is correct. Thanks for your help.

samratmitra-0812 avatar Nov 20 '19 09:11 samratmitra-0812

Using the HWC is obsolete for our internal use now that v0.14.0 ships the specified-format loader. I still want this to work, though, for others who don't have that option.

Searching around a bit, this appears to be the result of a JAR clash, perhaps something bundled into an assembly that shouldn't be. We might solve this unintentionally through the version updates planned for 0.14.0 and 0.15.0.
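If the clash does come from our own assembly bundling a duplicate copy, sbt-assembly can strip it at packaging time. A sketch, assuming sbt-assembly is in use and that nothing in the fat jar needs its own janino copy (both assumptions to verify):

```scala
// build.sbt (sbt-assembly) -- drop janino classes from the fat jar so the only
// copy on the runtime classpath is Spark's.
assemblyShadeRules in assembly := Seq(
  ShadeRule.zap("org.codehaus.janino.**").inAll
)
```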

colindean avatar Jun 14 '22 21:06 colindean