
Databricks spark-deep-learning jar conflicts with the Apache Phoenix client jar, producing the error below

Open · arunnatva opened this issue on Jan 18, 2018 · 3 comments

arunnatva commented on Jan 18, 2018

```
Ivy Default Cache set to: /home/hadoop_esp_t/.ivy2/cache
The jars for the packages stored in: /home/hadoop_esp_t/.ivy2/jars
:: loading settings :: url = jar:file:/usr/hdp/2.6.3.0-235/phoenix/phoenix-4.7.0.2.6.3.0-235-client.jar!/org/apache/ivy/core/settings/ivysettings.xml
Exception in thread "main" java.lang.NoSuchMethodError: org.apache.ivy.core.module.descriptor.DefaultModuleDescriptor.setDefaultConf(Ljava/lang/String;)V
	at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1192)
	at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:304)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
```
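For context, the log shows Ivy settings being loaded out of the Phoenix client jar on spark-submit's own JVM classpath, so the launch probably looked something like the following. This is a hedged reconstruction, not the actual command: the package coordinate, script name, and use of `--driver-class-path` are assumptions.

```bash
# Hypothetical invocation matching the log: the Phoenix client jar is on the
# launcher JVM's classpath, so the older Ivy classes shaded into it shadow the
# Ivy that spark-submit expects when it resolves --packages coordinates.
spark-submit \
  --driver-class-path /usr/hdp/2.6.3.0-235/phoenix/phoenix-4.7.0.2.6.3.0-235-client.jar \
  --packages databricks:spark-deep-learning:0.2.0-spark2.1-s_2.11 \
  my_pyspark_job.py
```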

arunnatva commented on Jan 18, 2018

The Phoenix client jar also bundles its own ivysettings.xml (and Ivy classes), and I think this is causing some kind of conflict.
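You can verify this from the jar itself; a quick check, with the jar path taken from the log above:

```bash
# List the Ivy settings file and Ivy classes shaded into the Phoenix client
# jar. If DefaultModuleDescriptor shows up here, spark-submit can end up
# loading Phoenix's older Ivy instead of its own, which matches the
# NoSuchMethodError in the stack trace.
unzip -l /usr/hdp/2.6.3.0-235/phoenix/phoenix-4.7.0.2.6.3.0-235-client.jar \
  | grep -E 'ivysettings\.xml|org/apache/ivy/core/module/descriptor/DefaultModuleDescriptor'
```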

arunnatva commented on Jan 18, 2018

For a use case I am working on, I need to access the sparkdl module and Apache Phoenix tables within the same PySpark job. Is there a way to get jars that package sparkdl without ivysettings.xml etc.?
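One possible workaround, not an official fix: the crash happens inside spark-submit's `--packages` resolution, so pre-downloading the sparkdl jar and passing it with `--jars` avoids invoking Ivy at submit time altogether. A sketch, assuming hypothetical paths; the jar filename is an assumption, and sparkdl's transitive dependencies (e.g. tensorframes) would need to be fetched and listed the same way:

```bash
# Sketch: download the sparkdl jar (and its dependencies) once, e.g. from
# spark-packages.org, then pass concrete jars so spark-submit never touches
# the conflicting Ivy classes shaded into the Phoenix client jar.
spark-submit \
  --driver-class-path /usr/hdp/2.6.3.0-235/phoenix/phoenix-4.7.0.2.6.3.0-235-client.jar \
  --jars /path/to/spark-deep-learning-0.2.0-spark2.1-s_2.11.jar \
  my_pyspark_job.py
```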
