Databricks spark-deep-learning (sparkdl) jar conflicts with the Apache Phoenix client jar, producing the error below:
    Ivy Default Cache set to: /home/hadoop_esp_t/.ivy2/cache
    The jars for the packages stored in: /home/hadoop_esp_t/.ivy2/jars
    :: loading settings :: url = jar:file:/usr/hdp/2.6.3.0-235/phoenix/phoenix-4.7.0.2.6.3.0-235-client.jar!/org/apache/ivy/core/settings/ivysettings.xml
    Exception in thread "main" java.lang.NoSuchMethodError: org.apache.ivy.core.module.descriptor.DefaultModuleDescriptor.setDefaultConf(Ljava/lang/String;)V
        at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1192)
        at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:304)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
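For reference, the job is submitted roughly like this (the sparkdl coordinate, the exact classpath flags, and the script name below are representative, not my exact values):

    spark-submit \
      --master yarn \
      --packages databricks:spark-deep-learning:1.0.0-spark2.2-s_2.11 \
      --driver-class-path /usr/hdp/2.6.3.0-235/phoenix/phoenix-4.7.0.2.6.3.0-235-client.jar \
      my_job.py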
The Phoenix client jar also bundles an ivysettings.xml (the log above shows spark-submit loading its Ivy settings from inside the Phoenix jar), and I suspect it shades an older copy of Ivy whose DefaultModuleDescriptor lacks setDefaultConf(String), hence the NoSuchMethodError.
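This can be checked by listing what the Phoenix client jar bundles; if it contains org/apache/ivy/... classes alongside ivysettings.xml, spark-submit can end up loading that Ivy instead of its own:

    unzip -l /usr/hdp/2.6.3.0-235/phoenix/phoenix-4.7.0.2.6.3.0-235-client.jar | grep -i ivy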
For a use case I am working on, I need to access both the sparkdl module and Apache Phoenix tables within the same PySpark job. Is there a way to get jars that package sparkdl without bundling ivysettings.xml and the like, or otherwise avoid this clash?
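One workaround I am considering (untested) is to avoid --packages altogether: download the sparkdl jar and its dependencies ahead of time and pass everything via --jars, so spark-submit never runs Ivy resolution and the Ivy copy inside the Phoenix jar is never exercised. A rough sketch, where the local sparkdl path is hypothetical:

    # sparkdl jar fetched manually ahead of time (path is hypothetical);
    # its transitive dependencies (e.g. tensorframes) would need to be listed too
    SPARKDL_JAR=/home/hadoop_esp_t/jars/spark-deep-learning-1.0.0-spark2.2-s_2.11.jar
    PHOENIX_JAR=/usr/hdp/2.6.3.0-235/phoenix/phoenix-4.7.0.2.6.3.0-235-client.jar

    # --jars ships pre-resolved jars directly, so SparkSubmitUtils.resolveMavenCoordinates
    # (where the NoSuchMethodError above is thrown) is never invoked
    spark-submit \
      --master yarn \
      --jars "$SPARKDL_JAR,$PHOENIX_JAR" \
      my_job.py

Hand-listing every transitive dependency is brittle, though, so I would still prefer a sparkdl artifact that does not collide with the Phoenix client jar.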