spark-deep-learning
Databricks sparkdl library cannot be imported without "--packages"
Hi, I am using the sparkdl module from Databricks, and every time I have to pass `--packages` and also add the jars to `sys.path` inside the PySpark code in order to successfully import sparkdl. Ideally, though, it should work if I just add the jars to Python's `sys.path`, correct? I have tried several alternatives: passing the jars via `--jars`, setting `spark.driver.extraClassPath` and `spark.executor.extraClassPath`, and adding the jars to `sys.path` within the code, but nothing works unless I use `--packages` with `spark-submit`.
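For reference, this is roughly what I am comparing; the jar path and package coordinates below are placeholders for my actual setup:

```python
# Works: launching the job with --packages, e.g.
#   spark-submit --packages databricks:spark-deep-learning:<version> my_job.py
#
# Expected to work, but does not: pointing --jars / extraClassPath at the
# downloaded jars and extending sys.path inside the job, roughly like this.

import sys

# Placeholder path to the sparkdl assembly jar downloaded locally.
SPARKDL_JAR = "/path/to/spark-deep-learning-assembly.jar"
sys.path.insert(0, SPARKDL_JAR)

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .appName("sparkdl-import-test")
    .config("spark.driver.extraClassPath", SPARKDL_JAR)
    .config("spark.executor.extraClassPath", SPARKDL_JAR)
    .getOrCreate()
)

# Fails with ImportError unless the job was launched with --packages.
import sparkdl
```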
Can someone please take a look at this? It is easily reproducible.