spark-deep-learning

databricks sparkdl library cannot be imported without "--packages"

Open arunnatva opened this issue 8 years ago • 0 comments

Hi, I am using the sparkdl module from Databricks, and every time I have to pass `--packages` and also add the jars to `sys.path` inside my PySpark code in order to import sparkdl successfully. Ideally, it should work if I just add the jars to Python's `sys.path`, correct? I have tried multiple options: adding the jars via `--jars`, setting `spark.driver.extraClassPath` and `spark.executor.extraClassPath`, and adding them to `sys.path` within the code, but nothing seems to work unless I use `--packages` with spark-submit.
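For concreteness, a sketch of the two launch styles being compared (the package version coordinate and jar path below are illustrative, not the exact ones used):

```shell
# Reported to work: resolve the package and its transitive
# dependencies at submit time via the Spark Packages repository.
pyspark --packages databricks:spark-deep-learning:1.0.0-spark2.3-s_2.11

# Reported to fail: supplying the jar directly on the classpath.
pyspark --jars /path/to/spark-deep-learning.jar \
  --conf spark.driver.extraClassPath=/path/to/spark-deep-learning.jar \
  --conf spark.executor.extraClassPath=/path/to/spark-deep-learning.jar
```

One difference between the two is that `--packages` pulls in sparkdl's transitive dependencies automatically, whereas `--jars` only ships exactly the jars listed.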

Can someone take a look at this? It is easily reproducible.
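As a side note on why `sys.path` alone is not enough: a jar is just a zip archive, so adding it to `sys.path` lets Python's zipimport machinery load any bundled `.py` modules, but it does nothing to the JVM classpath that the Scala side of sparkdl needs. A minimal, self-contained demonstration of the Python half (using a hypothetical stand-in module, not the real sparkdl jar):

```python
import os
import sys
import tempfile
import zipfile

# Build a tiny stand-in "jar" containing a Python package.
workdir = tempfile.mkdtemp()
jar_path = os.path.join(workdir, "fake_sparkdl.jar")
with zipfile.ZipFile(jar_path, "w") as jar:
    jar.writestr("sparkdl_stub/__init__.py", "VERSION = 'stub'\n")

# Adding the archive to sys.path makes the Python import succeed...
sys.path.insert(0, jar_path)
import sparkdl_stub

print(sparkdl_stub.VERSION)  # prints: stub
# ...but nothing here touches the JVM classpath, which is why the
# JVM classes the package wraps can still fail to load without
# --packages or --jars at spark-submit time.
```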

arunnatva — Jan 18 '18