
problem installing sparkdl on mac

Open · mehdimo opened this issue 7 years ago • 1 comment

Running the command $SPARK_HOME/bin/spark-shell --packages databricks:spark-deep-learning:1.1.0-spark2.3-s_2.11 to install sparkdl on macOS, I got the following error:

Ivy Default Cache set to: /Users/mm/.ivy2/cache
The jars for the packages stored in: /Users/mm/.ivy2/jars
:: loading settings :: url = jar:file:/Users/mm/Documents/spark-2.3.1-bin-hadoop2.7/jars/ivy-2.4.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
databricks#spark-deep-learning added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent-b87dcf83-cae4-42cb-b2cb-4189dee2a375;1.0
	confs: [default]
	found databricks#spark-deep-learning;0.3.0-spark2.2-s_2.11 in spark-packages
	found databricks#tensorframes;0.2.9-s_2.11 in spark-packages
	found Microsoft#spark-images;0.1 in spark-packages
:: resolution report :: resolve 2095ms :: artifacts dl 10ms
	:: modules in use:
	Microsoft#spark-images;0.1 from spark-packages in [default]
	databricks#spark-deep-learning;0.3.0-spark2.2-s_2.11 from spark-packages in [default]
	databricks#tensorframes;0.2.9-s_2.11 from spark-packages in [default]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   9   |   0   |   0   |   0   ||   3   |   0   |
	---------------------------------------------------------------------

:: problems summary ::
:::: WARNINGS
		module not found: org.apache.commons#commons-proxy;1.0

	==== local-m2-cache: tried

	  file:/Users/mm/.m2/repository/org/apache/commons/commons-proxy/1.0/commons-proxy-1.0.pom

	  -- artifact org.apache.commons#commons-proxy;1.0!commons-proxy.jar:

	  file:/Users/mm/.m2/repository/org/apache/commons/commons-proxy/1.0/commons-proxy-1.0.jar

	==== local-ivy-cache: tried

	  /Users/mm/.ivy2/local/org.apache.commons/commons-proxy/1.0/ivys/ivy.xml

	  -- artifact org.apache.commons#commons-proxy;1.0!commons-proxy.jar:

	  /Users/mm/.ivy2/local/org.apache.commons/commons-proxy/1.0/jars/commons-proxy.jar
        ...
	Server access error at url https://repo1.maven.org/maven2/org/tensorflow/tensorflow/1.3.0/tensorflow-1.3.0.jar (javax.net.ssl.SSLHandshakeException: sun.security.validator.ValidatorException: PKIX path building failed: sun.security.provider.certpath.SunCertPathBuilderException: unable to find valid certification path to requested target)


:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS
Exception in thread "main" java.lang.RuntimeException: [unresolved dependency: org.apache.commons#commons-proxy;1.0: not found, unresolved dependency: org.scalactic#scalactic_2.11;3.0.0: not found, unresolved dependency: org.apache.commons#commons-lang3;3.4: not found, unresolved dependency: com.typesafe.scala-logging#scala-logging-api_2.11;2.1.2: not found, unresolved dependency: com.typesafe.scala-logging#scala-logging-slf4j_2.11;2.1.2: not found, unresolved dependency: org.tensorflow#tensorflow;1.3.0: not found]
	at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1303)
	at org.apache.spark.deploy.DependencyUtils$.resolveMavenDependencies(DependencyUtils.scala:53)
	at org.apache.spark.deploy.SparkSubmit$.doPrepareSubmitEnvironment(SparkSubmit.scala:364)
	at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:250)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:171)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:137)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Any solution? I have both Python 2.7 and Python 3 installed.
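[Editor's note: the PKIX error in the log means the JVM could not validate the TLS certificate presented for repo1.maven.org, which typically points to an intercepting proxy or an outdated JDK truststore rather than to Spark itself. A speculative workaround, assuming a default JDK 8 layout and the stock truststore password "changeit" (adjust paths for your JDK), is to import the certificate your machine actually receives into the JVM truststore:]

```shell
# Capture the certificate chain presented to this machine for Maven Central.
# (If a corporate proxy intercepts TLS, this will be the proxy's certificate.)
echo | openssl s_client -connect repo1.maven.org:443 -servername repo1.maven.org 2>/dev/null \
  | openssl x509 -outform PEM > repo1-maven.pem

# Import it into the JVM truststore that spark-shell uses.
# Assumption: JDK 8 directory layout; on newer JDKs the truststore is
# at $JAVA_HOME/lib/security/cacerts instead.
sudo keytool -importcert -noprompt -alias repo1-maven \
  -file repo1-maven.pem \
  -keystore "$JAVA_HOME/jre/lib/security/cacerts" \
  -storepass changeit

# Then retry the original command:
$SPARK_HOME/bin/spark-shell --packages databricks:spark-deep-learning:1.1.0-spark2.3-s_2.11
```

This is environment configuration only; it does not change Spark or the package coordinates, it just lets Ivy's HTTPS downloads from repo1.maven.org complete.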

mehdimo avatar Jul 18 '18 18:07 mehdimo

Please see my reply to issue #129 (https://github.com/databricks/spark-deep-learning/issues/129). Hopefully it addresses your issue as well.

spark-water avatar Feb 05 '20 22:02 spark-water