spylon-kernel

stop/start does not work

Open · parente opened this issue 8 years ago · 2 comments

For my spylon notebook I:

  1. Ran spark.stop()
  2. Did not restart the notebook kernel
  3. Re-ran the %%init_spark and spark cells to start a new Spark application (a sketch of the sequence is shown below)
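For reference, a minimal sketch of that sequence; the %%init_spark configuration shown is illustrative, not my exact cells:

// Cell 1: stop the running Spark application
spark.stop()

// Cell 2: cell magic that reconfigures the launcher (configuration illustrative)
// %%init_spark
// launcher.conf.spark.app.name = "restarted-app"

// Cell 3: use `spark` again, expecting a fresh application
spark.range(10).show()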

What I found is that most operations still work, such as reading datasets through the sparkSession and showing them.

However, when I tried to use the sparkContext, it reported that it was stopped. Here's the code I was running and the error:

val bRetailersList = (sparkSession.sparkContext
                      .broadcast(trainedModel.itemFactors.select("id")
                                 .rdd.map(x => x(0).asInstanceOf[Int]).collect)
                      )

java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:

org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
java.lang.reflect.Constructor.newInstance(Constructor.java:423)
py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:240)
py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
py4j.Gateway.invoke(Gateway.java:236)
py4j.commands.ConstructorCommand.invokeConstructor(ConstructorCommand.java:80)
py4j.commands.ConstructorCommand.execute(ConstructorCommand.java:69)
py4j.GatewayConnection.run(GatewayConnection.java:214)
java.lang.Thread.run(Thread.java:745)

The currently active SparkContext was created at:

org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:823)
org.apache.spark.ml.util.BaseReadWrite$class.sparkSession(ReadWrite.scala:69)
org.apache.spark.ml.util.MLReader.sparkSession(ReadWrite.scala:189)
org.apache.spark.ml.util.BaseReadWrite$class.sc(ReadWrite.scala:80)
org.apache.spark.ml.util.MLReader.sc(ReadWrite.scala:189)
org.apache.spark.ml.recommendation.ALSModel$ALSModelReader.load(ALS.scala:317)
org.apache.spark.ml.recommendation.ALSModel$ALSModelReader.load(ALS.scala:311)
org.apache.spark.ml.util.MLReadable$class.load(ReadWrite.scala:227)
org.apache.spark.ml.recommendation.ALSModel$.load(ALS.scala:297)
<init>(<console>:53)
<init>(<console>:58)
<init>(<console>:60)
<init>(<console>:62)
<init>(<console>:64)
<init>(<console>:66)
<init>(<console>:68)
<init>(<console>:70)
<init>(<console>:72)
<init>(<console>:74)
<init>(<console>:76)

  at org.apache.spark.SparkContext.assertNotStopped(SparkContext.scala:101)
  at org.apache.spark.sql.SparkSession.<init>(SparkSession.scala:80)
  at org.apache.spark.sql.SparkSession.<init>(SparkSession.scala:77)
  ... 44 elided
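
For anyone hitting the same thing, a possible workaround (an untested sketch; it assumes the currently active session, the one created during the model load above, still has a live context) is to broadcast through SparkSession.builder.getOrCreate() instead of the captured sparkSession reference:

import org.apache.spark.sql.SparkSession

// Use whichever session is currently active rather than the stale
// `sparkSession` value that is still bound to the stopped context.
val activeSession = SparkSession.builder.getOrCreate()

// Same collect as above, then broadcast via the active context.
val retailerIds = trainedModel.itemFactors
  .select("id")
  .rdd
  .map(x => x(0).asInstanceOf[Int])
  .collect()

val bRetailersList = activeSession.sparkContext.broadcast(retailerIds)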

parente · May 24 '17 03:05

What is mp.sparkSession?

mariusvniekerk · May 24 '17 03:05

Sorry, this was transferred from maxpoint/spylon#45. It's a SparkSession.

parente · May 24 '17 11:05