%%scala magic is busted
I'm working on a sizable PR to add docs and clean up various code paths. Spotted this along the way:
```python
     40 self._interp = get_scala_interpreter()
     41 # Ensure that spark is available in the python session as well.
---> 42 self.kernel.cell_magics['python'].env['spark'] = self._interp.spark_session
     43 self.kernel.cell_magics['python'].env['sc'] = self._interp.sc
     44

KeyError: 'python'
```
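If we keep the magic, one option is a defensive guard so the kernel degrades gracefully when no `python` cell magic is registered. A minimal sketch, not the project's actual fix; the helper name and the stand-alone function shape are assumptions, only `cell_magics`, `env`, `spark_session`, and `sc` come from the traceback above:

```python
def share_spark_with_python(cell_magics, spark_session, sc):
    """Expose Spark handles to the IPython cell magic, if one is registered.

    Guards against the KeyError above: the 'python' magic may be absent,
    e.g. when the kernel is initialized outside a full metakernel setup.
    Returns True if the handles were shared, False if the magic was missing.
    """
    python_magic = cell_magics.get('python')
    if python_magic is None:
        return False  # no python magic registered; skip silently
    python_magic.env['spark'] = spark_session
    python_magic.env['sc'] = sc
    return True
```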
I'm on the fence about fixing it versus removing IPython-to-Scala support from this package entirely, with the goal of making it more single-purpose: a minimal but solid Scala+Spark kernel. Pixiedust and https://github.com/maxpoint/spylon can already be used to do Scala in IPython.
So spylon can't really do Scala in IPython.
Clarifying: spylon gives you access to JVM views in IPython, whereas Pixiedust actually evaluates Scala code.
Also, I know @patrick-nicholson is using a simple pattern like:

```python
import spylon_kernel

intp = spylon_kernel.initialize_scala_interpreter()
intp.interpret(code)
intp.last_result()
```

which will continue to work without the magic, and is more easily embedded in Python functions and classes.
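To illustrate the embedding point, here is a hedged sketch of wrapping that pattern in a reusable helper. The wrapper name `run_scala` is hypothetical, and the interpreter is passed in as a parameter rather than constructed here, so the only assumed interface is the `interpret()`/`last_result()` pair from the pattern above:

```python
def run_scala(intp, code):
    """Run a Scala snippet on an already-initialized interpreter.

    `intp` is any object exposing interpret() and last_result(), e.g. the
    interpreter returned by the pattern above. Returns the last result so
    the call composes naturally inside Python functions and classes.
    """
    intp.interpret(code)
    return intp.last_result()
```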