
databricks-connect not working

Open · vikasgupta78 opened this issue on Dec 29, 2023 · 0 comments

(.venv) vikasgupta@Vikass-MacBook-Air /tmp % databricks-connect test

  • PySpark is installed at /opt/homebrew/lib/python3.10/site-packages/pyspark
  • Checking SPARK_HOME
  • Checking java version
      java version "1.8.0_351"
      Java(TM) SE Runtime Environment (build 1.8.0_351-b10)
      Java HotSpot(TM) 64-Bit Server VM (build 25.351-b10, mixed mode)
  • Testing scala command
      23/12/29 12:09:52 WARN Utils: Your hostname, Vikass-MacBook-Air.local resolves to a loopback address: 127.0.0.1; using 192.168.68.102 instead (on interface en0)
      23/12/29 12:09:52 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
      Setting default log level to "WARN".
      To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
      23/12/29 12:10:00 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
      Spark context Web UI available at http://192.168.68.102:4040
      Spark context available as 'sc' (master = local[*], app id = local-1703832001301).
      Spark session available as 'spark'.
      Welcome to
            ____              __
           / __/__  ___ _____/ /__
          _\ \/ _ \/ _ `/ __/  '_/
         /___/ .__/\_,_/_/ /_/\_\   version 3.5.0
            /_/

      Using Scala version 2.12.18 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_351)
      Type in expressions to have them evaluated.
      Type :help for more information.

scala>

scala> import com.databricks.service.SparkClientManager
<console>:22: error: object databricks is not a member of package com
       import com.databricks.service.SparkClientManager
                  ^

scala> val serverConf = SparkClientManager.getForCurrentSession().getServerSparkConf
<console>:22: error: not found: value SparkClientManager
       val serverConf = SparkClientManager.getForCurrentSession().getServerSparkConf
                        ^

scala> val processIsolation = serverConf .get("spark.databricks.pyspark.enableProcessIsolation")
<console>:22: error: not found: value serverConf
       val processIsolation = serverConf .get("spark.databricks.pyspark.enableProcessIsolation")
                              ^

scala> if (!processIsolation.toBoolean) {
     |   spark.range(100).reduce((a,b) => Long.box(a + b))
     | } else {
     |   spark.range(99*100/2).count()
     | }
<console>:23: error: not found: value processIsolation
       if (!processIsolation.toBoolean) {
            ^

scala>
scala> :quit

Traceback (most recent call last):
  File "/opt/homebrew/bin/databricks-connect", line 8, in <module>
    sys.exit(main())
  File "/opt/homebrew/lib/python3.10/site-packages/pyspark/databricks_connect.py", line 311, in main
    test()
  File "/opt/homebrew/lib/python3.10/site-packages/pyspark/databricks_connect.py", line 267, in test
    raise ValueError("Scala command failed to produce correct result")
ValueError: Scala command failed to produce correct result
(.venv) vikasgupta@Vikass-MacBook-Air /tmp %
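The test shell above is printing a stock Apache Spark 3.5.0 banner, and PySpark is resolved from /opt/homebrew/lib/python3.10/site-packages/pyspark, so the Databricks-specific com.databricks.service classes are not on the classpath, which is exactly what the import errors show. One likely explanation, assuming the legacy (pre-Spark-Connect) databricks-connect client is intended here, is that a plain open-source pyspark is installed in the same environment and shadows the patched distribution that databricks-connect ships. A minimal sketch to check what the active environment actually contains (not part of databricks-connect itself):

    # Sketch: list which of the two conflicting packages are installed.
    # Assumption: legacy databricks-connect, which bundles its own patched pyspark
    # and is documented as incompatible with a separately installed stock pyspark.
    import importlib.metadata as md

    for pkg in ("pyspark", "databricks-connect"):
        try:
            print(f"{pkg}: {md.version(pkg)}")
        except md.PackageNotFoundError:
            print(f"{pkg}: not installed")

If both packages show up, the legacy Databricks Connect setup instructions call for uninstalling pyspark and reinstalling databricks-connect (ideally in a clean virtual environment) before rerunning databricks-connect test.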
