elvaliuliuliu
Moreover, the error message I saw is `Exception: Python in worker has different version 3.7 than that in driver 0.4.0, PySpark cannot run with different minor versions. Please check environment variables...`
@imback82 I will follow up on this.
Can you try setting DOTNET_WORKER_DIR to the local path of Microsoft.Spark.Worker? Please make sure the path is set correctly and that Microsoft.Spark.Worker.exe is in that path.
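For reference, a minimal sketch of what that setup could look like in a shell session. The install path below is only an assumption; substitute wherever you actually extracted the Microsoft.Spark.Worker release:

```shell
# Hypothetical path: point DOTNET_WORKER_DIR at the directory that
# contains Microsoft.Spark.Worker.exe (or Microsoft.Spark.Worker on
# Linux/macOS) on your machine.
export DOTNET_WORKER_DIR=/opt/Microsoft.Spark.Worker-0.4.0

# Sanity check before running spark-submit: the variable should echo
# back the directory you just set.
echo "$DOTNET_WORKER_DIR"
```

On Windows the equivalent would be setting the same variable via the System environment-variable dialog (or `setx`) so that a freshly opened shell picks it up before launching Spark.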
@zwitbaum We have been investigating this scenario, and I can reproduce the error you mentioned. It looks like there are some architecture differences that cause the SparkDotnet worker not to be invoked, but...
@zwitbaum: Just FYI, we have opened a ticket with Azure Databricks, and this is currently being investigated. I will update you once I hear back from them.
@zwitbaum: We had a discussion with Databricks today and will get back to you once there are any updates from their side.
Which compatible Python version do you mean here? For example, the error shows that the version "0.4.0" mismatches, and this version comes from `Microsoft.Spark.Worker.exe`. So it looks like the logical plan is correct, but...
Thanks @jammman for the repro and the comparison with HDI! We will open a ticket with the Databricks team and start a thread, and will also update the status here. And as @imback82...