feat: add Databricks Serverless support
Adds support for using Serverless compute from a notebook. I tested by running plan/apply from a notebook against a Serverless environment.
Two key changes:
- Serverless does not support global temp views, but it does support session temp views. Global temp views are scoped to the application, while session temp views are scoped to the Spark session. I don't believe we share any state across sessions, so switching to session-scoped views should be fine.
- Serverless throws a distinct exception when SparkContext is accessed while unavailable, so handle that exception as well (we already had logic in place to support Databricks Connect).
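The SparkContext handling described above can be sketched roughly as below. The exception class and helper names here are illustrative stand-ins, not the actual dbt-spark or Databricks symbols:

```python
# Hedged sketch: SparkContextUnavailableError and get_spark_context are
# hypothetical names, assumed for illustration only.

class SparkContextUnavailableError(Exception):
    """Stand-in for the exception Serverless raises on SparkContext access."""

def get_spark_context(session):
    """Return the SparkContext if available, else None.

    On classic clusters `session.sparkContext` works; Serverless raises
    its own exception, and Databricks Connect sessions may not expose the
    attribute at all, so treat both cases as "no SparkContext".
    """
    try:
        return session.sparkContext
    except SparkContextUnavailableError:
        return None  # Serverless: no driver-side SparkContext
    except AttributeError:
        return None  # Databricks Connect: attribute not available
```

Callers can then branch on a `None` result instead of letting the Serverless-specific exception propagate.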
Note: this PR is built on the current constraint that the Python SQL Connector cannot query Serverless. If that changes in the future, the fix in this PR of having Serverless use session temp views would become a problem when mixing the SQL Connector and Databricks Connect. The global temp constraint may be lifted by then, but if not, the hybrid mode of using both would likely need to be disabled when using Serverless.
It turned out that when using the Databricks SQL Connector together with Databricks Connect, you need global temp views, since they share temp objects across sessions (but still within the same application). So global temp views are now used in all cases except Databricks Serverless.
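The resulting view-scope decision is simple enough to sketch. The function and parameter names below are illustrative, not the actual code in this PR; the DDL strings are standard Spark SQL:

```python
# Hedged sketch: temp_view_ddl is a hypothetical helper showing the
# scope choice, not the real dbt-spark implementation.

def temp_view_ddl(view_name: str, select_sql: str, is_serverless: bool) -> str:
    """Build the CREATE ... VIEW statement for a temp relation.

    Global temp views (registered under the `global_temp` database) are
    needed when the SQL Connector and Databricks Connect share temp
    objects across sessions; Serverless does not support them, so fall
    back to session-scoped views there.
    """
    if is_serverless:
        return f"CREATE OR REPLACE TEMPORARY VIEW {view_name} AS {select_sql}"
    return f"CREATE OR REPLACE GLOBAL TEMPORARY VIEW {view_name} AS {select_sql}"
```

If the SQL Connector gains Serverless support later, this is the branch that would need revisiting.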
Also added serverless support to databricks-connect.