[ISSUE] Notebook-Native authentication error
Description
ValueError: default auth: cannot configure default credentials
Reproduction
from databricks.sdk import WorkspaceClient
w = WorkspaceClient()
Expected behavior
No error is thrown.
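Until notebook-native default auth works, one common workaround is to supply explicit credentials through environment variables, which the SDK's PAT strategy picks up. This is only a sketch; the host and token values below are placeholders, not real credentials:

```shell
# Workaround sketch: export explicit credentials so the SDK's default chain
# can resolve PAT auth instead of failing. Values below are placeholders.
export DATABRICKS_HOST="https://example.cloud.databricks.com"
export DATABRICKS_TOKEN="dapi-example-token"
```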
Debug Logs
DEBUG:databricks.sdk:/root/.databrickscfg does not exist
DEBUG:databricks.sdk:Attempting to configure auth: pat
DEBUG:databricks.sdk:Attempting to configure auth: basic
DEBUG:databricks.sdk:Attempting to configure auth: oauth-m2m
DEBUG:databricks.sdk:Attempting to configure auth: azure-client-secret
DEBUG:databricks.sdk:Attempting to configure auth: azure-cli
DEBUG:databricks.sdk:Attempting to configure auth: external-browser
DEBUG:databricks.sdk:Attempting to configure auth: bricks-cli
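The debug log above reflects a chain-of-responsibility: each auth strategy is tried in order, the first one that can configure itself wins, and if none can, a ValueError is raised. A simplified, illustrative sketch of that pattern (not the SDK's actual implementation; the strategy names and config keys here are assumptions):

```python
# Illustrative sketch of the SDK's default-credentials chain, as suggested by
# the debug log above. Names and config keys are hypothetical.
from typing import Callable, Optional

AuthStrategy = Callable[[dict], Optional[str]]

def pat_auth(cfg: dict) -> Optional[str]:
    # Personal-access-token auth can configure itself only if both
    # a host and a token are available.
    if cfg.get("host") and cfg.get("token"):
        return "pat"
    return None

def basic_auth(cfg: dict) -> Optional[str]:
    if cfg.get("host") and cfg.get("username") and cfg.get("password"):
        return "basic"
    return None

def default_credentials(cfg: dict, strategies: list[AuthStrategy]) -> str:
    # Try each strategy in order; the first that succeeds determines auth_type.
    for strategy in strategies:
        print(f"Attempting to configure auth: {strategy.__name__}")
        auth_type = strategy(cfg)
        if auth_type is not None:
            return auth_type
    raise ValueError("default auth: cannot configure default credentials")

# With an empty config every strategy declines and the chain fails,
# reproducing the error reported in this issue:
try:
    default_credentials({}, [pat_auth, basic_auth])
except ValueError as e:
    print(e)  # → default auth: cannot configure default credentials
```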
Other Information
Running on a Databricks single-node, single-user cluster:

Runtime:
- DBR 13.2
- Spark 3.4.0
- Scala 2.12

Driver:
- c5d.2xlarge
- 16 GB
- 8 Cores
Additional context
@mattfysh can you also post the cluster config? Most likely it's already fixed by ae3cb3fcca7c3554c3d4d231666f000ee5761fc5
This is reproducible on DBR 13.2
"num_workers": 0,
"cluster_name": "[email protected]'s Cluster",
"spark_version": "13.2.x-scala2.12",
"spark_conf": {
"spark.master": "local[*, 4]",
"spark.databricks.cluster.profile": "singleNode"
},
"aws_attributes": {
"first_on_demand": 1,
"availability": "SPOT_WITH_FALLBACK",
"zone_id": "auto",
"spot_bid_price_percent": 100,
"ebs_volume_count": 0
},
"node_type_id": "i3.xlarge",
"driver_node_type_id": "i3.xlarge",
"ssh_public_keys": [],
"custom_tags": {
"ResourceClass": "SingleNode"
},
"spark_env_vars": {},
"autotermination_minutes": 30,
"enable_elastic_disk": false,
"cluster_source": "UI",
"init_scripts": [],
"single_user_name": "[email protected]",
"enable_local_disk_encryption": false,
"data_security_mode": "SINGLE_USER",
"runtime_engine": "PHOTON",
"cluster_id": "0804-173133-vk7wvz5x"
}
ValueError: default auth: cannot configure default credentials
ValueError Traceback (most recent call last)
File /databricks/python/lib/python3.10/site-packages/databricks/sdk/core.py:669, in Config._init_auth(self)
668 try:
--> 669 self._header_factory = self._credentials_provider(self)
670 self.auth_type = self._credentials_provider.auth_type()
File /databricks/python/lib/python3.10/site-packages/databricks/sdk/core.py:319, in DefaultCredentials.__call__(self, cfg)
318 raise ValueError(f'{auth_type}: {e}') from e
--> 319 raise ValueError('cannot configure default credentials')
ValueError: cannot configure default credentials
The above exception was the direct cause of the following exception:
ValueError Traceback (most recent call last)
File /databricks/python/lib/python3.10/site-packages/databricks/sdk/core.py:390, in Config.__init__(self, credentials_provider, product, product_version, **kwargs)
389 self._validate()
--> 390 self._init_auth()
391 self._product = product
File /databricks/python/lib/python3.10/site-packages/databricks/sdk/core.py:674, in Config._init_auth(self)
673 except ValueError as e:
--> 674 raise ValueError(f'{self._credentials_provider.auth_type()} auth: {e}') from e
ValueError: default auth: cannot configure default credentials
The above exception was the direct cause of the following exception:
ValueError Traceback (most recent call last)
File
For DBR 13.3 LTS ML, the default SDK version is 0.1.6, which fails to authenticate automatically. You have to manually install the newer 0.7.1 in your notebook, which does authenticate automatically. It would be great to resolve this issue.
%pip install -U databricks-sdk
dbutils.library.restartPython()
DBR 14.1 ML still ships SDK 0.1.6 (2023-05-10) as the default, which doesn't support notebook-native authentication. It would help drive SDK customer adoption if the default had automatic authentication working.
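A quick way to decide whether the `%pip install -U databricks-sdk` upgrade above is needed is to compare the preinstalled version against the one the commenters report as working. A minimal sketch, using only a hypothetical dotted-numeric version comparison (the two version strings come from the comments above):

```python
# Minimal sketch: compare the runtime's default SDK version against the
# version reported to support notebook-native auth, to decide whether an
# upgrade via `%pip install -U databricks-sdk` is needed.
def parse_version(v: str) -> tuple:
    # "0.1.6" -> (0, 1, 6); assumes plain dotted numeric versions
    return tuple(int(part) for part in v.split("."))

DEFAULT_SDK = "0.1.6"    # default on DBR 13.3 LTS ML / 14.1 ML per the comments
MIN_AUTO_AUTH = "0.7.1"  # version the commenter installed manually

needs_upgrade = parse_version(DEFAULT_SDK) < parse_version(MIN_AUTO_AUTH)
print(needs_upgrade)  # → True
```

In a real notebook you would read the installed version (e.g. via `importlib.metadata.version("databricks-sdk")`) instead of hard-coding it.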