[ISSUE] Account-level authentication failed for az cli + sp
Description: See issue https://github.com/databricks/terraform-provider-databricks/issues/4063 for more details.
Reproduction: Account-level provider + service principal auth using az cli.
Expected behavior: Should work, but currently fails with:
│ Error: cannot read user: failed during request visitor: default auth: azure-cli: cannot get access token: WARNING: Could not retrieve credential from local cache for service principal e0da535f-4aa5-45fd-ad10-0817c932b48c under tenant common. Trying credential under tenant 1e98afad-8153-4889-a48f-60dc77bc87a8, assuming that is an app credential.
│ ERROR: AADSTS50059: No tenant-identifying information found in either the request or implied by any provided credentials. Trace ID: 3e7f0d70-f60c-4276-bbcb-996f4dac2200 Correlation ID: fbb9e0db-dfdc-44f5-95a7-68b18f7d64a6 Timestamp: 2024-10-01 15:50:11Z
│ Interactive authentication is needed. Please run:
│ az login
Is it a regression? This started failing in v0.44, likely introduced in PR #910.
+1
To work around this issue, add the following line to the configuration file:

azure_tenant_id = <azure-service-principal-tenant-id>

So the full profile for account-level commands is:
[<some-unique-configuration-profile-name>]
host = <account-console-url>
account_id = <account-id>
azure_tenant_id = <azure-service-principal-tenant-id>
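With a profile like the one above saved in the Databricks configuration file (typically ~/.databrickscfg), an account-level CLI command can then select it explicitly. A sketch only; the profile name is the placeholder from above, and `databricks account groups list` stands in for whichever account-level command you actually need:

```shell
# Run an account-level command against the profile that carries azure_tenant_id.
# <some-unique-configuration-profile-name> is the placeholder profile name from above.
databricks account groups list --profile <some-unique-configuration-profile-name>
```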
This issue is specifically about authenticating via the Azure CLI, though, rather than via a configuration file. So I'm not sure that workaround applies here, does it?
See https://learn.microsoft.com/en-us/azure/databricks/dev-tools/cli/authentication#azure-cli-auth: you need a configuration file to work with Azure CLI auth.
I run purely off the Terraform provider configuration below, with no configuration file:
https://registry.terraform.io/providers/databricks/databricks/latest/docs#authenticating-with-azure-cli
However, adding azure_tenant_id to the provider config does seem to work for me after upgrading to 1.81.1, so I'm assuming Terraform creates the equivalent profile under the hood based on the provider config.
provider "databricks" {
  alias           = "workspace"
  host            = var.databricks_workspace
  azure_tenant_id = var.azure_tenant_id
}
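For the account-level case from the original report, the same azure_tenant_id workaround should presumably translate to an account provider block. A sketch, assuming the standard Azure Databricks account console host and that var.databricks_account_id and var.azure_tenant_id are defined elsewhere in your configuration:

```hcl
provider "databricks" {
  alias           = "account"
  host            = "https://accounts.azuredatabricks.net"
  account_id      = var.databricks_account_id # assumed variable; mirrors account_id in the profile
  azure_tenant_id = var.azure_tenant_id       # tenant of the service principal, as in the workaround above
}
```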