Introduction
The TestUC1002 – Invalid Credentials error in Databricks Unity Catalog indicates that Databricks could not authenticate to Unity Catalog services or to the underlying cloud storage because credentials are incorrect or missing. It commonly occurs when configuring Unity Catalog, accessing cloud storage, or assigning a metastore to a workspace. Left unresolved, it blocks operations such as creating catalogs, managing tables, and accessing external storage.
🚨 Common symptoms of TestUC1002:
- “TestUC1002 – Invalid Credentials” error during Unity Catalog setup.
- Cannot attach Unity Catalog metastore to a workspace.
- Access to cloud storage (S3, ADLS, GCS) fails due to missing permissions.
- Jobs or notebooks fail to access Unity Catalog tables.
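Before digging into cloud-side permissions, you can reproduce the failure from a notebook. A minimal sketch, assuming a Databricks notebook where spark is predefined; the table name is a placeholder:

# Run in a Databricks notebook, where spark is predefined.
# Both statements fail with an invalid-credentials error if Unity Catalog
# cannot authenticate to its backing cloud storage.
spark.sql("SHOW CATALOGS").show()

# main.default.my_table is a placeholder; substitute a real Unity Catalog table.
spark.sql("SELECT * FROM main.default.my_table LIMIT 5").show()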
Common Causes and Fixes for TestUC1002
1. Incorrect IAM Role or Missing Permissions (AWS)
Symptoms:
- Error: “TestUC1002 – Invalid Credentials” while assigning Unity Catalog metastore.
- Databricks cannot access S3 buckets or AWS Glue catalog.
Causes:
- IAM role does not have the required permissions for S3 or AWS Glue.
- The IAM role is not correctly attached to the Databricks cluster.
Fix:
✅ Ensure the IAM role has the correct permissions for Unity Catalog:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "glue:GetCatalog*",
        "s3:GetObject",
        "s3:PutObject",
        "s3:ListBucket"
      ],
      "Resource": "*"
    }
  ]
}
(In production, scope "Resource" to your specific buckets and Glue resources rather than "*".)
✅ Update the IAM role policy and attach it to the cluster:
aws iam put-role-policy --role-name <your-role-name> --policy-name UnityCatalogAccess --policy-document file://policy.json
✅ Verify the metastore configuration for the workspace, or create one with the correct region and storage root:
databricks unity-catalog metastores list
databricks unity-catalog metastores create --name <metastore-name> --region <region> --storage-root s3://<bucket-name>/<path>
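To confirm the role's S3 access outside of Databricks, a short boto3 check works. This is a sketch, assuming your local AWS identity is allowed to assume the role; the role ARN and bucket name are placeholders:

import boto3

# Assume the Unity Catalog role, then attempt a minimal S3 read with its credentials.
sts = boto3.client("sts")
creds = sts.assume_role(
    RoleArn="arn:aws:iam::123456789012:role/unity-catalog-role",  # placeholder ARN
    RoleSessionName="uc-credential-check",
)["Credentials"]

s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)

# ListBucket is one of the permissions Unity Catalog needs; an AccessDenied
# here reproduces the credential failure.
print(s3.list_objects_v2(Bucket="my-uc-bucket", MaxKeys=1).get("KeyCount"))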
2. Missing Azure Storage or Key Vault Permissions (Azure)
Symptoms:
- Error: “Invalid Credentials” while accessing Unity Catalog-backed Azure Data Lake Storage.
- Cannot access Azure Key Vault secrets in Databricks.
Causes:
- Managed Identity or Service Principal lacks permissions for Azure Data Lake Storage or Key Vault.
- Key Vault access policy is not properly configured.
Fix:
✅ Assign the correct role to your Databricks service principal:
az role assignment create --assignee <service-principal-id> --role "Storage Blob Data Contributor" --scope /subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<storage-name>
✅ Ensure Azure Key Vault allows secret access:
az keyvault set-policy --name <keyvault-name> --spn <service-principal-id> --secret-permissions get list
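You can verify those secret permissions programmatically with the Azure SDK for Python (azure-identity and azure-keyvault-secrets). A sketch; the tenant, client, and vault values are placeholders:

from azure.identity import ClientSecretCredential
from azure.keyvault.secrets import SecretClient

# Placeholders: substitute your tenant ID, the service principal's client ID
# and secret, and your vault name.
credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<service-principal-id>",
    client_secret="<client-secret>",
)
client = SecretClient(
    vault_url="https://<keyvault-name>.vault.azure.net",
    credential=credential,
)

# Listing secret properties needs the same get/list permissions Databricks uses;
# a 403 here points at the access policy.
for secret in client.list_properties_of_secrets():
    print(secret.name)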
✅ Confirm the storage account exists in the expected resource group:
az storage account list --resource-group <resource-group-name>
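To test the storage side with the same service principal, azure-storage-blob offers a quick data-plane check. Again a sketch with placeholder values:

from azure.identity import ClientSecretCredential
from azure.storage.blob import BlobServiceClient

# Same service principal as above; the storage account name is a placeholder.
credential = ClientSecretCredential(
    tenant_id="<tenant-id>",
    client_id="<service-principal-id>",
    client_secret="<client-secret>",
)
service = BlobServiceClient(
    account_url="https://<storage-name>.blob.core.windows.net",
    credential=credential,
)

# Listing containers fails with an authorization error if the
# Storage Blob Data Contributor assignment is missing or not yet propagated.
for container in service.list_containers():
    print(container.name)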
3. Invalid Credentials for Google Cloud Storage (GCS)
Symptoms:
- Error: “TestUC1002 – Invalid Credentials” while accessing Google Cloud Storage.
- Jobs fail when writing to GCS buckets.
Causes:
- Service account key is missing or incorrectly configured.
- GCS bucket permissions do not allow Databricks access.
Fix:
✅ Ensure the service account has the correct permissions:
{
  "role": "roles/storage.objectAdmin",
  "members": [
    "serviceAccount:databricks-sa@my-project.iam.gserviceaccount.com"
  ]
}
✅ Download and configure the service account key:
gcloud iam service-accounts keys create key.json --iam-account databricks-sa@my-project.iam.gserviceaccount.com
✅ Upload the key to DBFS and reference it from the cluster's Spark config or a notebook:
spark.conf.set("fs.gs.auth.service.account.json.keyfile", "/dbfs/path/to/key.json")
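To validate the key file itself, the google-cloud-storage client library can run a one-object read test. A sketch, assuming the bucket below is replaced with the one Unity Catalog uses:

from google.cloud import storage

# key.json is the file created by the gcloud command above;
# the bucket name is a placeholder.
client = storage.Client.from_service_account_json("key.json")

# An invalid or under-privileged key raises a 401/403 here.
for blob in client.list_blobs("my-uc-bucket", max_results=1):
    print(blob.name)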
4. Incorrect Cluster Configuration
Symptoms:
- Error: “Invalid credentials” when running Unity Catalog commands in notebooks.
- Cannot access Unity Catalog tables from SQL Warehouses.
Causes:
- Clusters are not configured to support Unity Catalog.
- Legacy clusters using Hive Metastore conflict with Unity Catalog.
Fix:
✅ Enable Unity Catalog for the cluster:
- Go to Databricks UI → Compute → Edit Cluster.
- Select an access mode that supports Unity Catalog (Single user or Shared).
- Restart the cluster.
✅ For SQL Warehouses, confirm Unity Catalog is available:
- SQL Warehouses inherit Unity Catalog access from the workspace; verify the workspace is assigned to a metastore, and review Databricks UI → SQL Warehouses → Edit → Advanced Settings for warehouse options.
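Once the cluster is reconfigured, a quick notebook check confirms it can reach the metastore (current_metastore() is a built-in Databricks SQL function):

# Run in a notebook attached to the cluster; spark is predefined.
# Both statements succeed only on a Unity Catalog-enabled cluster.
spark.sql("SELECT current_metastore()").show(truncate=False)
spark.sql("SHOW CATALOGS").show()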
Step-by-Step Troubleshooting Guide
1. Verify Cluster and Workspace Configuration
databricks clusters list
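If you prefer to script this check, the Databricks Python SDK (databricks-sdk) can report the workspace's metastore assignment. A sketch, assuming DATABRICKS_HOST and DATABRICKS_TOKEN are set in the environment:

from databricks.sdk import WorkspaceClient  # pip install databricks-sdk

# Picks up DATABRICKS_HOST / DATABRICKS_TOKEN from the environment.
w = WorkspaceClient()

# Errors out if no Unity Catalog metastore is assigned to the workspace.
assignment = w.metastores.current()
print(assignment.metastore_id, assignment.default_catalog_name)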
2. Check If the IAM Role or Service Principal Has Correct Permissions
aws iam get-role --role-name <your-role-name>
az role assignment list --assignee <service-principal-id>
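On AWS you can also simulate the role's permissions rather than reading the policy by eye. A boto3 sketch with placeholder ARNs:

import boto3

iam = boto3.client("iam")

# Placeholders: substitute your role ARN and bucket ARNs.
result = iam.simulate_principal_policy(
    PolicySourceArn="arn:aws:iam::123456789012:role/unity-catalog-role",
    ActionNames=["s3:GetObject", "s3:ListBucket"],
    ResourceArns=["arn:aws:s3:::my-uc-bucket", "arn:aws:s3:::my-uc-bucket/*"],
)

# Any explicitDeny or implicitDeny below explains the credential failure.
for evaluation in result["EvaluationResults"]:
    print(evaluation["EvalActionName"], evaluation["EvalDecision"])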
3. Ensure Unity Catalog Is Enabled on the Cluster
Restart the cluster and ensure Unity Catalog support is enabled.
4. Test Access to Cloud Storage and Key Vault
aws s3 ls s3://<bucket-name>
az keyvault secret list --vault-name <keyvault-name>
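The same storage checks can be run from inside Databricks, which exercises exactly the credentials the cluster uses. Paths are placeholders; run only the line for your cloud:

# Run in a Databricks notebook; dbutils and display are predefined there.
display(dbutils.fs.ls("s3://my-uc-bucket/"))  # AWS
display(dbutils.fs.ls("abfss://<container>@<storage-name>.dfs.core.windows.net/"))  # Azure
display(dbutils.fs.ls("gs://my-uc-bucket/"))  # GCP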
Best Practices to Avoid TestUC1002 – Invalid Credentials Error
✅ Use Managed Identities or Service Principals for Secure Access
- Avoid hardcoded credentials for cloud storage and secrets; pull them from Databricks secret scopes instead (see the sketch after this list).
✅ Ensure IAM Roles and Azure AD Permissions Are Correct
- Regularly audit permissions for AWS S3, Azure ADLS, and Key Vault access.
✅ Enable Unity Catalog Support for All Clusters
- Avoid running Unity Catalog commands on legacy clusters.
✅ Monitor Cloud Storage and Key Vault Access Logs
- Track access logs for failed authentication attempts.
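As an example of the secret-scope approach mentioned above, a notebook can fetch credentials at runtime; the scope and key names are placeholders:

# dbutils is predefined in Databricks notebooks.
# "uc-secrets" and "sp-client-secret" are placeholder scope/key names.
client_secret = dbutils.secrets.get(scope="uc-secrets", key="sp-client-secret")
# The retrieved value is redacted in notebook output, unlike a hardcoded string.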
Conclusion
The TestUC1002 – Invalid Credentials error in Databricks Unity Catalog typically arises from incorrect IAM permissions, missing service principal roles, or cluster misconfigurations. By following the steps outlined above—verifying cloud storage permissions, enabling Unity Catalog support on clusters, and checking IAM roles—you can resolve this error and ensure smooth Unity Catalog operations.