Introduction
Migrating existing Delta Lake tables to Unity Catalog allows for centralized governance, fine-grained access control, and multi-cloud data sharing. However, errors, permissions issues, and schema conflicts can prevent successful upgrades.
🚨 Common issues when upgrading Delta Lake tables to Unity Catalog:
- “Table not found in Unity Catalog.”
- “Cannot convert table to Unity Catalog: Invalid metadata.”
- “Permission denied: Unable to upgrade Hive metastore table.”
- “Conflicting schemas: Migration blocked due to metadata mismatch.”
This guide provides troubleshooting steps and solutions to successfully upgrade Delta Lake tables to Unity Catalog.
1. Check If Unity Catalog Is Enabled in Your Workspace
Symptoms:
- Error: “Unity Catalog is not enabled in this workspace.”
- Running SHOW CATALOGS; returns an empty list.
Causes:
- Unity Catalog has not been enabled in your Databricks workspace.
- Your workspace is still using the legacy Hive Metastore.
Fix:
✅ Verify Unity Catalog is enabled:
SHOW CATALOGS;
✅ If Unity Catalog is not available, check your workspace settings:
- Ensure your workspace is on a Databricks Premium or Enterprise plan.
- Assign a metastore to the workspace:
databricks unity-catalog metastores assign --metastore-id <metastore-id> --workspace-id <workspace-id>
✅ Restart your Databricks cluster to apply changes.
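The check above can be scripted; a minimal Python sketch, assuming you have already collected the catalog names from SHOW CATALOGS into a list (the helper name is hypothetical):

```python
def unity_catalog_enabled(catalog_names):
    """Return True if the workspace exposes any catalog beyond the
    legacy Hive Metastore aliases, a sign Unity Catalog is enabled."""
    legacy = {"hive_metastore", "spark_catalog"}
    return any(name not in legacy for name in catalog_names)

# A workspace that only shows the legacy metastore is not UC-enabled.
print(unity_catalog_enabled(["hive_metastore"]))          # False
print(unity_catalog_enabled(["hive_metastore", "main"]))  # True
```

In a notebook, the input list would come from something like `[r.catalog for r in spark.sql("SHOW CATALOGS").collect()]`.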
2. Ensure the Table Exists in the Hive Metastore
Symptoms:
- Error: “Table does not exist in Hive Metastore.”
- SHOW TABLES returns an empty result, even though the table exists in Delta format.
Causes:
- The table exists as a Delta Lake table but is not registered in Hive Metastore.
- The table was created using a storage path (LOCATION) instead of as a managed table.
Fix:
✅ Check if the table is registered in the Hive Metastore:
SHOW TABLES IN hive_metastore.default;
✅ If missing, register the Delta table in the Hive Metastore:
CREATE TABLE hive_metastore.default.my_table USING DELTA LOCATION 's3://mybucket/my_table/';
✅ Verify the table format before migrating:
DESCRIBE DETAIL hive_metastore.default.my_table;
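DESCRIBE DETAIL returns the table's format and location among its columns; a small Python sketch can gate the migration (the helper is illustrative, assuming the result row has been collected into a dict):

```python
def ready_for_migration(detail):
    """Inspect a DESCRIBE DETAIL row (as a dict) before upgrading:
    the table must be Delta and must have a storage location."""
    problems = []
    if detail.get("format") != "delta":
        problems.append(f"unsupported format: {detail.get('format')!r}")
    if not detail.get("location"):
        problems.append("missing storage location")
    return problems  # an empty list means the table looks migratable

print(ready_for_migration({"format": "parquet", "location": None}))
```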
3. Convert Delta Lake Tables to Unity Catalog
Symptoms:
- Error: “Cannot convert table: Schema metadata mismatch.”
- The migration partially completes, but the table is not accessible in Unity Catalog.
Causes:
- Schema does not match Unity Catalog requirements.
- Table was created with unsupported configurations (e.g., unsupported file format).
Fix:
✅ If the table is still in Parquet format, convert it to Delta first (CONVERT TO DELTA is a standalone statement, not an ALTER TABLE clause):
CONVERT TO DELTA hive_metastore.default.my_table;
✅ If migration fails, manually move the table to Unity Catalog:
CREATE TABLE my_catalog.my_schema.my_table AS SELECT * FROM hive_metastore.default.my_table;
✅ Check for schema conflicts and resolve them before migration:
DESCRIBE DETAIL hive_metastore.default.my_table;
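For more than a handful of tables, writing the CTAS statements by hand is tedious; a hypothetical Python helper that generates one statement per Hive table (run the resulting strings with spark.sql in Databricks):

```python
def ctas_statements(tables, catalog, schema):
    """Build one CREATE TABLE ... AS SELECT migration statement per
    table name in hive_metastore.default."""
    return [
        f"CREATE TABLE {catalog}.{schema}.{t} "
        f"AS SELECT * FROM hive_metastore.default.{t};"
        for t in tables
    ]

for stmt in ctas_statements(["my_table"], "my_catalog", "my_schema"):
    print(stmt)
```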
4. Fix Permission Issues During Migration
Symptoms:
- Error: “Permission denied: Cannot convert table to Unity Catalog.”
- Databricks workspace admin cannot upgrade tables.
Causes:
- The Databricks user lacks permission to modify the table.
- The service principal or cluster does not have access to the storage location.
Fix:
✅ Grant required permissions on the Hive Metastore table:
GRANT ALL PRIVILEGES ON TABLE hive_metastore.default.my_table TO `user@example.com`;
✅ Ensure the Databricks workspace has permissions on cloud storage:
- AWS S3 IAM Policy:
{
  "Effect": "Allow",
  "Action": ["s3:GetObject", "s3:PutObject", "s3:ListBucket"],
  "Resource": ["arn:aws:s3:::mybucket", "arn:aws:s3:::mybucket/*"]
}
(Note: s3:ListBucket applies to the bucket ARN itself, so both the bucket and object ARNs must be listed.)
- Azure Storage Account Role Assignment:
az role assignment create --assignee <databricks-service-principal> --role "Storage Blob Data Contributor" --scope /subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<storage-name>
✅ Check workspace role assignments for Unity Catalog:
SHOW GRANTS ON CATALOG my_catalog;
✅ Manually add workspace users to Unity Catalog:
GRANT USE CATALOG ON CATALOG my_catalog TO `user@example.com`;
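Storage permission gaps are easier to catch before the migration run; a Python sketch (the helper name and required-action set are assumptions based on the policy shown above) that checks a parsed S3 policy document:

```python
REQUIRED_S3_ACTIONS = {"s3:GetObject", "s3:PutObject", "s3:ListBucket"}

def missing_s3_actions(policy):
    """Return the required S3 actions not granted by any Allow statement
    in an IAM policy document parsed from JSON."""
    granted = set()
    statements = policy.get("Statement", [])
    if isinstance(statements, dict):  # a policy may hold a single statement
        statements = [statements]
    for stmt in statements:
        if stmt.get("Effect") == "Allow":
            actions = stmt.get("Action", [])
            if isinstance(actions, str):
                actions = [actions]
            granted.update(actions)
    return REQUIRED_S3_ACTIONS - granted

policy = {"Statement": [{"Effect": "Allow",
                         "Action": ["s3:GetObject", "s3:ListBucket"],
                         "Resource": "arn:aws:s3:::mybucket/*"}]}
print(missing_s3_actions(policy))  # {'s3:PutObject'}
```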
5. Resolve Conflicting Table Schemas
Symptoms:
- Error: “Migration failed: Table schema mismatch.”
- Column types in Unity Catalog differ from the source table.
Causes:
- Column data types do not match Unity Catalog schema requirements.
- Different partitions, indexes, or primary keys are used in the source table.
Fix:
✅ Check and compare table schemas before migrating:
DESCRIBE TABLE hive_metastore.default.my_table;
DESCRIBE TABLE my_catalog.my_schema.my_table;
✅ Manually adjust column data types to match the source (note: Delta Lake only permits widening type changes; otherwise recreate the table):
ALTER TABLE my_catalog.my_schema.my_table ALTER COLUMN amount TYPE DECIMAL(10,2);
✅ If schema conflicts persist, create a new table and copy data:
CREATE TABLE my_catalog.my_schema.my_table AS SELECT * FROM hive_metastore.default.my_table;
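Comparing the two DESCRIBE TABLE outputs by eye is error-prone; a hypothetical Python helper that diffs schemas represented as {column: type} dicts:

```python
def schema_diff(source, target):
    """Compare two schemas given as {column_name: data_type} dicts and
    report columns that are missing or whose types disagree."""
    diffs = []
    for col, dtype in source.items():
        if col not in target:
            diffs.append(f"{col}: missing in target")
        elif target[col] != dtype:
            diffs.append(f"{col}: {dtype} != {target[col]}")
    for col in target:
        if col not in source:
            diffs.append(f"{col}: missing in source")
    return diffs

src = {"id": "bigint", "amount": "decimal(10,0)"}
dst = {"id": "bigint", "amount": "decimal(10,2)"}
print(schema_diff(src, dst))  # ['amount: decimal(10,0) != decimal(10,2)']
```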
6. Check Unity Catalog Metastore Configuration
Symptoms:
- Error: “No Unity Catalog metastore found.”
- Metastore appears empty even after migration.
Causes:
- Unity Catalog metastore is not assigned to the workspace.
- Metastore is not properly configured with AWS Glue or Azure Storage.
Fix:
✅ Verify the metastore assignment for the current workspace:
SELECT CURRENT_METASTORE();
✅ Assign a metastore to the workspace (Admin required):
databricks unity-catalog metastores assign --metastore-id <metastore-id> --workspace-id <workspace-id>
✅ If using AWS, ensure the IAM role has Glue permissions:
{
  "Effect": "Allow",
  "Action": ["glue:GetDatabase", "glue:GetTable", "glue:CreateTable"],
  "Resource": "*"
}
7. Migrating External Tables to Unity Catalog
Symptoms:
- Tables created with an explicit LOCATION fail to migrate to Unity Catalog.
- Tables referencing external storage remain in Hive Metastore.
Causes:
- Unity Catalog creates managed tables by default; external tables require an external location and storage credential.
- External tables must be explicitly migrated with the correct permissions.
Fix:
✅ If the table is external, re-create it in Unity Catalog at its existing storage location (Databricks SQL uses CREATE TABLE ... LOCATION rather than CREATE EXTERNAL TABLE):
CREATE TABLE my_catalog.my_schema.my_table USING DELTA LOCATION 's3://mybucket/my_table/';
✅ Grant access to the external location in Unity Catalog (external locations are named objects backed by a storage credential, not raw paths):
GRANT READ FILES, WRITE FILES ON EXTERNAL LOCATION my_external_location TO `user@example.com`;
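To script the external-table step, a sketch that builds the DDL from the existing path (names are illustrative; it only validates that the path carries a cloud storage scheme):

```python
def external_table_ddl(catalog, schema, table, location):
    """Build DDL to re-register an external Delta table in Unity Catalog
    at its existing storage path. Raises if the path has no cloud
    scheme, since Unity Catalog external locations require one."""
    if "://" not in location:
        raise ValueError(f"not a cloud storage URI: {location}")
    return (f"CREATE TABLE {catalog}.{schema}.{table} "
            f"USING DELTA LOCATION '{location}';")

print(external_table_ddl("my_catalog", "my_schema", "my_table",
                         "s3://mybucket/my_table/"))
```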
Best Practices for Migrating Delta Tables to Unity Catalog
✅ Ensure Unity Catalog Is Enabled in Your Workspace
SHOW CATALOGS;
✅ Convert Parquet Tables in the Hive Metastore to Delta Format
CONVERT TO DELTA hive_metastore.default.my_table;
✅ Fix Permission Issues Before Migration
GRANT ALL PRIVILEGES ON TABLE hive_metastore.default.my_table TO `user@example.com`;
✅ Ensure Cloud Storage Permissions Allow Access
- AWS: IAM roles must allow S3 and Glue access.
- Azure: Assign Storage Blob Data Contributor permissions.
✅ Use Schema Validation Before Migration
DESCRIBE TABLE hive_metastore.default.my_table;
DESCRIBE TABLE my_catalog.my_schema.my_table;
Conclusion
If existing Delta Lake tables cannot be upgraded to Unity Catalog, check:
✅ Unity Catalog is enabled and a metastore is assigned.
✅ Hive Metastore tables are properly registered before migration.
✅ Permissions are set for Databricks and cloud storage.
✅ Table schemas match Unity Catalog requirements.
By following this guide, you can successfully migrate Delta Lake tables to Unity Catalog for enhanced security and governance.