- SQL analytics endpoint: A SQL-based experience that lets you work with the data in delta tables using the T-SQL language.
- How to access: Select the SQL analytics endpoint in the workspace view, or switch to SQL analytics endpoint mode in Lakehouse Explorer.
- What you can do: Analyze data, create functions, build views, and set up SQL security.
- Creating a Lakehouse: Every Lakehouse automatically comes with a SQL analytics endpoint. As soon as your data lands in a delta table in the Lakehouse, you can start analyzing it with T-SQL right away.
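For example, once a delta table exists in the Lakehouse, you can query it with ordinary T-SQL from the endpoint. The table and column names below (`sales`, `Region`, `Amount`) are hypothetical placeholders, not part of any real Lakehouse:

```sql
-- Hypothetical delta table 'sales': total revenue per region
SELECT Region, SUM(Amount) AS TotalAmount
FROM sales
GROUP BY Region
ORDER BY TotalAmount DESC;
```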

SQL analytics endpoint read-only mode
The SQL analytics endpoint operates in read-only mode over lakehouse delta tables: you can read data from delta tables, but not modify it. You can, however, save functions, create views, and set SQL object-level security.
External delta tables created with Spark code won't be visible to the SQL analytics endpoint. Use shortcuts in Table space to make external delta tables visible to the SQL analytics endpoint.
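A sketch of what read-only mode permits and blocks; the table, view, and role names here are hypothetical:

```sql
-- Allowed: create a view over a delta table
CREATE VIEW dbo.vw_HighValueSales AS
SELECT Region, Amount
FROM sales
WHERE Amount > 1000;

-- Allowed: apply object-level security to the view
GRANT SELECT ON dbo.vw_HighValueSales TO [AnalystRole];

-- Not allowed: writes to delta tables fail because the endpoint is read-only
-- UPDATE sales SET Amount = 0;   -- raises an error
```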
Limitations of the SQL analytics endpoint
- Data must be in Delta Parquet format to be discovered automatically by the SQL analytics endpoint. (Delta Lake is the storage framework used to build a Lakehouse architecture.)
- Renamed columns in tables are not supported in the SQL analytics endpoint.
- Delta tables created outside the /tables folder won't be visible in the SQL analytics endpoint. If you don't see a table in the warehouse, check its location; only tables in the /tables folder are visible.
- Some columns from Spark delta tables may not appear in SQL analytics endpoint tables. Check the supported data types for details.
- Adding foreign key constraints to SQL analytics endpoint tables may restrict further schema changes. If delta columns aren't showing as expected, check for foreign key constraints that may be blocking updates.
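When a table seems to be missing (for example, one created outside the /tables folder), you can list what the endpoint actually surfaces through the standard INFORMATION_SCHEMA views:

```sql
-- List the tables visible to the SQL analytics endpoint
SELECT TABLE_SCHEMA, TABLE_NAME
FROM INFORMATION_SCHEMA.TABLES
WHERE TABLE_TYPE = 'BASE TABLE'
ORDER BY TABLE_SCHEMA, TABLE_NAME;
```

If the table isn't in this list, verify its storage location and format before troubleshooting further.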