A foreign catalog in Databricks is a specialized type of catalog that enables users to access and query data stored in external databases as if it were part of their own Databricks workspace. Currently, foreign catalogs can be created for multiple sources, including SQL Server, Synapse Analytics, and more. This feature is particularly valuable as […]
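Since this series provisions Databricks objects with Terraform, a foreign catalog can be sketched with the provider's `databricks_catalog` resource pointing at an existing Lakehouse Federation connection. This is a minimal sketch: the connection name `sql_server_conn`, the catalog name, and the `AdventureWorks` database are placeholder assumptions, not values from the posts.

```hcl
# Sketch: a foreign catalog backed by an existing Lakehouse Federation
# connection. "sql_server_conn" and "AdventureWorks" are placeholder names.
resource "databricks_catalog" "foreign" {
  name            = "sqlserver_foreign"
  comment         = "Foreign catalog exposing an external SQL Server database"
  connection_name = "sql_server_conn" # must reference an existing connection

  options = {
    database = "AdventureWorks" # remote database to expose in the workspace
  }
}
```

Once applied, the tables of the remote database show up under this catalog and can be queried like any other Unity Catalog object.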
Latest Posts
Terraforming Databricks #3: Lakehouse Federation
In today’s post, the third in the Terraforming Databricks series, we’ll break down the process of setting up a connection to an Azure SQL Database as part of the Lakehouse Federation functionality. Before diving into the implementation, let’s first define what Lakehouse Federation is. Here’s a brief description from the documentation: Lakehouse Federation is […]
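The connection step described above can be sketched with the provider's `databricks_connection` resource. This is an illustrative fragment, not the post's exact code: the host, user, and password values are placeholders, and in practice the credentials would come from a secret store rather than plain text.

```hcl
# Sketch: a Lakehouse Federation connection to an Azure SQL Database.
# All option values below are placeholders.
resource "databricks_connection" "azure_sql" {
  name            = "azure_sql_conn"
  connection_type = "SQLSERVER"
  comment         = "Connection to an Azure SQL Database"

  options = {
    host     = "myserver.database.windows.net"
    port     = "1433"
    user     = "sqladmin"             # placeholder
    password = "replace-with-secret"  # placeholder; use a secret in practice
  }
}
```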
Microsoft Fabric: Using Workspace Identity for Authentication
One of the newest features available in Microsoft Fabric is the ability to use Workspace Identity to authenticate with external Data Lake Storage Gen2. I find this to be one of the most important features because it significantly simplifies the entire process of authentication and authorization. Workspace Identity is not a new concept; it was […]
Terraforming Databricks #2: Catalogs & Schemas
In the first post of this series, we discussed the process of creating a metastore, which is essential for enabling Unity Catalog on workspaces. In this part, I would like to cover the provisioning of key elements in Unity Catalog’s object model – specifically, catalogs and schemas. The goal of this article is […]
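Provisioning these two object types can be sketched with the `databricks_catalog` and `databricks_schema` resources, with the schema referencing the catalog it lives in. The names below are illustrative assumptions, not the post's actual examples.

```hcl
# Sketch: a standard Unity Catalog catalog with one schema inside it.
# "analytics" and "bronze" are placeholder names.
resource "databricks_catalog" "main" {
  name    = "analytics"
  comment = "Catalog managed by Terraform"
}

resource "databricks_schema" "bronze" {
  catalog_name = databricks_catalog.main.name # implicit dependency on the catalog
  name         = "bronze"
  comment      = "Raw ingestion layer"
}
```

Referencing `databricks_catalog.main.name` (rather than hard-coding the string) lets Terraform order the creates correctly: the catalog is always provisioned before the schema.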