A foreign catalog in Databricks is a specialized type of catalog that enables users to access and query data stored in external databases as if it were part of their own Databricks workspace. Currently, foreign catalogs can be created for multiple sources, including SQL Server and Synapse Analytics. This feature is particularly valuable as […]
Author: Adrian Chodkowski
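For context, creating and querying a foreign catalog might look roughly like the sketch below, run from a Databricks notebook where `spark` is available. The connection name, catalog name, host, secret scope, and table names are placeholders, not values from the post.

```python
# Minimal sketch: expose an external SQL Server database as a foreign catalog.
# All names, hosts, and secrets below are illustrative placeholders.

# 1. Create a Unity Catalog connection to the external SQL Server instance.
spark.sql("""
    CREATE CONNECTION IF NOT EXISTS sqlserver_conn TYPE sqlserver
    OPTIONS (
      host 'myserver.database.windows.net',
      port '1433',
      user secret('my-scope', 'sql-user'),
      password secret('my-scope', 'sql-password')
    )
""")

# 2. Create a foreign catalog that mirrors a database behind that connection.
spark.sql("""
    CREATE FOREIGN CATALOG IF NOT EXISTS sales_external
    USING CONNECTION sqlserver_conn
    OPTIONS (database 'SalesDb')
""")

# 3. Query the external tables as if they lived in the workspace.
spark.table("sales_external.dbo.orders").limit(10).show()
```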
Microsoft Fabric: Using Workspace Identity for Authentication
One of the newest features available in Microsoft Fabric is the ability to use Workspace Identity to authenticate to an external Azure Data Lake Storage Gen2 account. I find this to be one of the most important features because it significantly simplifies the entire process of authentication and authorization. Workspace Identity is not a new concept; it was […]
Executing SQL queries from Azure DevOps using Service Connection credentials
Executing SQL queries on Azure SQL Database using Azure DevOps can be a complex and challenging task, particularly when it comes to establishing a secure and reliable connection. This option can be valuable in numerous scenarios. For instance, you can insert pipeline metadata into the database to track deployment information or changes. This technique also […]
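As a rough illustration of this pattern (not necessarily the exact approach from the post), a script run inside an AzureCLI pipeline task can reuse the service connection's Azure CLI login to get a token for Azure SQL and insert pipeline metadata. The server, database, and table names are placeholders.

```python
# Sketch: insert pipeline metadata into Azure SQL Database from an Azure DevOps
# AzureCLI task, reusing the service connection's Azure CLI login for auth.
# Server, database, and table names are illustrative placeholders.
import os
import struct

import pyodbc
from azure.identity import AzureCliCredential

# Token for Azure SQL, issued to the identity behind the service connection.
token = AzureCliCredential().get_token("https://database.windows.net/.default").token

# pyodbc expects the token as a length-prefixed UTF-16-LE byte string.
token_bytes = token.encode("utf-16-le")
token_struct = struct.pack(f"<I{len(token_bytes)}s", len(token_bytes), token_bytes)
SQL_COPT_SS_ACCESS_TOKEN = 1256  # ODBC attribute for passing an access token

conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:myserver.database.windows.net,1433;"
    "Database=metadata_db;Encrypt=yes;",
    attrs_before={SQL_COPT_SS_ACCESS_TOKEN: token_struct},
)

# Record which build touched the environment, using predefined pipeline variables.
cur = conn.cursor()
cur.execute(
    "INSERT INTO dbo.DeploymentLog (BuildId, SourceBranch) VALUES (?, ?)",
    os.environ.get("BUILD_BUILDID"),
    os.environ.get("BUILD_SOURCEBRANCH"),
)
conn.commit()
```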
Setup Git credentials for Service Principal in Azure Databricks
Introduction: Databricks Jobs can execute code stored locally (1 in the picture below) or stored in a remote Git repository (2). The second approach simplifies the creation and management of production jobs while enabling automated continuous deployment. It eliminates the need to create and maintain a separate production repository within Azure Databricks, reducing the burden of […]
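For the Git-based approach, the service principal that runs the job needs its own Git credentials in the workspace. A minimal sketch of registering them through the Databricks Git Credentials REST API is shown below; the workspace URL, the service principal's Databricks token, and the Azure DevOps token are placeholders, and how you obtain them depends on your setup.

```python
# Sketch: register Git credentials for a service principal via the Databricks
# Git Credentials API. The credential is created for whichever identity
# authenticates the request, so the call is made with the service principal's
# token. All values below are illustrative placeholders.
import requests

WORKSPACE_URL = "https://adb-1234567890123456.7.azuredatabricks.net"
SP_DATABRICKS_TOKEN = "<token obtained for the service principal>"
AZURE_DEVOPS_TOKEN = "<PAT or Entra ID token accepted by Azure DevOps>"

resp = requests.post(
    f"{WORKSPACE_URL}/api/2.0/git-credentials",
    headers={"Authorization": f"Bearer {SP_DATABRICKS_TOKEN}"},
    json={
        "git_provider": "azureDevOpsServices",
        "git_username": "service-principal",  # display name only for Azure DevOps
        "personal_access_token": AZURE_DEVOPS_TOKEN,
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json())  # credential_id registered for the service principal
```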