In today’s post, the third in the Terraforming Databricks series, we’ll break down the process of setting up a connection to an Azure SQL Database as part of the Lakehouse Federation functionality. Before diving into the implementation, let’s first define what Lakehouse Federation is. Here’s a brief description from the documentation: Lakehouse Federation is […]
Author: Tomasz Kostyrka
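To make the teaser concrete, here is a minimal Terraform sketch of what such a federated connection can look like; this is an illustration under assumptions, not the post’s exact code, and the resource names, host, credentials, and database are placeholders (a workspace-level Databricks provider with Unity Catalog enabled is assumed).

```hcl
# Lakehouse Federation connection to an Azure SQL Database (placeholder values).
resource "databricks_connection" "azure_sql" {
  name            = "azure-sql-conn"
  connection_type = "SQLSERVER"
  comment         = "Federated connection to an Azure SQL Database"

  options = {
    host                   = "example-server.database.windows.net" # placeholder host
    port                   = "1433"
    user                   = var.sql_user     # assumed variable
    password               = var.sql_password # assumed variable
    trustServerCertificate = "true"
  }
}

# Foreign catalog that exposes the remote database through Unity Catalog.
resource "databricks_catalog" "azure_sql_federated" {
  name            = "azure_sql_federated"
  connection_name = databricks_connection.azure_sql.name

  options = {
    database = "example_db" # placeholder database name
  }
}
```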
Terraforming Databricks #2: Catalogs & Schemas
In the first post of this series, we discussed the process of creating a metastore, which is essential for enabling Unity Catalog on workspaces. In this part, I would like to cover the provisioning of the key elements in Unity Catalog’s object model: catalogs and schemas. The goal of this article is […]
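As a quick reference, a catalog with a schema can be provisioned roughly as in the sketch below; the names and comments are illustrative placeholders rather than the article’s actual configuration.

```hcl
# Unity Catalog catalog (illustrative names and values).
resource "databricks_catalog" "analytics" {
  name    = "analytics"
  comment = "Catalog managed by Terraform"

  properties = {
    managed_by = "terraform"
  }
}

# Schema created inside the catalog above.
resource "databricks_schema" "bronze" {
  catalog_name = databricks_catalog.analytics.name
  name         = "bronze"
  comment      = "Raw ingestion layer"
}
```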
Terraforming Databricks #1: Unity Catalog Metastore
Over the past two years, we have participated in numerous projects where Azure Databricks was implemented from the ground up. Each of these deployments allowed us to learn something new, verify earlier solutions, and ultimately develop a methodology for deploying Azure Databricks in a standardized, enterprise-scale-ready manner. As a result, the newly […]
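For orientation, creating a metastore and assigning it to a workspace can look roughly like the sketch below; the region, names, and workspace ID are placeholders, and an account-level Databricks provider is assumed.

```hcl
# Unity Catalog metastore (account-level provider assumed; values are placeholders).
resource "databricks_metastore" "this" {
  name          = "primary-metastore"
  region        = "westeurope"
  force_destroy = false
  # storage_root can be added here if a metastore-level root location is desired.
}

# Attach the metastore to an existing workspace (placeholder workspace ID).
resource "databricks_metastore_assignment" "this" {
  metastore_id = databricks_metastore.this.id
  workspace_id = var.databricks_workspace_id # assumed variable
}
```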
Utilizing YAML Anchors in Databricks Asset Bundles
We all know what YAML is – it’s like JSON, just with indentation instead of brackets. Easier to write and read. That’s it, isn’t it? In most situations… yes. But if we look a little deeper, we’ll find features that many people have no idea exist. And let me emphasize right away, I’m not judging […]