The DBSQL Metrics project demonstrates how to use Databricks Asset Bundles with Unity Catalog Metric Views to create an end-to-end analytics solution on Databricks. This project deploys a simple dimensional model sourced from System Tables, Metric Views, and an AI/BI dashboard, all packaged neatly in a bundle.
This project also shows how to parameterize AI/BI dashboard datasets with DABs, which is not natively supported. Follow issue 1915 for updates.
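Until native support lands, one possible workaround is a pre-deploy step that rewrites placeholder tokens in the dashboard's JSON definition. The sketch below is illustrative only: the `{{catalog}}`/`{{schema}}` token names and the `datasets`/`query` structure are assumptions, not necessarily this project's actual implementation or the serialized dashboard schema.

```python
import json

def parameterize_dashboard(dashboard_json: str, catalog: str, schema: str) -> str:
    """Substitute catalog/schema placeholder tokens in a serialized dashboard.

    Assumes (hypothetically) that dataset queries embed {{catalog}} and
    {{schema}} tokens that DABs cannot substitute natively.
    """
    dashboard = json.loads(dashboard_json)
    for dataset in dashboard.get("datasets", []):
        query = dataset.get("query", "")
        dataset["query"] = (
            query.replace("{{catalog}}", catalog).replace("{{schema}}", schema)
        )
    return json.dumps(dashboard)

# Example: a minimal dashboard with one dataset whose query uses the tokens.
raw = json.dumps({
    "datasets": [
        {"name": "usage", "query": "SELECT * FROM {{catalog}}.{{schema}}.fact_usage"}
    ]
})
print(parameterize_dashboard(raw, "main", "dbsql_metrics"))
```

A script like this could run before `databricks bundle deploy`, writing the substituted file into the bundle's source tree.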
Install the Databricks CLI from https://docs.databricks.com/dev-tools/cli/install.html
Choose one of the following authentication methods:
Generate a Personal Access Token:
- Log into your Databricks workspace
- Click on your username in the top-right corner
- Select User Settings → Developer → Access tokens
- Click Generate new token
- Give it a name (e.g., "Local Development") and set expiration
- Copy the generated token
Configure the CLI with your PAT:

```bash
databricks configure --token --profile DEFAULT
```

You'll be prompted for:
- Databricks Host: https://your-workspace.cloud.databricks.com
- Token: paste your generated token

This will update the DEFAULT profile in `~/.databrickscfg`.
Configure OAuth:

```bash
databricks auth login --host https://your-workspace.cloud.databricks.com --profile PROD
```

This will:
- Open your browser for authentication
- Create a profile in `~/.databrickscfg`
- Store OAuth credentials securely
Check your configuration:

```bash
# List all profiles
cat ~/.databrickscfg
```

Your `~/.databrickscfg` should look like:

```ini
[DEFAULT]
host = https://your-workspace.cloud.databricks.com
token = dapi123abc...

[DEV]
host = https://dev-workspace.cloud.databricks.com
token = dapi456def...

[PROD]
host = https://prod-workspace.cloud.databricks.com
auth_type = databricks-cli
```

Create and activate a Python virtual environment to manage dependencies:
```bash
# Create virtual environment
$ python -m venv venv

# Activate virtual environment
# On macOS/Linux:
$ source venv/bin/activate
# On Windows:
$ venv\Scripts\activate

# Install required Python packages (none currently)
# $ pip install -r requirements-dev.txt
```

Update the variables in databricks.yml to match your environment. The dev target defaults to the `users` catalog and a schema based on the developer's name.
- catalog: The catalog name where your tables will be created
- schema: The schema name within the catalog
- warehouse_id: The ID of your SQL warehouse for production deployment. For development, the bundle will look up the ID based on the specified warehouse name (e.g., Shared Serverless).
- workspace.host: Your Databricks workspace URL
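The name-based warehouse lookup for development can be expressed with a bundle `lookup` variable. A minimal sketch (the warehouse name "Shared Serverless" is a placeholder for whatever exists in your workspace):

```yaml
variables:
  warehouse_id:
    description: SQL warehouse to run on
    lookup:
      warehouse: "Shared Serverless"
```

At deploy time, the bundle resolves the variable to the ID of the named warehouse.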
Example configuration for the prod target:

```yaml
targets:
  prod:
    mode: production
    workspace:
      host: https://your-workspace.cloud.databricks.com
    variables:
      warehouse_id: your_warehouse_id
      catalog: your_catalog
      schema: your_schema
```

Deploy to the dev target:

```bash
$ databricks bundle deploy --target dev --profile DEFAULT
```

Note: Since "dev" is specified as the default target in databricks.yml, you can omit the --target dev parameter. Similarly, --profile DEFAULT can be omitted if you only have one profile configured for your workspace.
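For reference, a dev target that declares itself the default might look like the following minimal sketch (your databricks.yml may carry additional dev-specific settings):

```yaml
targets:
  dev:
    mode: development
    default: true
```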
This deploys everything that's defined for this project, including:
- A job called `[dev yourname] dbsql_metrics_job`
- SQL metric views and dashboards
- All associated resources
You can find the deployed job by opening your workspace and clicking on Workflows.
Deploy and run the bundle in production:

```bash
$ databricks bundle deploy --target prod --profile PROD
$ databricks bundle run --target prod --profile PROD
```

For an enhanced development experience, consider installing:
- Databricks extension for Visual Studio Code: https://docs.databricks.com/dev-tools/vscode-ext.html
For comprehensive documentation on:
- Databricks Asset Bundles: https://docs.databricks.com/dev-tools/bundles/index.html
- CI/CD configuration: https://docs.databricks.com/dev-tools/bundles/index.html
- assets/: Images for the README
- resources/: Bundle resource definitions (jobs, dashboard)
- src/: Source files, including notebooks and the dashboard
- databricks.yml: Main bundle configuration file
