Native Support for Unity Catalog Metric Views
📌 Summary
We would like to propose native support for Databricks Unity Catalog Metric Views in dbt-databricks, allowing users to define metric views directly in their model YAML files using a YAML-based configuration (aligned with the official Databricks Metric View YAML Reference).
The idea is to extend dbt-databricks with a macro-based mechanism (or native materialization) that automatically generates CREATE METRIC VIEW statements after model runs, based on metadata stored in meta.metric_view.
✅ Motivation
- Metric Views in Unity Catalog allow structured, governed metric definitions close to the data, ideal for semantic layer modeling and BI tool integrations (e.g., Power BI, Tableau).
- dbt already defines the business logic and table structure – Metric Views are a natural extension.
- Users would benefit from defining metric metadata next to their models, without needing to maintain SQL scripts separately.
- This promotes DRY modeling, observability, and automated quality metrics across the data platform.
🧱 Proposed Solution
We suggest one of the following approaches:
Option A – Macro-based Implementation
A macro like generate_metric_views() (invoked via on-run-end or dbt run-operation) reads the model meta.metric_view entries and creates Metric Views via dynamic SQL.
Option B – New Materialization metric_view
Users define dedicated models with materialized: metric_view, and the adapter compiles them into Metric View SQL using Databricks' YAML format.
🧬 YAML Example in model.yml
```yaml
models:
  - name: customers
    description: "Customer master data"
    meta:
      metric_view:
        enabled: true
        name: customer_metrics
        description: "Customer KPIs"
        filter: "created_at >= current_date() - INTERVAL 30 DAYS"
        dimensions:
          - name: country
          - name: created_at
        measures:
          - name: total_customers
            expression: count(*)
          - name: new_customers_last_7d
            expression: count_if(created_at >= current_date() - 7)
        windows:
          - name: rolling_7d
            type: trailing
            duration: 7 days
```
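For illustration, the statement generated from the YAML above might look like the following. The catalog and schema (`main.analytics`) are placeholders, and the exact Public Preview syntax may differ:

```sql
-- Hypothetical generated DDL for the customers example above
-- (catalog/schema names are placeholders)
CREATE OR REPLACE VIEW main.analytics.customer_metrics
WITH METRICS
LANGUAGE YAML
AS $$
version: 0.1
source: main.analytics.customers
filter: created_at >= current_date() - INTERVAL 30 DAYS
dimensions:
  - name: country
    expr: country
  - name: created_at
    expr: created_at
measures:
  - name: total_customers
    expr: count(*)
  - name: new_customers_last_7d
    expr: count_if(created_at >= current_date() - 7)
$$
```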
🛠️ Macro Example
```sql
{% macro generate_metric_views() %}
  {% for node in graph.nodes.values() if node.resource_type == 'model' and node.meta.metric_view is defined %}
    {% set mv = node.meta.metric_view %}
    {% if mv.enabled %}
      {% set catalog = node.database %}
      {% set schema = node.schema %}
      {% set view_name = mv.name %}
      {% set table_name = node.alias %}
      {% set ddl %}
CREATE OR REPLACE VIEW {{ catalog }}.{{ schema }}.{{ view_name }}
WITH METRICS
LANGUAGE YAML
AS $$
version: 0.1
source: {{ catalog }}.{{ schema }}.{{ table_name }}
{% if mv.filter %}filter: {{ mv.filter }}{% endif %}
dimensions:
{% for dim in mv.dimensions %}
  - name: {{ dim.name }}
    expr: {{ dim.expr | default(dim.name) }}
{% endfor %}
measures:
{% for meas in mv.measures %}
  - name: {{ meas.name }}
    expr: {{ meas.expression | default(meas.expr) }}
{% endfor %}
$$
      {% endset %}
      {# Execute the DDL so the macro works via run-operation or on-run-end #}
      {% do run_query(ddl) %}
    {% endif %}
  {% endfor %}
{% endmacro %}
```
📎 Hook Integration (optional)
```yaml
on-run-end: "{{ generate_metric_views() }}"
```
Or run manually via:
```
dbt run-operation generate_metric_views
```
🔍 Considerations
- Support for Metric Views requires Databricks Runtime 16.4+.
- Metric Views are currently in Public Preview, so syntax or behavior may evolve.
- Catalog & schema resolution should respect `target` or node attributes (e.g. `node.database`, `node.schema`).
- Ideally, we validate required YAML fields (`name`, `measures`, `dimensions`) and raise compilation errors if missing.
💡 Optional Enhancements
- Add a new materialization `metric_view` to show dependencies in the DAG.
- Add `dbt metric-view generate` as a CLI command.
- Create dbt tests for Metric View outputs via `MEASURE()` queries.
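As a sketch of the last point, a dbt singular test could query the metric view with `MEASURE()` and fail if any row violates an expectation. The view and measure names below are taken from the earlier example, with a placeholder catalog/schema:

```sql
-- Hypothetical singular test: fails if any country reports
-- a non-positive customer count.
SELECT *
FROM (
  SELECT
    country,
    MEASURE(total_customers) AS total_customers
  FROM main.analytics.customer_metrics  -- placeholder location
  GROUP BY country
)
WHERE total_customers <= 0
```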
📚 References
- Databricks Metric Views
- Metric View YAML Reference
- CREATE METRIC VIEW SQL Syntax
- Windowing Measures
- dbt Adapter Plugin Guide
🙏 Why this belongs in dbt-databricks
- It enhances native compatibility with the Databricks Semantic Layer.
- It closes the loop between dbt modeling and metric definitions in Unity Catalog.
- It would encourage best practices for governed KPIs in enterprise-grade environments.
@LSchuetzi84 this is definitely something we want to bring to this library. And thank you for filing a comprehensive issue. For a feature of this scope, we need to coordinate a bit with dbt Labs so it may take us some time, but it's on our radar for the near future.
How (if at all) should this relate to MetricFlow? Can/should we define metrics in native dbt (MetricFlow) and convert them on the fly at run time into Metric Views on Databricks (vs. defining them Databricks-style in model.yml)?
@benc-db what is the status quo here? Have you coordinated with dbt Labs on this issue?
The companies are still debating on what is the best way to bring this to dbt-databricks unfortunately. A core challenge is that metricflow semantics and uc metric view semantics don't perfectly align :/
Why not simply allow to run the SQL definition while you figure out the yml specification with dbt Labs? I.e., add metric_view materialization that will run the CREATE OR REPLACE VIEW ... on top of SQL files of the format
```sql
WITH METRICS
LANGUAGE YAML
AS $$
version: 1.1
source: "{{ ref('...') }}"
...
$$
```
(We did exactly this in order to manage metric views in dbt until officially supported.)
Can you elaborate on how you're managing metric views in dbt?
- Custom materialization — simply took the code from the materialized_view implementation and adjusted as needed
- Metric views in SQL files starting with `WITH METRICS` (and using `source: "{{ ref('...') }}"`)
That's it
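For reference, the custom materialization described above could be a thin wrapper around a `CREATE OR REPLACE VIEW`. A minimal, untested sketch (simplified from dbt's view materialization, omitting grants and docs persistence):

```sql
{% materialization metric_view, adapter='databricks' %}
  {%- set target_relation = this.incorporate(type='view') -%}

  {% call statement('main') -%}
    {# The model body is expected to start with
       WITH METRICS LANGUAGE YAML AS $$ ... $$ #}
    create or replace view {{ target_relation }}
    {{ sql }}
  {%- endcall %}

  {{ return({'relations': [target_relation]}) }}
{% endmaterialization %}
```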
Any news about this @benc-db. They are still debating? 🥲