Modular data modeling
Transform raw data into human-usable metrics.
Why modular
How you transform data has a huge impact on the happiness and productivity of your team. Have you ever attempted to debug a long stored procedure that someone else wrote?
With modular, SQL-first data modeling, anyone on the data team can make sense of the work, and build on it when the time comes.
Modularity
Leverage for data teams
dbt is a framework for writing high-leverage SQL transformation code.
Dependency management
Layer your models with the ref() function, and dbt automatically determines lineage.
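As a minimal sketch (the model and column names here are hypothetical), a downstream model selects from an upstream one with ref(), and dbt infers the dependency:

    -- models/customer_orders.sql (hypothetical model)
    -- Because this model selects from ref('stg_orders'), dbt builds
    -- stg_orders first and records the dependency in the lineage graph.
    select
        customer_id,
        count(*) as order_count
    from {{ ref('stg_orders') }}
    group by customer_id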
Clear project structure
Express your data warehouse design in terms of sources, staging models and marts.
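For illustration, a staging model typically reads from a declared source, and marts then build on staging models via ref(). The source and table names below are made up:

    # models/staging/sources.yml (hypothetical source declaration)
    version: 2
    sources:
      - name: shop
        tables:
          - name: raw_orders

    -- models/staging/stg_orders.sql (hypothetical staging model)
    -- Renames and lightly cleans columns from the raw source table.
    select
        id as order_id,
        customer_id,
        ordered_at
    from {{ source('shop', 'raw_orders') }}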
Macros and packages
Develop analytics code that writes itself, avoiding repetition of frequently used statements (e.g., with the dbt_utils package).
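A minimal sketch of a custom macro; the macro and column names are hypothetical. Packages such as dbt_utils ship similar reusable macros you can call the same way.

    -- macros/cents_to_dollars.sql (hypothetical macro)
    {% macro cents_to_dollars(column_name, precision=2) %}
        round({{ column_name }} / 100.0, {{ precision }})
    {% endmacro %}

    -- Usage inside any model:
    select
        order_id,
        {{ cents_to_dollars('amount_cents') }} as amount
    from {{ ref('stg_orders') }}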
Quality-of-life considerations
dbt makes easy things easy, and hard things possible.
Materializations
Define materializations in code
Set your materialization logic inline with your transformations.
dbt supports materializing models as tables, incremental tables, views, or a custom materialization of your own design.
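For example, the materialization can be set inline with a config() block at the top of the model (the model below is hypothetical); it can also be set per folder in dbt_project.yml.

    -- models/marts/dim_customers.sql (hypothetical model)
    {{ config(materialized='table') }}

    select
        customer_id,
        first_name,
        last_name
    from {{ ref('stg_customers') }}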
Incrementality
Handle large datasets gracefully
Long-running queries can slow development and drive up your cloud data warehouse bill.
dbt's support for incremental models lets you process only new or updated data, improving performance and reducing compute costs.
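A minimal sketch of an incremental model, assuming a hypothetical events source with an event_timestamp column; on incremental runs, the is_incremental() block restricts the query to rows newer than what the target table already holds.

    -- models/events.sql (hypothetical incremental model)
    {{ config(materialized='incremental', unique_key='event_id') }}

    select
        event_id,
        user_id,
        event_timestamp
    from {{ source('app', 'raw_events') }}

    {% if is_incremental() %}
    -- {{ this }} refers to the table built by previous runs of this model
    where event_timestamp > (select max(event_timestamp) from {{ this }})
    {% endif %}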
Snapshots
Track slowly-changing dimensions
dbt's snapshots record changes to a mutable table over time, making it easier to "look back" at previous data states.
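A sketch of a timestamp-strategy snapshot; the table and column names are hypothetical. dbt adds dbt_valid_from and dbt_valid_to columns so each row's history is preserved.

    -- snapshots/customers_snapshot.sql (hypothetical snapshot)
    {% snapshot customers_snapshot %}

    {{
        config(
            target_schema='snapshots',
            unique_key='customer_id',
            strategy='timestamp',
            updated_at='updated_at'
        )
    }}

    select * from {{ source('shop', 'raw_customers') }}

    {% endsnapshot %}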
Model data where it lives
Whether your analytics data is stored in a cloud warehouse, data lake, lake house, or beach house, you can model and transform it with dbt.
For data models of any shape
Whether your team prefers its data tall or short, narrow or wide, your data transformation tool should support your efforts.