Maximize the business value of your data platform with dbt
last updated on Feb 11, 2026
Every data leader today faces a familiar paradox: as technology becomes more efficient, total costs don't typically fall - they often rise. This phenomenon, known as Jevons Paradox, explains why fuel-efficient cars don't reduce total spending on driving (we just drive more), and why cheaper BI doesn't shrink analytics spend (we just build more dashboards).
We observe this pattern at dbt Labs, too. Every day across our platform, teams run nearly a million jobs that power analytics products and AI workflows, with the majority running on an hourly cadence. But most of the time, the models behind those runs haven't changed. That means teams are rebuilding thousands of models that produce exactly the same results as the run before. At scale, that's inefficient, costly, and a poor use of the compute you're already paying for.
We need to think about data work not in terms of model runs or data products produced, but in terms of business value delivered and return on investment generated.
Understanding the ROI equation
ROI is a fairly simple equation: value divided by cost. Every decision we make about data - tools, people, pipelines - ultimately rolls up to this equation.
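Written out, with the terms used throughout this post:

\[
\text{ROI} = \frac{\text{value delivered}}{\text{cost to deliver it}}
\]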

There are three levers we can pull to improve ROI.
The first is obvious: increase the value our work delivers. That could mean faster insights, higher adoption, or more trusted data - anything that grows the value numerator.
The second lever is cost. You might think the goal is simply to cut costs, but in many cases you can't - you're already paying for the tools. What you can do is make your use of them more efficient: increase efficiency, reduce rework, and drive more adoption of the tools you've already paid for. That's how you optimize the denominator.
The third lever is improving how we measure ROI itself. If we can't accurately define value and cost, we can't effectively measure ROI. Everything we do should fall into one of these three buckets.
Building trust through quality
When we talk about value, we often start with what our teams make: models, dashboards, data products. But none of that matters unless it creates something the business can actually feel.
Value goes up when a decision gets made faster, when a forecast is more accurate, or when a customer experience improves. All of this rests on trust - trust in the data behind decisions, trust in forecasts, and trust from the people who use them.
The challenge is not to build more - it's to make what we build matter more by building trust.
Before you can establish trust, you need to consistently deliver quality. The way to prevent data quality issues is the same way software engineers prevent bugs before they reach production: through a well-defined lifecycle, such as the Analytics Development Lifecycle.
With dbt, it starts with testing. Every dbt model and transformation can be tested automatically, so you catch broken logic before it reaches the business. Continuous integration takes it a step further - every change is validated in isolation before it hits production. That gives teams confidence that what they deploy is clean and stable.
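As a minimal sketch of what that declaration looks like (the model and column names here are hypothetical), generic tests live in a YAML file alongside the models, and dbt runs them on every build:

```yaml
# models/schema.yml - model and column names are hypothetical
version: 2

models:
  - name: orders
    description: One row per customer order
    columns:
      - name: order_id
        tests:
          - unique      # fail the build on duplicate keys
          - not_null
      - name: status
        tests:
          - accepted_values:
              values: ['placed', 'shipped', 'completed', 'returned']
```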
Documentation and data lineage make that trust visible. People can see where the data came from, what changed, and who changed it. They stop guessing and can trace every number back to its source.
Peer review via pull requests creates shared accountability. When someone has looked at your work, asked questions, and approved it before it goes live, you can feel good about not just the models, but the downstream data products those models create.
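Here's a hedged sketch of how testing, CI, and peer review can come together on every pull request. It assumes GitHub Actions, a Snowflake adapter, and a profiles.yml that reads credentials from environment variables - an illustrative workflow, not a dbt-provided template:

```yaml
# .github/workflows/dbt-ci.yml - illustrative; adapt to your stack
name: dbt CI
on: pull_request

jobs:
  build-and-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install dbt-core dbt-snowflake   # swap in your adapter
      - run: dbt deps
      # Build and test only the models this PR changes, plus everything
      # downstream of them. ./prod-artifacts must hold the production
      # manifest, fetched by an earlier step omitted here.
      - run: dbt build --select state:modified+ --state ./prod-artifacts
        env:
          # assumes profiles.yml reads credentials via env_var()
          SNOWFLAKE_PASSWORD: ${{ secrets.SNOWFLAKE_PASSWORD }}
```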
Even when we build great systems, things go wrong. The difference between a trusted data team and an untrusted one is how we communicate around those issues.
In software, users expect visibility. When an app has an outage or service is degraded, you get a status page, not silence. That transparency builds credibility, even when the system is down.
We should take the same approach with data. If something breaks, we need to make it visible to the people who consume data. Don't bury it in a ticket - embed it as a trust signal that says this data is healthy, or this dashboard has problems.
This is where health tiles built into tools like Tableau become essential. When users know we're watching and we're honest about issues, they trust us more. Silence erodes confidence, but visibility builds it.
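In dbt, the natural hook for this is an exposure: declare the dashboard in your project, and its health tile can reflect the freshness and test status of everything it depends on. A minimal sketch - the exposure name, model, owner, and URL below are all hypothetical:

```yaml
# models/exposures.yml - all names and the URL are hypothetical
version: 2

exposures:
  - name: weekly_revenue_dashboard
    label: Weekly Revenue Dashboard
    type: dashboard
    url: https://bi.example.com/dashboards/revenue
    owner:
      name: Data Platform Team
      email: data@example.com
    depends_on:
      - ref('fct_orders')
    description: Executive revenue dashboard embedded in Tableau.
```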
Speed as a value driver
Another side of the value equation is speed. With the new dbt Fusion engine, we've completely changed the game for developer experience. Native SQL comprehension provides real-time feedback whether you're writing in VS Code, dbt Studio, or dbt Canvas. You stay in flow with IntelliSense, real-time error detection, and hover previews.
Beyond the developer experience, there's pure speed: Fusion parses projects 30x faster or more than dbt Core. As you're building trusted data, you're also working much, much faster.
Optimizing costs through intelligent orchestration
Think about how a warehouse processes data - it understands a single SQL statement at a time. It's efficient within that boundary, but it has no concept of the larger picture.
Fusion broadens that understanding. It knows how queries connect across your data models and teams. That means it can avoid running redundant transformations, skip unchanged work, and orchestrate intelligently.
State-aware orchestration is a concrete example. Today, when you run dbt build, everything you select gets built, even if the models haven't changed. With state-aware orchestration powered by Fusion, dbt knows exactly what code or data has changed and rebuilds only what's needed - not the entire DAG, just the models whose results would actually change.
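dbt Core users may recognize the manual ancestor of this pattern: state-based selection against production artifacts. A sketch of the contrast - the artifacts path is illustrative, and state-aware orchestration in Fusion makes this decision automatically rather than via flags:

```bash
# Today: every selected model is rebuilt, changed or not.
dbt build

# Manual state comparison in dbt Core: rebuild only modified models
# and their downstream dependents. ./prod-artifacts holds the manifest
# from the last production run (path is illustrative).
dbt build --select state:modified+ --state ./prod-artifacts
```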
The efficiencies extend beyond orchestration to testing. Today in dbt, we test everything at the end of every model build. With efficient testing, we only run the tests that are actually needed: if a unique test already passed upstream, and no new logic or code invalidates it, we reuse the previous passing result instead of re-running the test. This speeds up run times and optimizes dbt build costs by avoiding unnecessary computation.
The results are significant. We're seeing roughly 10%+ of compute saved with state-aware orchestration out of the box, another 15%+ with tuned configurations, and another 4% with advanced configurations - upwards of 29% total compute savings.
Stop looking at vanity metrics!
Are we measuring the right things?
In many cases, the answer is probably no.
It's easy to measure new dashboards built, number of tables created, or how much data we're working with. These are fun numbers to play around with, but they add no real value.
These numbers don't tell you whether your business is improving, getting faster, getting smarter, or making better decisions. They don't influence the value equation. They're just easy to measure. They're called vanity metrics.
Let's resolve to stop looking at vanity metrics and try something else. How about:
- Time to onboard - how many hours between a new hire and their first commit?
- Time to insight - how long from first commit to production-ready?
- Optimizing compute - how are we using models more efficiently with our runs and tests?
These are examples of how we can optimize our measurements and help our customers optimize their ROI in the process.
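To make the first of these concrete, here's a hedged SQL sketch. The hr.employees and git.commits tables, their columns, and the Snowflake-style datediff are all hypothetical stand-ins for whatever systems hold your hiring and commit history:

```sql
-- Hours from hire to first commit, per new hire.
-- Source tables and columns are hypothetical.
select
    e.employee_id,
    e.hire_ts,
    min(c.committed_ts)                              as first_commit_ts,
    datediff('hour', e.hire_ts, min(c.committed_ts)) as hours_to_first_commit
from hr.employees as e
join git.commits as c
    on c.author_email = e.email
group by e.employee_id, e.hire_ts
```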
A framework for quantifying business value
At dbt Labs, we're focused on helping customers use data more effectively to support critical initiatives. This means partnering with customer leadership and practitioners to align on business initiatives and validate the financial benefit expected from dbt.
The work involves identifying business priorities, then documenting how dbt saves time and money and helps achieve those priorities. The final output is a measurable business case that quantifies return on investment based on cost and time savings, while highlighting how dbt supports the business.
Every customer leader talks about embedding reliable data-driven decision-making across their business. At a high level, these efforts boil down to three common themes.
- Supporting revenue.
- Enabling efficiency and cost savings.
- Building and modernizing technology.
Building the business case
When building a value case, there's a consistent framework that follows how customers evaluate dbt.
First is tooling and maintenance. What's the cost of the dbt platform versus the current tooling? For self-hosted customers, we quantify the total cost of ownership for maintenance and scale. For native warehouse tooling, it's time saved on stored procedures and notebooks: time spent using your tool rather than maintaining it.
Next is warehouse optimization: maximizing warehouse value through improved governance and eliminating redundant model builds with features like state-aware orchestration.
Finally, there's operational time savings - improving collaboration, accelerating pipeline builds, and improving data quality. This frees up time to be more productive and focus more hours on business impact.
All these savings and freed-up hours are allocated toward broader business initiatives: driving revenue, reducing costs, and modernizing tech stacks to reduce risk.

Three habits to start today
Considering your own business initiatives, here are three habits to get the most ROI from your dbt investment.
- Bring engineering discipline to analytics. Version control, CI/CD, testing, and ownership - this is the foundation of value. If you're already a dbt user, you're doing these things and probably crushing this discipline.
- Optimize your costs. Lineage, orchestration, and remediation make your spend predictable. Limiting or eliminating model rebuilds, unnecessary rework, and questions from confused teammates will save costs and headaches.
- Make ROI visible to leadership. Think about how to connect data analytics and AI investments to outcomes. Speak in the language of the business by focusing on the measurables that matter.
When you do this, you're not just managing a data platform - you're delivering quantifiable business value that transforms how your organization operates and builds trust.
VS Code Extension
The free dbt VS Code extension is the best way to develop locally in dbt.