How Obie cut compute costs by 30%, reclaimed engineering hours, and built stronger governance

Elaine Green, Aika Zikibayeva

Last updated on Apr 24, 2026

At Obie, an embedded insurance platform serving real estate investors, the data reality looked a lot like that of a typical fast-growing startup. With a lean footprint of just 130 employees and a recent, high-stakes acquisition, the pressure on the company's data platform was immense. Behind the scenes, infrastructure costs were climbing and starting to create friction across the business.

The team adopted the dbt Fusion engine for transformations and deployed state-aware orchestration (SAO) to remove operational drag and give the business confidence in the numbers it depended on.

When the team started, Obie's data layer was built on the Data Vault 2.0 methodology, a pattern designed for large enterprises. For a smaller, fast-moving insurance tech company, it slowed development and carried overhead out of proportion to the company's size and pace. Obie made the call to re-architect entirely on Fusion, converting legacy models while preserving the underlying business logic. That migration is now over 90% complete.

SAO reduces warehouse costs by ~30% and makes more frequent refreshes possible

Previously, large portions of the pipeline were rebuilt on every run, whether upstream data had changed or not. As data volumes increased, that approach became expensive.

Moving to SAO on Fusion meant running only what was necessary. Compute usage dropped, and warehouse spend became more predictable. The team could refresh data more frequently, moving from daily runs to every two hours, without worrying about runaway costs.
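The core idea behind state-aware orchestration is to compare the current project state against the last successful run and execute only models whose definitions changed, plus everything downstream of them. A minimal sketch of that selection logic (the model names, checksums, and dependency graph below are illustrative, not Obie's actual project):

```python
# Sketch of state-aware model selection: rerun only models whose definition
# checksum changed since the last run, plus all models downstream of them.

def select_stale(previous, current, deps):
    """previous/current: model name -> checksum of its compiled definition.
    deps: model name -> list of upstream model names it depends on."""
    # A model is directly stale if it is new or its checksum changed.
    stale = {m for m, c in current.items() if previous.get(m) != c}
    # Propagate staleness downstream until a fixed point is reached.
    changed = True
    while changed:
        changed = False
        for model, upstreams in deps.items():
            if model not in stale and any(u in stale for u in upstreams):
                stale.add(model)
                changed = True
    return stale

previous = {"stg_policies": "a1", "stg_claims": "b2", "fct_premiums": "c3"}
current  = {"stg_policies": "a1", "stg_claims": "b9", "fct_premiums": "c3"}
deps     = {"fct_premiums": ["stg_policies", "stg_claims"]}

# Only stg_claims changed, so stg_claims and its downstream fct_premiums rerun;
# stg_policies is skipped, which is where the compute savings come from.
print(sorted(select_stale(previous, current, deps)))
```

In dbt terms this is the same contract as state-based selection: unchanged models are reused rather than rebuilt, so savings scale with how little of the graph actually changed between runs.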

“We're saving at least 30% on compute costs, just from reusing models with state-aware orchestration,” said Tyson Doberneck, senior data engineer.

Matt Karan, senior data engineer, added, “Knowing we're being proactive about costs gives leadership more confidence that we can have our data volume grow and still operate in a lean, young-company environment.”

Fusion's orchestration and CI workflows recover up to 5 engineering hours per week

With Fusion's built-in orchestration, version-controlled models in GitHub, and CI-driven staging environments that let engineers compare production vs. staging data before merging, pipeline interruptions decreased. Engineering time was freed to focus on analytics and product-facing work.
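The production-versus-staging comparison in CI can be as lightweight as diffing row counts and key aggregates for each model before a merge. A hedged sketch of that idea (the model name, column, and summary metrics are illustrative, not the checks Obie's pipeline actually runs):

```python
# Sketch of a pre-merge data diff: compare summary stats for each model
# between production and the CI staging build, and flag any discrepancies.

def summarize(rows):
    """Reduce a model's rows to comparable stats: row count and a column sum."""
    return {"rows": len(rows), "premium_total": sum(r["premium"] for r in rows)}

def diff_environments(prod, staging):
    """prod/staging: model name -> list of row dicts. Returns models that differ."""
    mismatches = {}
    for model in prod.keys() | staging.keys():
        p = summarize(prod.get(model, []))
        s = summarize(staging.get(model, []))
        if p != s:
            mismatches[model] = {"prod": p, "staging": s}
    return mismatches

prod    = {"fct_premiums": [{"premium": 100}, {"premium": 250}]}
staging = {"fct_premiums": [{"premium": 100}, {"premium": 275}]}

for model, detail in diff_environments(prod, staging).items():
    print(f"{model}: prod={detail['prod']} staging={detail['staging']}")
```

Surfacing a mismatch like this in CI lets an engineer see the downstream effect of a model change before it reaches production, rather than debugging it after the merge.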

Consistent metric definitions ensure consistent data

The data team implemented data transformation best practices: consolidating models, adding testing, and documenting definitions, all within their Fusion project. With consistent data, the business can move faster.
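In a dbt project, testing and documentation live alongside the models themselves in a schema file, so a metric's definition, description, and checks stay in one governed place. An illustrative fragment (model and column names are examples, not Obie's actual project):

```yaml
# Illustrative dbt schema file: descriptions and tests sit next to the model,
# so definitions are documented and enforced together.
version: 2

models:
  - name: fct_premiums
    description: "One row per policy per billing period; source of truth for premium reporting."
    columns:
      - name: policy_id
        description: "Unique identifier for the policy."
        tests:
          - unique
          - not_null
```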

Challenge: Data Vault 2.0 methodology slowing development
Change implemented: Re-architected on Fusion with streamlined modeling patterns
Outcome: 90%+ migration complete, significantly faster development cycles

Challenge: Rising warehouse costs
Change implemented: State-aware orchestration and more efficient model execution
Outcome: ~30% lower compute usage and improved cost predictability

Challenge: Engineering capacity consumed by manual fixes
Change implemented: Centralized, standardized transformation workflows
Outcome: 2–5 engineering hours/week reclaimed

Challenge: Metric inconsistencies
Change implemented: Shared, version-controlled models with testing and documentation
Outcome: Consistent reporting

Challenge: Multiple pipelines
Change implemented: Single transformation layer managed in dbt
Outcome: Clear process and easier maintenance

Challenge: Scaling organization preparing for post-acquisition integration
Change implemented: Governed, observable data platform
Outcome: Greater readiness for continued growth

“Everything about Fusion has sped up my workflows,” said Karan. “I feel like it's just going to keep going in that direction. Eventually, I'll never have to leave my coding environment and be able to work with all the data in one pane of glass.”

What's next: Building the "Middleware" for agentic workflows and self-service analytics with dbt Semantic Layer

Data development now feels noticeably smoother, and engineers can focus on delivering business value. Next on the roadmap: an internal Slack bot that queries dbt's Semantic Layer to answer business questions on the fly. Because the bot references governed metric definitions rather than querying the database directly, the risk of hallucination drops significantly. The team is also evaluating dbt Mesh and the dbt MCP server as part of a broader push toward self-service analytics across Obie's 130-person organization.

Doberneck concluded, “The dbt Fusion engine is a non-negotiable for me. With anything else in our stack, we could make a change. I would never switch out dbt.”
