Announcing dbt Agents and the remote dbt MCP Server: Trusted AI for analytics

Today, at Coalesce 2025, we announced the general availability of the remote dbt MCP server and introduced dbt Agents, a new family of governed, task-specific AI agents built on the dbt platform, now available in beta.

The remote dbt MCP server runs in the cloud and exposes one secure endpoint per environment so AI tools can connect to your dbt project without local setup or custom connectors.

dbt Agents operate inside dbt guardrails. They read the same structured context your team already trusts, so development moves faster, self-service is safer, and data quality goes up.

Together, they make dbt the standard context layer for agentic analytics. Whether you're using AI tools like Claude or Cursor, or you want a more dbt-native experience, we want to enable that choice while moving fast and keeping governance intact.

Get started with the remote or local dbt MCP server today to build your own copilots and agents, request access to the dbt Agents beta, and check out the Coalesce keynote recap to learn more about our latest launches.

The agent era: The next step for trusted analytics

The world of analytics is shifting. Data leaders are already rolling out AI that speeds how teams build, manage, and consume data, and many deployments are live, not experimental.

As those efforts scale from pilots to production, one pattern is clear: agents are only as good as the context they run on. Without shared context, an agent guesses. It may pick the wrong joins, apply the wrong filters or time grains, or use out-of-date logic.

With shared context, an agent knows. It can read definitions for metrics and dimensions, follow lineage to see dependencies, check tests and freshness, respect policies and roles, and explain exactly what it did.

That's where dbt comes in. dbt has long provided data teams with this structured context layer (definitions, lineage, tests, and semantics) that codifies how data should behave.

That foundation now unlocks the next step: an agent-first era where AI plans work, executes tasks end to end, and checks results against the same definitions your team already trusts.

As Øyvind Eraker, Senior Data Engineer at NBIM put it, “Structured context is the multiplier. With dbt as our source of definitions and lineage and MCP exposing that context across Snowflake and Claude, we can add new agent skills without re-plumbing governance.”

Today, we’re taking that step with the launch of dbt Agents, powered by structured context and made accessible through the remote dbt MCP server. Together, they redefine how you build, manage, and consume data, so development speeds up, self-service expands, and data quality improves.

“In five years, your most reliable data developer will be an agent that commits code, passes tests, and explains its work. It will do that because it stands on dbt. With agents running on the dbt structured context layer and exposed through the remote dbt MCP server, this is how analytics will be built, managed, and consumed in this new era.” - Tristan Handy, CEO, dbt Labs

A standards-based foundation for AI: the remote dbt MCP server is GA

Powering this launch is the structured context that already lives in dbt: the definitions, lineage, tests, and semantics that teams already use to make analytics trustworthy.

This structured context is now universally accessible to AI systems through version 1.0 of the remote and local dbt MCP server. It turns that context into a standard interface that AI tools can safely call, whichever warehouse you run on.

This means your governed models, metrics, tests, and lineage are available to clients like OpenAI, Anthropic, and Cursor for safer, more reliable AI.

We are also bringing the dbt Fusion engine to MCP through our new Fusion MCP tools, so clients can use Fusion’s compiler, diagnostics, and metadata, giving agents precise awareness of how transformations work and what might break before they act. Fusion enables analytics agents to be reliable and accurate.

And because strong execution only matters if it is secure, we are expanding authentication and control. The local dbt MCP server now supports OAuth, so teams can use their dbt login for secure access with a simple setup. OAuth for the remote dbt MCP server is on the way, bringing centralized authentication and auditability to your single endpoint per environment.
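To make the local setup concrete, here is a sketch of how an MCP client (such as Claude Desktop or Cursor) might be pointed at a locally running dbt MCP server. The package name, environment variable names, and values below are illustrative assumptions; check the dbt MCP server documentation for the exact configuration your setup requires:

```json
{
  "mcpServers": {
    "dbt": {
      "command": "uvx",
      "args": ["dbt-mcp"],
      "env": {
        "DBT_HOST": "cloud.getdbt.com",
        "DBT_PROD_ENV_ID": "<your-environment-id>",
        "DBT_TOKEN": "<your-token>",
        "DBT_PROJECT_DIR": "/path/to/your/dbt/project"
      }
    }
  }
}
```

With OAuth support, the static token above can give way to your dbt login, so credentials no longer need to live in a local config file.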

With the MCP server standardizing access to the structured context layer that lives in dbt, and an expanding suite of MCP-native tools, you get a universal, governed bridge between AI and trustworthy data.

Why dbt is the natural standard for agentic analytics

In analytics, trust is critical. Without a structured context layer, AI tools can make mistakes that damage confidence. dbt solves this problem by providing clear rules for how data should be organized, tested, and used.

We provide this context layer through several key tools: the dbt MCP server creates one secure gateway for AI to access structured context in any warehouse; the dbt Fusion engine delivers fast compilation, diagnostics, and validation to ensure work is reliable; and MetricFlow (now open source) ensures consistent metric definitions across all your tools.

We're committed to continuing to strengthen this foundation and make it even more AI-friendly over time.

And the momentum is real. More than 900 data teams and partners have adopted the dbt MCP server to prototype conversational analytics and agentic workflows across their stacks.

“‘Chat with your data’ works reliably when every answer is governed and explainable. dbt is our governance backbone, and MCP exposes that structured context to our chat experience so any NBIM employee can ask questions and get trusted answers. With dbt, Claude, and Snowflake powering our chat experience, adoption is 10× our previous catalog and tickets to core data teams are down. The same governed foundation now powers agentic workflows that flag anomalies, deliver morning briefs, and open small PRs.” — Øyvind Eraker, Senior Data Engineer, NBIM

Because the remote server exposes your project through one endpoint, teams are moving from chat into agents that run end-to-end tasks across the analytics development lifecycle. They are deploying changes faster, rethinking BI with governed answers, and using the warehouse as durable memory for longer, multi-step agent work.

With the dbt MCP server as the standard for structured data in AI, you can start building agents today. Simply connect the remote server, plug in your preferred model, and ship agents that deliver faster development, safer changes, and higher quality.
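Connecting the remote server is a matter of pointing your MCP client at the single endpoint for your environment. The URL shape and header below are assumptions for illustration; use the endpoint and token shown in your dbt environment settings:

```json
{
  "mcpServers": {
    "dbt-remote": {
      "url": "https://<your-dbt-host>/api/ai/v1/mcp/",
      "headers": {
        "Authorization": "token <your-service-token>"
      }
    }
  }
}
```

No local install or custom connector is needed; the remote server handles authentication and exposes the same governed tools to every client that speaks MCP.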

But we want to make autonomous, AI-powered analytics truly seamless for data teams.

Introducing native agents to dbt

That’s why, on this foundation, we’re introducing dbt Agents in the dbt platform. These out-of-the-box agents help you build, manage, and consume data faster within dbt guardrails. As routine work shifts to agents, analytics engineers can focus on tuning agent behavior to accelerate their workflows and unlock safe self-service for the business. These agents use the same structured context layer your team already trusts so they act confidently without sacrificing governance.

Today we’re introducing the following agents:

  • Analyst agent (in beta): Available inside dbt Insights (GA). Ask questions in plain English and get governed answers. The agent uses your dbt project context to generate SQL directly from your models, then executes it in your warehouse and returns verified results with definitions, tests, and lineage. When metrics are defined in the dbt Semantic Layer, the agent automatically resolves them for higher accuracy; otherwise, it reasons from your dbt context to generate the right query.
  • Discovery agent (in beta): Find the right dataset or metric in plain language, along with clear definitions and why it’s trustworthy. Surfaces governed sources and dbt lineage so anyone can explore data with confidence.
  • Observability agent (coming soon): Helps you monitor jobs, pinpoint likely root causes, and reduce noise so investigations resolve dramatically faster.
  • Developer agent (coming soon): Explains model logic, predicts downstream impact, flags duplicate logic, and validates changes before merge. Runs directly in VS Code or dbt Studio, powered by dbt’s context, so every change can be shipped quickly and safely.

“We are excited for dbt Agents to bring purpose-built automation that moves us from reactive tickets to proactive, agent-driven operations and spares us the overhead of bespoke bots.” - Øyvind Eraker, Senior Data Engineer, NBIM

The momentum does not stop here. We will keep expanding agent capabilities, integrations, and multi-agent workflows so more of your analytics development lifecycle can run with confidence on the same governed foundation.

What’s next

The first generation of dbt Agents turns structured context into action, helping teams ship faster, catch issues earlier, and expand safe self-service. But this is only the start. These agents act as powerful force multipliers for data teams, automating routine tasks, enhancing collaboration, and allowing data engineers to focus on higher-value strategic work instead of repetitive operations.

As these agents mature, they will fundamentally transform how data engineering teams operate. Rather than handling every data request, engineers will orchestrate and govern agent-driven workflows that scale their expertise across the organization. This means smaller teams can support larger data ecosystems while maintaining quality and governance, truly democratizing data without sacrificing reliability.

In the future, we'll continue to deliver experiences that accelerate more stages of the analytics lifecycle and introduce multi-agent workflows that help you work faster. We'll keep deepening dbt's structured context layer through richer metadata intelligence in dbt Fusion, so agents become more aware of how data behaves across production systems. And we'll continue expanding through open standards (MCP, MetricFlow, and open metadata) to ensure dbt integrates cleanly with the broader AI ecosystem.

We envision a world where data engineers spend more time on innovation and less on maintenance, as agents handle increasingly complex tasks with minimal supervision.

Start today: enable the remote dbt MCP server, try the Fusion MCP tools, and request access to dbt Agents.

This is the beginning of the agentic era for analytics and we can't wait to see what you build with us!

Published on: Oct 14, 2025
