Talk to your data: AI-powered conversational analytics with the dbt MCP server

Kathryn Chubb

last updated on Nov 18, 2025

The promise of conversational analytics has captivated data leaders for years. Imagine business users simply asking questions in natural language and receiving accurate, governed insights instantly.

Yet most organizations struggle to move beyond proofs of concept. That's often because of a lack of structured context. AI agents need to know not only what data exists, but how it's connected, what it means, and how it should be used. That spells the difference between an AI that's guessing and one that's giving accurate, trustworthy results.

Norlys, Denmark's largest integrated energy and telecommunications group, faced this challenge on an unprecedented scale. Following a major acquisition and organizational split, the company needed to rebuild its entire analytics infrastructure from scratch.

Rather than simply migrating legacy systems, the data team saw an opportunity. Leveraging the dbt Model Context Protocol (MCP) Server, they reimagined how the organization interacts with data, putting conversational analytics at the center of their vision.

From 500 separate apps to one data platform

Norlys serves between five and six million people in Denmark, with 800,000 customer-owners in a co-op structure. The company delivers energy, charging stations, internet, television, and mobile services while owning critical infrastructure, including Denmark's largest fiber network.

When Norlys acquired the Danish operations of Telia Mobil in 2024, the merger brought together two nearly equal-sized organizations. Each had mature but fundamentally different data systems.

The acquisition created both massive complexity and a unique opportunity. Telia Denmark was deeply integrated with approximately 500 applications shared across the group. Disentangling this was a complex project in its own right.

Simultaneously, the Danish competition authorities mandated that Norlys separate its infrastructure businesses into distinct companies: one for its electricity business and one for its fiber-optic network. This forced additional data architecture changes across the organization.

The scale of technical debt was substantial. Legacy systems had accumulated through 40 mergers over 10 years, leaving data scattered across multiple business intelligence platforms with inconsistent definitions. When different departments were asked simple questions like "how many customers do we have," the answers rarely aligned.

The company had over 500 existing BI solutions that reflected outdated business structures and siloed thinking. Even finding all of these solutions was challenging: whenever the team thought they had found them all, more would pop up.

Rather than settle for this, company management decided that what it needed was one united workforce, working from one data platform and one single source of truth. They didn’t want 500 reports that obscured data, but a single location where users could ask data-driven questions and get accurate answers back.

Building the foundation: Metrics first

Fulfilling this bold vision meant starting fresh with a metrics-first foundation designed for modern analytics, including AI-powered conversational interfaces. The metrics-first approach centers on defining canonical business metrics once, in a centralized location, rather than scattering logic across transformation pipelines, BI tools, and application layers.

Using dbt's Semantic Layer, Norlys began systematically defining core metrics spanning strategic, tactical, and operational needs. dbt is a data control plane that works across various cloud and data platform environments, providing a flexible, collaborative, and trustworthy environment for accessing data, no matter where in the organization it lives.

This meant establishing fundamental definitions, such as what constitutes a customer, how orders are measured, and how financial metrics are calculated. These definitions required extensive collaboration, bringing together energy experts, telecommunications specialists, and domain leaders to agree on common standards.

The metrics are declared in YAML files within dbt, creating a single source of truth with several key advantages:

  • There’s one version of “the truth” and only one place to update logic when business rules change.
  • Complete data lineage becomes immediately visible, showing exactly which source systems and transformations feed each metric.
  • The approach eliminates dependencies on specific BI platforms, allowing the organization to adapt as technology evolves without rebuilding business logic.
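To make this concrete, here is a hedged sketch of what such a YAML declaration can look like in dbt's Semantic Layer (MetricFlow). The model, entity, and metric names below are illustrative, not Norlys's actual definitions:

```yaml
# Hypothetical Semantic Layer definition in dbt (MetricFlow YAML).
# Model, column, and metric names are illustrative examples.
semantic_models:
  - name: orders
    model: ref('fct_orders')
    entities:
      - name: order_id
        type: primary
      - name: customer_id
        type: foreign
    dimensions:
      - name: ordered_at
        type: time
        type_params:
          time_granularity: day
    measures:
      - name: order_count
        agg: count
        expr: order_id

metrics:
  - name: total_orders
    label: Total orders
    description: "Canonical count of orders, defined once for every downstream tool."
    type: simple
    type_params:
      measure: order_count
```

Because the metric lives in version-controlled YAML next to the models it depends on, a change to the business rule is a single reviewed commit, and every consumer (BI tool, API, or AI agent) picks it up automatically.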

Building this foundation required significant upfront investment. The team onboarded over 100 data sources. Recognizing that they were a different business now, they carefully staged and documented each one rather than rushing to recreate legacy reports.

The process took months. Ultimately, that created the structured context necessary for reliable AI applications.

Technical implementation and partnership

To implement this, Norlys selected Snowflake and dbt as the core of their modern data platform, complemented by Apache Airflow, Qlik Talend, and the lightweight Python package dlt for orchestration and ingestion. This consolidated stack replaced numerous legacy platforms, giving the organization a focused foundation for scaling data operations.

Implementing dbt at this scale required expertise that the team was building internally. Norlys partnered with Leap, a specialized AI and data consulting firm with deep dbt experience, to accelerate the journey. This collaboration proved essential for establishing best practices from day one, creating blueprints that enabled 60+ data engineers to develop consistently.

The blueprints cover everything from code style to testing standards to documentation requirements. While this might seem rigid, the structure enables sustainability as team members change.

dbt gave all data engineers and data analysts a consistent way of working together. The team set down common guidelines and best practices, which included creating data tests and thorough documentation for every new data model.
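In dbt, that guideline typically takes the form of a YAML file shipped alongside each model, pairing documentation with data tests. A minimal sketch, with hypothetical model and column names:

```yaml
# Hypothetical schema.yml illustrating the pattern: every model is
# documented and tested. Names are examples, not Norlys's actual models.
models:
  - name: dim_customers
    description: "One row per customer across the energy and telecom businesses."
    columns:
      - name: customer_id
        description: "Surrogate key for the unified customer."
        data_tests:
          - unique
          - not_null
      - name: customer_segment
        description: "Commercial segment the customer belongs to."
        data_tests:
          - accepted_values:
              values: ['consumer', 'business']
```

These tests run on every build, so a broken assumption surfaces in CI rather than in a stakeholder's report.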

With this approach, any engineer can pick up work someone else started because the patterns are predictable and well-documented. It also freed the team from endless debates about implementation details, allowing them to focus on solving business problems.

With that many people involved, the team feared over-generalization of roles: if everyone shared responsibility for everything, then no one truly owned anything. So they created new titles and positions with clearly defined responsibilities.

The partnership helped Norlys avoid common pitfalls in large-scale dbt implementations. Rather than discovering organizational patterns through trial and error over months, the team established effective project structures immediately. This meant that when they were ready to build the conversational analytics proof-of-concept, the underlying Semantic Layer was production-ready.

From metrics to conversations with the dbt MCP server

Once the Semantic Layer contained well-defined metrics with rich metadata, the path to conversational analytics became clear.

The dbt MCP server provides a standardized way for AI agents to access this structured information. With it, AI systems can understand not just what data exists but how it connects, what it means, and how it should be used.

Norlys developed a proof-of-concept called Orion, focusing initially on the finance domain, where metrics were most mature. The implementation leveraged Claude Desktop with the dbt MCP server, connecting directly to the company's defined financial metrics in dbt Cloud.

The architecture is intentionally modular. While the proof-of-concept used Claude, the team designed the system to swap components as technology evolves.

The dbt MCP server acts as an intermediary layer, allowing large language models to query metadata through APIs while maintaining governance and access controls. Users with rights to specific data domains can interact with those metrics through the conversational interface, while sensitive information remains protected.
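As a rough sketch of how this wiring can look, an MCP client such as Claude Desktop is typically pointed at the dbt MCP server through a small config entry. The exact package name and environment variables below are an assumption based on common dbt MCP server setups and should be checked against the current dbt documentation; the token and environment ID are placeholders:

```json
{
  "mcpServers": {
    "dbt": {
      "command": "uvx",
      "args": ["dbt-mcp"],
      "env": {
        "DBT_HOST": "cloud.getdbt.com",
        "DBT_TOKEN": "<service token with Semantic Layer access>",
        "DBT_PROD_ENV_ID": "<production environment id>"
      }
    }
  }
}
```

Because authorization is carried by the service token, the same governance rules that apply in dbt Cloud apply to the conversational interface: the agent can only see the domains the token grants.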

The experience differs fundamentally from traditional dashboards. When a business user asks a question, the AI agent understands the request, identifies relevant metrics from the Semantic Layer, generates appropriate queries, and returns interactive visualizations.

Critically, the user can then drill deeper with follow-up questions in the same natural conversation. They can explore different angles without waiting for dashboard modifications or developer queue times.

Demonstrating value quickly

The finance domain proof-of-concept took just two to three days to build once the metrics were defined.

This rapid implementation was possible because the hard work had been done upfront: staging data sources, defining metrics, writing documentation, and establishing governance. Adding the conversational layer on top required minimal additional effort.

The demonstration showed business users asking questions in natural language and receiving accurate, interactive charts within seconds. Behind the scenes, the AI agent interpreted the request using the rich metadata in the Semantic Layer, identified relevant metrics, and generated visualizations. The system also provided transparency, allowing users to follow the agent's workflow and understand how it arrived at each answer.

This initial success validated the metrics-first strategy and opened conversations about broader applications. The modular architecture means additional use cases can be added incrementally, each leveraging the same foundational Semantic Layer while potentially using different AI models or interfaces based on specific needs.

Scaling the vision

Norlys envisions conversational analytics not as a replacement for all traditional BI. Rather, it’s a complementary capability that addresses different use cases.

Dashboards remain valuable for standardized reporting and monitoring. Conversational interfaces excel when users need to explore data, ask follow-up questions, or investigate anomalies without waiting for dashboard development cycles.

The longer-term vision extends beyond asking questions about existing data. The team sees the conversational interface evolving into a "personal mission control console" where users not only gain insights but take action. For example, if an analysis reveals customers with missing contact information, an MCP-connected system could automatically trigger outreach campaigns to collect the needed data.

This requires additional MCP servers beyond dbt, each exposing different capabilities while maintaining consistent governance. The modular approach allows Norlys to experiment with new AI models and tools as they emerge, swapping components without rebuilding the entire system.

Lessons for other organizations

The Norlys journey offers several lessons for data teams considering similar transformations.

First, the metrics-first approach requires significant upfront investment but pays dividends across all analytics use cases, not just conversational AI. Having canonical metric definitions improves traditional BI, self-service analytics, and operational reporting alongside AI applications.

Second, starting small accelerates learning and builds organizational confidence. Rather than attempting to migrate 500 legacy reports, Norlys selected one well-understood domain, defined those metrics carefully, and demonstrated value quickly. The success created momentum for expanding to additional domains.

Third, data quality matters more than ever with AI applications. When a CFO asks about revenue through a conversational interface and receives incorrect numbers, the credibility damage is severe. The structured context from dbt's Semantic Layer helps ensure accuracy, but the underlying data must be sound. Norlys invested heavily in data staging, testing, and documentation to build that foundation.

Fourth, domain expertise must be centralized and formalized. The days of scattered BI teams building siloed solutions are ending. Norlys brought together experts from energy, telecommunications, and other domains into a unified data and AI organization. These experts collaborated on metric definitions, ensuring consistency across the business.

Focusing on domains also simplifies setting up your governance and security structure. For example, you can ensure that employees in HR have access to employee data, while preventing those outside the department from seeing, say, their co-workers' salaries.

Finally, partnering with experienced practitioners accelerates time to value. While building internal dbt expertise is essential, working with consultants who have implemented similar solutions at scale helps avoid common mistakes and establishes best practices from day one.

The path forward

Norlys continues building on this foundation, expanding metric coverage across additional domains and exploring new use cases for conversational analytics. The data team sees particular potential in customer operations, where AI-powered insights and actions could optimize processes at scale.

The organization's approach demonstrates how to build an effective Analytics Development Lifecycle and data control plane in practice. By centralizing business logic in dbt's Semantic Layer and exposing it through standards like the MCP server, Norlys built a flexible foundation that supports both current needs and future innovation.

As AI capabilities evolve rapidly, this structured approach provides stability. The metrics, lineage, and documentation in dbt remain consistent even as the organization experiments with new large language models, interfaces, or AI agents. The investment in structured context pays returns across an expanding portfolio of AI applications, all grounded in the same trusted foundation.

For organizations overwhelmed by the pace of AI innovation, the Norlys story offers a pragmatic path. Focus first on building structured, well-governed data foundations using proven tools like dbt Cloud. Once that groundwork is solid, AI applications become faster to build, more reliable in production, and easier to expand over time.

The future of analytics may be conversational, but it's built on a foundation of carefully defined metrics and trusted data.
