What is Snowflake Intelligence anyway?


Luis Leon

last updated on Nov 04, 2025

The promise of being able to 'talk with your data' is now in its second decade of being delivered. But we're finally here. Instead of feeling like you're just asking questions in verbal SQL, the advent of new tools, LLMs, and open semantics means you can chat with your data better than you can with 'that guy' at the meat raffle [Editor's note: This is a very niche reference from the U.S. Midwest contributors to this blog post].

Snowflake Intelligence is a new general-purpose agentic platform from Snowflake. It lets anyone in your organization build and interact with AI agents to ask questions of your data and get instant, accurate answers without writing SQL, building dashboards, or sending tickets to your data teams. It's designed to close the gap between having data and using it.

What sets Snowflake Intelligence apart is that it is built directly into the Snowflake platform on top of proven Snowflake technologies like Cortex Analyst, Cortex Search, and Document AI. It understands your data and semantic definitions, and through Cortex Knowledge Extensions it can even access and analyze information from public sources like The Associated Press, Stack Overflow, or even Snowflake's own documentation.

Leveraging existing Snowflake AI tools also means Snowflake Intelligence can interpret both structured data (curated dbt models) and unstructured data (PDFs, emails, Slack conversations). This provides a more complete picture of your business and more accurate answers to your business questions.

Most importantly, and as you would expect from a Snowflake solution, it has all the governance, access, and security built in to work at any scale. Equally exciting, the release of Snowflake Intelligence and the continued refinement and proliferation of AI solutions make the role of the analytics engineer, and the need for dbt, more important than ever. Let's get into why.

What role does dbt play in Snowflake Intelligence?

Solid foundations of data quality and governance are critical for success in any AI project. dbt is the standard for data transformation in the cloud data warehouse; it provides reliable, consistent, and well-controlled data that is needed for any AI project to work well at scale. dbt is also a main source for the types of metadata needed to provide the appropriate context for AI to answer questions as accurately and credibly as possible; think data quality, data freshness, and crucially, semantic definitions.

Semantics are becoming increasingly critical for successful AI. The dbt Semantic Layer makes it possible to define business metrics as part of your dbt pipelines, letting you create high-quality, contextual data. Defining metrics in dbt not only ensures your metrics are consistent and well-governed; it also connects them to the contextual metadata of your dbt project.
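
As a concrete sketch of what this looks like, here is a minimal dbt Semantic Layer definition for a hypothetical revenue metric. The model and measure names (`fct_orders`, `order_total`, `revenue`) are illustrative, not from any real project:

```yaml
# models/marts/orders.yml — hypothetical example
semantic_models:
  - name: orders
    model: ref('fct_orders')          # the dbt model this semantic model wraps
    defaults:
      agg_time_dimension: ordered_at  # default time dimension for measures
    entities:
      - name: order_id
        type: primary
    dimensions:
      - name: ordered_at
        type: time
        type_params:
          time_granularity: day
    measures:
      - name: order_total
        agg: sum

metrics:
  - name: revenue
    label: Revenue
    description: "Sum of order totals, net of discounts."
    type: simple
    type_params:
      measure: order_total
```

Because this definition lives in the dbt project, the metric is version-controlled and tested alongside the transformations that produce it, and any consumer, including an AI agent, resolves "revenue" to the same calculation.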

Snowflake Intelligence uses semantic definitions to generate more accurate responses, while the metadata generated by dbt provides context to allow both the AI and the person asking to understand the reasoning of the response.

Due to this importance of semantic definitions, Snowflake and dbt Labs are excited to be two of the founding members of the Open Semantic Interchange. While basic functionality exists today—via the ability to define Snowflake Semantic Views with dbt—we are excited to partner more closely to promote the open interoperability of semantic definitions regardless of where these measures are defined or consumed.

The dbt MCP Server and Snowflake Intelligence

Another exciting integration point between dbt and Snowflake Intelligence is the dbt MCP Server, which opens new possibilities for extending Snowflake Intelligence capabilities.

The dbt MCP Server provides a standardized interface that enables AI agents to interact directly with dbt projects, exposing development tools, discovery capabilities, and semantic layer access through the MCP protocol.

Users can ask questions like "Which dbt models reference the customers table?" or "What's the overall health of my dbt projects, and what opportunities are there for improvement?" and get answers directly. They can even execute dbt-specific operations like compiling models, running tests, or executing pipelines.

What’s even more exciting is pairing this with dbt’s new Fusion engine, and its ability to parse and validate SQL code. With this workflow, Snowflake Intelligence can use the dbt MCP Server to compile dbt projects, the dbt Fusion engine to catch any errors in the code, and Snowflake Cortex to fix these errors—all before any code is ever executed in Snowflake.

While native integration for Snowflake Intelligence and external MCP Servers like dbt’s is on the roadmap, teams can build working prototypes today by creating a Streamlit application within Snowflake that combines the Cortex Agents API (which powers Snowflake Intelligence) with the remote dbt MCP Server. My colleagues at dbt have published this example repo, which is designed to let organizations experiment with the integration pattern, validate use cases, and prove the initial value of adding the dbt MCP Server to Snowflake Intelligence.

Fundamentally, a conversational AI is only as good as the data it queries. If your underlying datasets are inconsistent, poorly documented, or scattered across dozens of schemas with unclear ownership, even the smartest AI will give you unreliable answers. This is where dbt's data control plane becomes critical.

What benefits will you see with Snowflake Intelligence?

With so many new AI tools and agents available, who should use Snowflake Intelligence and what use cases does it help us solve? Snowflake Intelligence is flexible and easy to use. It can help anyone who uses Snowflake, from business leaders and executives who use data to the engineers and analysts who create it.

Business leaders and executives: Business leaders can ask complex questions in plain English—for example, "Show me win rates by region and deal size for Q3." They can get answers, including data, charts, and most importantly, the reasoning behind the answer, all in seconds.

Now let's consider a more complex, real-world example showcasing the power of combining Document AI, Cortex Search, and Cortex Analyst. With Snowflake Intelligence, you can analyze an uploaded PDF containing the text of a doctor's clinical note, parsing the text for diagnostic and treatment details from the patient's visit. Next, Cortex Analyst can query relevant clinical information from the patient's record—for example, the medications prescribed and their doses. Combining the analysis of structured and unstructured data means answering complex, valuable, real-world questions.

Data analysts and engineers: Freeing data analysts and engineers from pulling data and answering follow-on questions related to routine requests allows them to shift from being ticket-takers to strategic partners. They can focus on high-value tasks like onboarding new data sets and use cases or tackling the complex analytical challenges that move the business forward. Snowflake Intelligence can find all the data and metadata in Snowflake, including all dbt metadata. This makes it a great way to find data quickly and easily.

Asking questions like 'What is the best source for certified, up-to-date customer data?' leads analysts to the right data products quickly. This data discovery flow increases developer productivity and ensures the data products they build can be deployed faster and with higher accuracy.

Finally, by integrating Snowflake Intelligence with dbt's MCP Server, analysts and engineers can allow Snowflake Intelligence to take dbt-specific actions like assessing dbt project health, pulling dbt-specific metadata, and finding and remediating issues in pipelines. Even more powerfully, a workflow combining Snowflake Intelligence, the dbt MCP Server, and the dbt Fusion engine can fix pipeline issues before they run against your Snowflake warehouse.

What benefit does dbt provide beyond using Snowflake Intelligence?

Snowflake Intelligence is a great solution for interacting with data inside of Snowflake. But before anyone can meaningfully interact with this data, it must be transformed, validated, and governed. As the standard for data transformation in the warehouse, dbt is the natural tool for ensuring data quality, usefulness, and governance for use by Snowflake Intelligence.

Additionally, the same context provided to the AI is made available to the people consuming the data. This means that people who use these answers can review and check all the same information. This makes Snowflake Intelligence's response not only accurate, but also trustworthy and actionable.

Semantics are becoming more important for AI accuracy. dbt plays three different roles in providing the semantic definitions that make Snowflake Intelligence more accurate:

  1. Teams can define Snowflake Semantic Views directly in dbt, ensuring metric definitions are version-controlled and tested alongside transformations.
  2. The dbt Semantic Layer integrates with Snowflake Intelligence through the dbt MCP Server, allowing conversational queries to leverage the same certified business metrics that power BI dashboards and reports.
  3. Both dbt Labs and Snowflake are founding members of the Open Semantic Interchange (OSI), committing to enhance semantic interoperability industry-wide so metrics remain consistent regardless of where they're defined or consumed.

Integrating Snowflake Intelligence with the dbt MCP Server represents an exciting area of innovation beyond pure analytics. Through the MCP Server, Snowflake Intelligence can answer dbt-specific questions and execute dbt operations like test, run, and compile. When paired with dbt's Fusion engine—which can parse and validate SQL code—this creates opportunities for Snowflake Intelligence Agents to compile dbt code and fix errors before any code executes in your warehouse.

Beyond these specific capabilities, dbt provides the contextual metadata and workflow—data quality metrics, freshness indicators, lineage information, and semantic definitions—that help both the AI and the people asking questions understand the reasoning behind answers and trust the responses. This combination of reliable data foundations, semantic precision, and operational intelligence makes dbt essential for organizations that want Snowflake Intelligence to deliver accurate, trustworthy insights at enterprise scale.

What do you need to make it all work?

Getting started with dbt and Snowflake Intelligence is straightforward, especially if you're already using the dbt platform. All AI projects are data projects. When working with Snowflake Intelligence, you should follow all the best practices for building and scaling a successful data project.

1. Ensure your dbt project is well-documented

Snowflake Intelligence relies on documentation to understand your data semantics. Take the time to:

  • Add meaningful descriptions to your models explaining what business entities they represent
  • Document key columns, especially dimension keys, dates, and metrics
  • Define metrics in your dbt Semantic Layer for consistency
  • Use clear naming conventions that make intent obvious
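
The bullets above translate into ordinary dbt YAML. Here is a minimal sketch for a hypothetical orders model—the model and column names are illustrative:

```yaml
# models/marts/fct_orders.yml — hypothetical example
models:
  - name: fct_orders
    description: >
      One row per customer order. Grain is order_id.
      Source of truth for order revenue reporting.
    columns:
      - name: order_id
        description: "Surrogate primary key for the order."
      - name: customer_id
        description: "Foreign key to dim_customers."
      - name: ordered_at
        description: "Timestamp (UTC) when the order was placed."
      - name: order_total
        description: "Order value in USD, net of discounts."
```

Descriptions like these are exactly the context a conversational agent leans on when deciding which model and which column answer a question.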

You don't need perfect documentation to start, but the better documented your models are, the more accurate Snowflake Intelligence's answers will be. If you’re using the dbt platform, dbt Copilot makes documenting your models a breeze.

2. Implement data quality tests

Data quality issues that slip through to Snowflake Intelligence will erode trust in the system:

  • Add schema tests for uniqueness, not null, and relationships
  • Write custom data quality tests for business-specific rules
  • Set up alerts for test failures so issues get addressed quickly
  • Use dbt's store_failures configuration to investigate quality issues

Ensuring that primary key fields are tested for uniqueness and non-nullability makes a tremendous difference in quality. dbt Copilot writes tests for you, too.
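
Sketching the same hypothetical orders model, the first and last bullets above might look like this in dbt YAML (model and column names are illustrative):

```yaml
# models/marts/fct_orders.yml — hypothetical example
models:
  - name: fct_orders
    columns:
      - name: order_id
        data_tests:
          - unique:
              config:
                store_failures: true   # persist failing rows for investigation
          - not_null
      - name: customer_id
        data_tests:
          - not_null
          - relationships:             # every order must point at a real customer
              to: ref('dim_customers')
              field: customer_id
```

With `store_failures` enabled, failing rows land in a table in your warehouse, so when a test fails you can query exactly which records broke the rule instead of re-deriving them by hand.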

3. Structure your project with dbt Mesh (for larger organizations)

If you have multiple teams building data products, dbt Mesh helps maintain quality at scale:

  • Break your monolithic dbt project into smaller, domain-specific projects
  • Define clear ownership and data contracts between projects
  • Use groups and access controls to manage dependencies
  • Publish stable models for consumption by other teams

This modular structure makes it easier to evolve your data products without breaking downstream dependencies. This is critical when those dependencies include conversational AI queries.
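
As a sketch, ownership and cross-project contracts from the bullets above are declared in YAML. The group, team, and model names here are hypothetical:

```yaml
# groups.yml — hypothetical example: declare an owning domain
groups:
  - name: finance
    owner:
      name: Finance Data Team
      email: finance-data@example.com

# models/marts/fct_revenue.yml — publish a stable, contracted model
models:
  - name: fct_revenue
    group: finance
    access: public            # other dbt projects may ref() this model
    config:
      contract:
        enforced: true        # column names and types become a guarantee
    columns:
      - name: revenue_date
        data_type: date
      - name: revenue_usd
        data_type: number
```

Marking the model `public` with an enforced contract means downstream teams—and conversational AI queries—can depend on its shape without fear of silent breaking changes.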

4. Configure Snowflake Intelligence

While Snowflake Intelligence does not require specific setup, it relies on other Snowflake services, such as Cortex Analyst and Cortex Search, that must be configured before use.

This approach makes getting started quick and easy, while still providing flexibility for highly tailored agents and workflows.

5. Monitor and iterate

Like any AI system, Snowflake Intelligence improves with feedback:

  • Monitor what questions users are asking and what answers they're getting
  • Identify gaps in data coverage or documentation
  • Iterate on your dbt models based on usage patterns
  • Gather user feedback and adjust your data products accordingly

The beauty of this architecture is that improvements to your dbt project immediately make Snowflake Intelligence more valuable. Better documentation leads to more accurate answers. New models expand what questions can be answered. Better data quality increases trust.

Looking ahead with Snowflake Intelligence and dbt

Snowflake Intelligence represents a fundamental shift in how organizations interact with data. But this shift is only possible because of the reliable, well-governed data foundation that tools like dbt provide.

The combination of dbt's transformation and governance capabilities with Snowflake Intelligence's conversational interface creates something powerful: enterprise data that's both trustworthy and accessible to everyone. Your carefully curated dbt models, your tested transformations, your documented metrics—all of these become instantly queryable by anyone in your organization, in plain English, with enterprise-grade security.

We've just seen how this partnership unlocks new value from data investments you've already made. The dbt project you've been building becomes the knowledge base for conversational AI. The governance you've implemented ensures security at scale. The semantic layer you've defined provides metric consistency. And the data quality tests you've written build trust in every answer.

The future of analytics isn't just faster queries or prettier dashboards—it's data that anyone can talk to, learn from, and act on. With dbt and Snowflake Intelligence working together, that future is here.

Ready to get started? Check out the Snowflake Intelligence documentation and work with your Snowflake account team to request preview access.

If you're not yet using dbt, install the dbt VS Code extension and start building the reliable data foundation that makes conversational AI possible.

Your data has answers. Now everyone in your organization can ask the questions.
