How USAA migrated from legacy infrastructure to deliver data in real-time

on Jun 03, 2025

The United Services Automobile Association (USAA) is a financial services organization for members of the U.S. military, including veterans and their families. Today, it serves more than 13 million customers, playing a vital role in helping the military community achieve financial security.
As you can imagine, USAA handles massive amounts of financial data. All of it must be processed accurately, securely, and in compliance with strict regulatory standards. Delays or errors aren’t just bad for the business; they can impact people’s financial well-being and their trust in the institution.
Like many organizations, USAA’s legacy infrastructure struggled to keep pace with the ever-growing volume of data. Faced with the need for faster, more scalable systems, the data team turned to dbt. Let’s walk through how they modernized their stack to deliver real-time data and reduce job times.
Outdated legacy infrastructure with performance degradation
To put it simply, the USAA data team was reaching a breaking point with its infrastructure.
Their data pipeline was built around an outdated ETL process that required nightly batch jobs, flat files, and on-prem systems. As data volumes grew, the system couldn’t keep up: jobs took hours to complete, which delayed time to insight and decision making.
“The more data we brought in, the more our infrastructure’s performance declined,” reflects Ted Douglas, Data Engineer at USAA. “It was a heavy I/O process that slowed everything down before we could even begin the work of processing and transforming the data.”
For newly hired developers, the learning curve was steep. Onboarding took months and required intensive tool-specific knowledge. The added complexity and outdated processes made it difficult to attract and retain modern data talent.
Making matters more urgent, USAA’s core database platform was approaching end-of-life and would no longer be supported. It was clear that USAA needed to modernize—and quickly.
Robust CI/CD and real-time data delivery
The modernization started with a foundational shift from ETL to ELT. Next, the data team adopted a cloud-first architecture and implemented dbt for orchestration. The result has been dramatic improvements in workflow efficiency.
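The article doesn't include the team's code, but to make the ELT idea concrete, here is a minimal sketch of a transformation expressed as a dbt Python model that runs inside the warehouse. The source, column names, and Snowpark-style DataFrame calls are illustrative assumptions, not USAA's actual models.

```python
# models/marts/member_claims_daily.py
# A minimal sketch of an in-warehouse (ELT) transformation written as a dbt
# Python model. Source, columns, and the Snowpark-style API are assumptions.
import snowflake.snowpark.functions as F


def model(dbt, session):
    # Materialize incrementally so each run only processes newly loaded rows
    dbt.config(materialized="incremental", unique_key="claim_id")

    # Raw data is already loaded into the warehouse by the ingestion step;
    # the transformation happens here, inside the warehouse (the "T" in ELT)
    claims = dbt.source("raw", "claims")

    if dbt.is_incremental:
        # Only pick up rows newer than what the target table already holds
        max_loaded = session.sql(
            f"select max(loaded_at) from {dbt.this}"
        ).collect()[0][0]
        claims = claims.filter(F.col("loaded_at") > max_loaded)

    # Light cleanup and a daily aggregate per member
    return (
        claims.filter(F.col("status") == "OPEN")
        .group_by("member_id", F.to_date("loaded_at").alias("claim_date"))
        .agg(F.count("claim_id").alias("open_claims"))
    )
```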
“After moving to dbt, our job times cut down from hours to minutes,” says Douglas. “By integrating dbt into our pipeline, we’ve streamlined the way we promote and manage code.”
Another advantage of dbt is its administrative APIs, which the team leveraged to build a robust CI/CD platform. Each code push triggers builds, tests, and documentation generation. Artifacts are stored in test repositories, making it easier to audit changes and track lineage.
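USAA's CI/CD implementation isn't shown in the article, but the general pattern looks roughly like this: on each push, a pipeline step calls the dbt Cloud Administrative API to trigger a job, waits for it to finish, and fails the build if the run errors. In the sketch below, the host, account and job IDs, and the token are placeholders; the endpoints follow the documented v2 API.

```python
# ci_trigger_dbt_job.py
# A rough sketch of a CI step that triggers a dbt Cloud job via the
# Administrative API and waits for it to finish. IDs, host, and token
# are placeholders read from the CI environment.
import os
import time

import requests

DBT_CLOUD_HOST = "https://cloud.getdbt.com"
ACCOUNT_ID = os.environ["DBT_CLOUD_ACCOUNT_ID"]
JOB_ID = os.environ["DBT_CLOUD_JOB_ID"]
HEADERS = {"Authorization": f"Token {os.environ['DBT_CLOUD_API_TOKEN']}"}


def trigger_job(cause: str) -> int:
    """Kick off the job and return the run ID."""
    resp = requests.post(
        f"{DBT_CLOUD_HOST}/api/v2/accounts/{ACCOUNT_ID}/jobs/{JOB_ID}/run/",
        headers=HEADERS,
        json={"cause": cause},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["data"]["id"]


def wait_for_run(run_id: int, poll_seconds: int = 30) -> None:
    """Poll the run until it reaches a terminal state; raise on failure."""
    while True:
        resp = requests.get(
            f"{DBT_CLOUD_HOST}/api/v2/accounts/{ACCOUNT_ID}/runs/{run_id}/",
            headers=HEADERS,
            timeout=30,
        )
        resp.raise_for_status()
        status = resp.json()["data"]["status_humanized"]
        if status == "Success":
            return
        if status in ("Error", "Cancelled"):
            raise RuntimeError(f"dbt Cloud run {run_id} ended with status {status}")
        time.sleep(poll_seconds)


if __name__ == "__main__":
    run_id = trigger_job(cause="CI: triggered by code push")
    wait_for_run(run_id)
```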
Significantly, the modernization has enabled real-time data delivery. By integrating with Kafka, the data team now reacts instantly to events and schema changes—pushing updates into production within minutes.
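As a hedged illustration rather than USAA's actual code, an event-driven hookup might look like the sketch below: a small consumer listens on a schema-change topic and kicks off a dbt run when something relevant arrives. The topic name, message shape, broker addresses, and the reuse of the CI helpers from the earlier sketch are all assumptions.

```python
# schema_change_listener.py
# A simplified sketch of reacting to Kafka events: listen on a hypothetical
# schema-change topic and trigger a downstream dbt run when a change arrives.
import json

from kafka import KafkaConsumer  # kafka-python; confluent-kafka works similarly

from ci_trigger_dbt_job import trigger_job, wait_for_run  # helpers from the sketch above

consumer = KafkaConsumer(
    "source-schema-changes",                    # hypothetical topic
    bootstrap_servers=["kafka-broker-1:9092"],  # placeholder brokers
    group_id="dbt-schema-change-listener",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
    enable_auto_commit=True,
)

for message in consumer:
    event = message.value
    # Only react to the kinds of changes this project models downstream
    if event.get("change_type") in ("column_added", "type_changed"):
        run_id = trigger_job(cause=f"Schema change on {event.get('table')}")
        wait_for_run(run_id)
```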
Security at scale: When the data team began using Python models in dbt, they recognized the potential for security risks with third-party packages. To address this, they implemented a process to generate Software Bills of Materials (SBOMs). Now they can scan dependencies for vulnerabilities and license issues. They have real-time visibility into dbt projects, and only approved packages make it to production. Watch their Coalesce session to learn how they did it.
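The post doesn't name the exact tooling, but a stand-in for the SBOM-and-scan step could use the open-source pip-audit CLI: write a CycloneDX SBOM for the project's Python dependencies and let a non-zero exit code fail the pipeline when known vulnerabilities turn up. The requirements path and CI wiring below are assumptions.

```python
# scan_python_deps.py
# A stand-in sketch of the SBOM-and-scan step, using the open-source
# pip-audit tool. The requirements file path is a hypothetical example.
import subprocess
import sys

REQUIREMENTS = "python_models/requirements.txt"  # hypothetical path


def generate_sbom_and_scan() -> int:
    # Writes a CycloneDX SBOM for the declared dependencies; pip-audit exits
    # non-zero when known vulnerabilities are found, and that failure is what
    # keeps unapproved or vulnerable packages from reaching production.
    result = subprocess.run(
        ["pip-audit", "-r", REQUIREMENTS, "-f", "cyclonedx-json", "-o", "sbom.json"]
    )
    return result.returncode


if __name__ == "__main__":
    sys.exit(generate_sbom_and_scan())
```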
A complete modernization in less than a year
As a result of the transformation, the data team is delivering insights faster than ever. Leadership has noticed; in fact, the team was recognized with USAA’s Pacesetter Award, which honors leadership, innovation, and impact.
“Our modernization took less than a year, and we did it without increasing costs,” says Douglas. “For a company of our scale operating in the highly regulated insurance industry, that’s remarkable.”
Most importantly, the transformation has created a ripple effect across the organization. After seeing the impact of strong data practices firsthand, teams are rethinking how they approach data. The data team has set a new standard for identifying and eliminating bottlenecks, and others are keen to follow their lead.
Looking ahead, the data team is focused on realizing more efficiencies to cut costs. They’re well-positioned to make it happen: they’ve got the right tools, and they’re finding the right talent to help them keep their pipelines running faster and more reliably than ever.
“Moving to dbt helped us future-proof our data operations,” concludes Douglas. “We achieved faster time to market, and we did it safely.”
If you’re dealing with legacy systems and growing data demands, we can help you modernize your stack. Get in touch to book a demo, or sign up for dbt to connect your data warehouse and start building.