Tempo builds a virtual personal trainer with dbt Cloud and Stemma

This is the story of how Tempo leverages dbt Cloud and Stemma to optimize user experience and e-commerce operations through collaborative data development



30-40% efficiency improvement from dbt Cloud and Stemma’s workflow

Tempo is a rapidly growing smart home fitness company based in San Francisco, California. Offering an AI-powered home gym, the company’s technology uses 3D sensors and AI to track user motions and provide personalized form corrections and custom workout plans based on the data.

Founded in 2015, Tempo’s aim is to democratize the personal trainer experience for customers so they can better enjoy their home fitness experience. Data underpins all of the company’s activities, from user behavior to supply chain metrics.

Powering digital fitness, e-commerce, and operations with data

This data becomes truly valuable when transformed into advanced insights that help power business functions.

“We provide a lot of recommendations in real-time,” explained Chong Sun, Senior Director of Engineering at Tempo. “Let’s say a user is working out—we can say: maybe you should increase your weight or adjust your form to be more productive and effective.”

Data is similarly useful in analyzing Tempo’s supply chain, to help better understand how the business is running and optimize the e-commerce website. 

“In order for our primary marketing channel to be more effective, we need to understand where we should land users on our website and how we can optimize web pages,” Chong explained. “From there, we can also accurately understand how much money we’re spending to attract each user.”

Choosing tools for scale: testing, version control, and lineage

With its former data set-up, however, Tempo struggled to marshal the data it needed to power its various functions.

“With our previous solution, we had a big pain point where the data modeling was not well structured,” said Chong. “There was no way for us to maintain it and produce high-quality data with a small team.”

Replacing the old solution, then, was an opportunity to start fresh while adding important new functionality.

“There were a few key components that we were previously lacking,” explained Eric Ganbat, Senior Analytics Engineer at Tempo. “One was a way of testing data to make sure everything coming in is accurate. Second, we previously did not have a good way of doing version control, reviewing each other’s code, and establishing lineage.” With those capabilities in mind, Tempo searched for a new solution.  

While the team was set on using Snowflake and Fivetran in their data tech stack, “We needed something that could clean, test, and transform data in Snowflake,” Eric said.

Tempo’s decision to upgrade its data stack was also prompted by the company’s growth.

“Our previous solution was not fit for us anymore,” Eric added. “We reached a state where we had multiple data engineers and analysts together as a team, and we knew it was time for us to think about how to organize, optimize, and ultimately build better data models. That’s when we discovered dbt Cloud.” 

Modernizing the data stack with dbt Cloud and Stemma

In June of 2022, Tempo decided to bring dbt on board. The team already had Stemma in place as part of their goal to democratize data.

“From my past experience at Uber, I knew how important a data catalog is to a data-driven culture,” said Chong. “We needed one place where we can easily identify which tables are servicing other tables. We also wanted to democratize data so everybody could access it, not just the data team, so the catalog was a central effort for us.”

By having their data cataloged in Stemma, the team was able to make a smooth upgrade to dbt Cloud.

“Growing from five to 300 people meant we needed to rethink how we would maintain consistent data, version control, testing, discoverability—all the capabilities that we could scale up with dbt Cloud and Stemma,” said Eric. “Once we set up development and production environments, made sure dbt Cloud was talking to Snowflake, and that credentials were set up correctly, everything was smooth sailing.”

Using a combination of internal training and consultants with additional expertise, Tempo quickly onboarded and ramped up their use of dbt and Stemma. 

“Stemma provided a huge value by helping us clearly understand what’s going on with the current table transformations and set up our migration plan and priorities,” said Chong. “Without Stemma I’m not sure how we could have easily finished the migration.”

Building an enterprise data stack: data unification, testing, and velocity

With dbt Cloud and Stemma in place, the team at Tempo was able to realize a range of new benefits from their new data stack.

Unifying data for improved insights

“Using dbt, we are now transforming all our data models with proper testing and documentation,” said Eric. “We’ve covered a wide range of data sets from different products, generating events through user activity; additionally, we have our financial subscription, product usage data models, as well as marketing and supply chain data.”

With such a large amount of data now at their disposal, Tempo can fully benefit from comprehensive insights drawn from different dashboards spanning product usage, customer profiles, web traffic, and inventory.

“We needed a solution to streamline our data from all sources, and dbt Cloud has truly offered that,” said Eric. “It fits our needs in terms of data collection, understanding customer needs, and generating comprehensive reports and tables for internal stakeholders.”

Catching issues with unit testing

One of the most noticeable benefits of moving to dbt has been the ability to fix pre-existing issues with data, as Frank Wang, Data Engineer at Tempo, explained: “The biggest thing that I noticed moving to dbt is the range of unit testing functionality. A lot of our supply chain data is pretty fragile in terms of the underlying data quality. In the past, stakeholders would reach out to us and we would have to address issues manually.” 

With dbt features like regularly scheduled unit tests, the team can now catch and resolve issues automatically, before they reach customers or stakeholders.
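In dbt, checks like these are typically declared alongside the models in a YAML properties file and run with `dbt test`. A minimal sketch of what such a file might look like for supply chain data (the model and column names here are hypothetical, not Tempo's actual schema):

```yaml
# models/supply_chain/schema.yml -- model and column names are illustrative
version: 2

models:
  - name: stg_purchase_orders
    description: "Staged supply chain purchase orders"
    columns:
      - name: order_id
        description: "Primary key for a purchase order"
        tests:
          - unique
          - not_null
      - name: supplier_code
        description: "Must match a known supplier"
        tests:
          - not_null
          - relationships:
              to: ref('stg_suppliers')
              field: supplier_code
```

A scheduled dbt Cloud job can then run these tests on every build, so a failing `not_null` or `relationships` test flags a data quality issue before it surfaces in a downstream dashboard.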

“The first week we launched dbt within a portion of the supply chain data, we caught some pretty glaring data quality issues that we could easily fix,” said Frank, “and we were able to resolve them before any stakeholder noticed on the downstream dashboards.”

Velocity: shipping data faster

Thanks to better data quality and more reliable downstream reporting, Tempo’s data team now has the capacity to respond to the business’ demands faster and more effectively.

“We’re able to work much quicker to respond to ad hoc requests now,” said Frank. “That has been immediately obvious to our product analysts—we’re able to get more done in the same amount of time.”

Analysts at Tempo are now able to work smarter, rather than harder, significantly increasing productivity and decreasing delivery times.

“Now we’re able to change just one table or make one small transformation, and our changes will flow downstream,” added Chong. “I’d say it’s improved our efficiency by about 30 or 40 percent for certain data engineering tasks.”
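That downstream flow works because dbt models reference one another through the `ref()` macro, so dbt knows the dependency graph and rebuilds dependents in order whenever an upstream model changes. A minimal sketch, with hypothetical model names:

```sql
-- models/marts/fct_orders.sql (model names are illustrative)
-- Because ref() declares the dependency, a change to stg_orders
-- upstream flows into this model on the next `dbt run`.
select
    o.order_id,
    o.ordered_at,
    p.product_name,
    o.amount_usd
from {{ ref('stg_orders') }} as o
join {{ ref('stg_products') }} as p
    on o.product_id = p.product_id
```

Running `dbt run --select stg_orders+` would rebuild `stg_orders` and everything downstream of it, which is what lets a single small transformation change propagate automatically.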

Collaborative data development with dbt Cloud

Much of the efficiency gain came from dbt Cloud’s built-in workflow which allowed the team to collaborate while leveraging their existing skillset. 

“None of our team members had extensive expertise in command lines. So it’s really helped us to work faster and elevate our team’s capabilities. dbt Cloud is able to quickly compile and run everything we need,” said Eric. 

dbt’s embedded software development best practices introduced code reviews and collaboration to the data team’s previously siloed workflow. “Eric and I have been able to review each other’s code in a way that we just weren’t doing beforehand. I went from not looking at anything to reviewing everything he touches,” explained Frank.

“It’s gone from 0 to 100. Collaboration has been a huge win for us.”

Modeling business-critical data

With months of dbt experience now under their belt, the data team at Tempo has restructured the models providing new levels of insight into the workings of the business. 

“Our orders table is particularly important because it tells us what products we’ve sold,” said Eric. “That’s one of the key financial metrics the finance and marketing teams use, which in turn provide our growth metrics.”

Another foundational model combines all of Tempo’s product insights, collecting information such as who is doing what workouts.

“The unified workout table is a big one,” added Frank. “Whenever we’re doing any sort of product testing or feature development, that’s the table that always gets referenced.”

Managing the data development lifecycle with Stemma

All of Tempo’s tables and dashboards are documented and observable through Stemma. With its strong lineage capabilities, Stemma supports common operational use cases in data engineering.

“Stemma has been a very powerful tool for us to quickly identify work that needs to be done to address new requests. If we need to change a data field, Stemma helps look upstream to find dependencies and trace back to the source so we can make requests of engineering to rev data quickly,” said Chong. “Lineage tracking also plays a role in avoiding breaking changes and understanding the downstream impacts of changing columns and tables.”

“I’ve been using column lineage quite often because column naming wasn’t consistent when we migrated. But with Stemma I can easily see when one column is derived from another column with a different name,” said Eric. “It’s helping me go back and clean up names after the change.”

Stemma has continued to help the team manage the development lifecycle of their data assets well after the migration. 

“Data table usage is another key feature of Stemma,” said Eric. “We had a case where we knew there were many tables not being frequently used. By looking at lineage and dashboard usage in Stemma, we were able to easily trim down about half of our tables that weren’t being frequently used.” 

Looking to the future: better models and real-time data

After a six-month migration, the team is pleased with the possibilities of their new data infrastructure. 

“We’re very happy with our current data setup,” said Chong. “Overall, we’re confident we have the right tech stack with dbt at the center, combined with Stemma and Snowflake. It satisfies all our current needs.”

Now that the data infrastructure is in place, Tempo’s Q2 plan is focused on optimizing and expanding use cases. With solid foundations in place, they are now looking to extend the benefits to improve data usage across other teams within the organization.

Refining Tempo’s data architecture

The first step towards making that a reality is ensuring they are doing as much as possible with the data Tempo collects.

“The question now is: how can we leverage the current data stack to improve the overall architecture?” said Chong. “We are now able to build much better models. And by building better data models, we can bring more value to our products and to the business.”

Boosting the customer experience through real-time data

Looking further ahead, Tempo is interested in using the power afforded by dbt to improve its real-time data capabilities.

“We’re looking into a number of real-time use cases,” explained Chong. “Once we get the data from our client side, we can rapidly move that information into a data warehouse, another third-party service, or downstream where real-time decisions can be made.”

Tempo is particularly interested in unlocking additional value for its customers: 

“Tempo is not just a smart home gym. It is a data-driven platform that helps its customers reach their fitness potential, enhance their well-being, and have fun while working out. By improving the platform, we can improve the way we use our customers’ data to personalize their training, optimize their performance, motivate their progress, and connect them with others.”