Aktify democratizes data access with Databricks Lakehouse Platform and dbt

This is the story of how Aktify uses Databricks and dbt to eliminate manual tasks and errors from its data transformations.

80% reduction in data engineering hours

95% reduction in time to onboard new employees

Six-figure annual savings in IT headcount costs

Aktify aims to help its clients convert their customers through conversational AI. Clients can use a conversational AI agent to conduct thousands of SMS conversations with sales prospects at the same time. With Aktify’s solutions, the average customer generates $227,000 in revenue per year and achieves a 19x ROI. To fine-tune its AI agents to client needs, Aktify must be able to drill into massive volumes of data and find overlooked insights. Using the Databricks Lakehouse Platform and dbt, Aktify has taken the manual effort and risk out of its data transformations, democratizing data throughout its organization. Stakeholders are making better-informed decisions—and Aktify is saving six figures per year on its IT operational costs.

Complex data dependencies prevent data democratization

Aktify’s customers use conversational AI agents to do the work of thousands of live customer service agents. Behind the scenes, Aktify works to make these AI agents as effective as possible. The insights Aktify and its customers need are hidden in the company’s massive data volumes. 

“A typical use case for our customers is to look at the data points in our software and wonder when and why their customers and prospects are dropping off their chat funnel,” said Brandon Smith, Director of Data and Analytics, Aktify. “My team and I help them figure out how they can best move leads from initial engagement to positive engagement, which is when they’re responding to texts and want to schedule a call with their customer’s call center.”

To deliver this level of customer service, Aktify wants its teams to interact with data how and where they need it. A data scientist may need raw data, whereas an executive team wants it pre-aggregated so they can quickly find answers to their questions. Self-service is the goal, but in practice it requires the right tooling to prevent bottlenecks.

“I knew if we used SQL Server Integration Services (SSIS), everyone would be afraid to work with data in production because it would probably break something downstream,” Smith explained. “We would probably have people complaining that we broke their favorite dashboard—and fixing these kinds of problems can be difficult and time-consuming. We didn’t want complex dependencies at Aktify. We wanted to ‘democratize’ our data.”

Taking the risk out of data transformations

Seeking to minimize complexity around its data, Aktify implemented the Databricks Lakehouse Platform to make its data management simpler, more flexible and more cost-effective. The company also began using dbt to make data transformations easier and more reliable. Right away, Smith and his team felt more comfortable letting employees with minimal SQL skills work directly with data. Databricks makes data in production accessible to anyone on Aktify’s staff who might need to tap into it, while dbt provides testing and documentation that prevent downstream disasters.

“Data transformations are no longer a scary thing for Aktify,” Smith said. “From day one, we can let new employees loose on our data without wrapping a lot of red tape around their hands. With Databricks and dbt, we’ve been able to onboard employees into working with our data systems in half a day, compared to two weeks previously, and if they have some SQL experience, they’re usually comfortable with the systems in about three days. This wasn’t possible with any other system I’ve used in my career.”

Delta Lake on Databricks gives Aktify reliable performance, and its time travel functionality streamlines troubleshooting so the company can roll out new releases more quickly.

“With everything I have going on in Databricks and dbt, the time travel features in Delta Lake enable me to stop after making changes and see what actually changed,” Smith explained. “If anomalies surface, I can go back and spot when I introduced the issue and why. That makes fixing the problem so much faster and easier.” 
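
Aktify hasn’t published its notebooks, but the kind of Delta Lake time travel check Smith describes looks roughly like the sketch below. The table name and version number are hypothetical; DESCRIBE HISTORY and VERSION AS OF are standard Delta Lake features on Databricks.

```python
# Minimal sketch of a Delta Lake time travel check (table name and version are hypothetical).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # in a Databricks notebook, `spark` already exists

# Every change to a Delta table is recorded in its transaction log.
spark.sql("DESCRIBE HISTORY conversations_gold").show(truncate=False)

# Read the table as it looked before the latest run, then compare it to the current state.
before = spark.sql("SELECT * FROM conversations_gold VERSION AS OF 41")
current = spark.sql("SELECT * FROM conversations_gold")

# Rows present now but not in the earlier version show what the change actually did.
current.exceptAll(before).show()
```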

Building data pipelines in half a day

With Databricks and dbt, Aktify now finds the insights its customers need in a fraction of the time previously required. Instead of having its data science team build complex scripts to query databases—a process that could take three to five days—the company uses Databricks Utilities to find answers in less than a day. When a manager in Aktify’s customer success organization needed to know how much revenue the company was generating from the first 30 days of an account, Smith found an answer in hours.  
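
The 30-day revenue question is the kind of ad hoc request that can be answered directly in a notebook against lakehouse tables. A minimal sketch, with hypothetical table and column names, might look like this:

```python
# Hypothetical ad hoc query: revenue generated in each account's first 30 days.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in a Databricks notebook

accounts = spark.read.table("accounts")        # assumed columns: account_id, created_at
revenue = spark.read.table("revenue_events")   # assumed columns: account_id, amount, event_at

first_30_days = (
    revenue.join(accounts, "account_id")
    .where(F.col("event_at") <= F.date_add("created_at", 30))
    .groupBy("account_id")
    .agg(F.sum("amount").alias("revenue_first_30_days"))
)
first_30_days.show()
```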

“When colleagues come to me and ask for new tables, pipelines or extractions, I can crank them out in half a day,” reported Smith. “It’s incredible how quickly I can engineer data pipelines with Databricks and then use dbt to model the data so it will surface the right metrics. Our stakeholders see new information they couldn’t access before and they’re making better decisions, whereas before they always complained that they were flying blind.”
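
As a rough illustration of the modeling step Smith describes, a dbt Python model on the Databricks adapter could aggregate lakehouse data into the metrics stakeholders query. The model, table and column names here are hypothetical.

```python
# models/marts/engagement_metrics.py -- hypothetical dbt Python model on dbt-databricks;
# `stg_conversations` is an assumed upstream staging model.
from pyspark.sql import functions as F


def model(dbt, session):
    # Materialize the result as a table in the lakehouse.
    dbt.config(materialized="table")

    # dbt.ref() returns the upstream model as a Spark DataFrame.
    conversations = dbt.ref("stg_conversations")

    # Per-client engagement metrics that can feed dashboards directly.
    return (
        conversations.groupBy("client_id")
        .agg(
            F.count("*").alias("total_conversations"),
            F.avg(F.col("responded").cast("int")).alias("response_rate"),
            F.sum("calls_scheduled").alias("calls_scheduled"),
        )
    )
```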

Databricks and dbt help Aktify meet its data needs without busting its budget. At one point, Smith was the only member of the company’s data team, but he kept up with the demand for new insights. “dbt cuts out so many of the little tasks I used to have to do,” he said. “Without dbt, I would need at least two more people on my team. It’s saving us six figures per year by scaling the impact of the people we already have. If you’re still using SSIS, you’ll never meet your data needs because you can’t pivot fast enough and it’s too error-prone. dbt is a must-have tool.”

Smith compares Databricks favorably to competing solutions. “Other platforms may give you similar query speeds, but that’s not the deciding factor,” Smith said. “With Databricks, you can stand up new solutions much more quickly because the open-source tooling removes barriers. That’s the kind of speed that’s most important to us.”

Using the Databricks Lakehouse Platform and dbt, along with a one-click Tableau connector in Partner Connect that lets all users spin up their own Tableau dashboards, Aktify is getting better data processing performance from a simpler technology footprint. “We no longer stage data in Snowflake because all our data, including about 85 gigabytes of operational data, is instantly available in the Databricks Lakehouse,” Smith concluded. “Partner Connect also helps our team discover new data and AI solutions, bringing us closer to our vision of organization-wide data literacy.”