Kaizen Gaming is one of the biggest GameTech companies in the world and the owner of the premium online sports betting and gaming brand, Betano. Today, Kaizen Gaming counts more than 3,000 people across the globe and has been consistently recognized for operational excellence and top-tier customer experience. In 2024 and 2025 it received the “Operator of the Year” awards at both the EGR Operator Awards and the SBC Awards, the iGaming industry’s most prestigious accolades.
Kaizen Gaming serves a large and growing customer base across multiple markets in Europe, the Americas and Africa. The company processes a high volume of transactions worldwide and must stay compliant with complex regulatory frameworks across jurisdictions.
To support these demands, Kaizen Gaming’s data organization plays a central role across the business, from marketing and engineering to finance, risk, and compliance. As the company’s footprint and ambitions expanded, the data organization has grown in lockstep. In fact, Kaizen Gaming has grown so quickly that the data team itself has more than doubled in less than six months.
With that expansion came new expectations for Kaizen Gaming’s analytics environment along with a new stage of maturity: systems and practices that had worked well at earlier stages needed to evolve to support coordination and data reliability at scale.
New demands for reliability, visibility, and operational efficiency
Kaizen Gaming’s broader data organization is structured around several domain-aligned teams. Each team maintains a high degree of autonomy, which allows them to build deep domain expertise and move quickly.
As Kaizen Gaming’s analytics organization grew, teams updated their workflows to meet their domain-specific priorities. Over time, this led to a system of well-functioning individual components, but without a shared framework to connect them. For example, pipelines were primarily developed in notebooks using multiple supported languages, including SQL, Python, and Scala.
The same flexibility that enabled domains to move quickly also made it challenging for the team to establish consistent coding practices or enforce SQL-first development broadly. As a result, three distinct challenges emerged:
- Data reliability. As Kaizen Gaming’s workflows became more complex and interconnected, validating data quality earlier in the lifecycle grew increasingly important, highlighting the need for more integrated and proactive validation ahead of downstream reporting and analytics.
- Operational efficiency. Logic duplication across domains made it hard to keep metrics consistent: investigating issues was time-consuming, and testing often happened late in the development cycle, forcing the team to fix problems that could have been caught earlier.
- Cost and operational overhead. Managing and maintaining pipelines required a growing level of manual coordination, from updating logic to understanding dependencies across workflows. As data volumes increased, the need for a more scalable and efficient approach became apparent.
It was clear that certain aspects of the analytics environment — particularly automation, quality, standardization, and cross-team consistency — needed to change to meet new organizational demands. This was especially evident when more analytics workflows began running during off-peak hours. Without a shared framework to guide development and investigation, troubleshooting was painful: diagnosing failures or subtle inconsistencies took significant effort, and complex issues consumed even more resources.
Kaizen Gaming’s analytics practices needed to evolve. As its systems became more interconnected, even small inconsistencies could require increasing effort to identify and explain. Equally difficult was the matter of assessing downstream impact, such as the effects of new features or changes. To maintain trust in the data, the company needed clearer structure, stronger validation, and shared development patterns.
To address these challenges, Kaizen Gaming decided to establish a more unified approach to analytics. Its goals were to strengthen data quality, improve operational efficiency, and provide a consistent foundation across teams. The data team chose dbt as a core framework.
A unified analytics framework with dbt
To optimize the team’s data workflows, Yannis Lazaridis, Analytics Engineering Lead at Kaizen Gaming, began exploring the dbt platform to understand how it compared with other approaches.
The team soon found that dbt offered a more structured and transparent alternative to notebook-driven workflows. dbt’s modular modeling, version-controlled development, and explicit dependencies proved particularly valuable in enabling the broader data organization to contribute without needing to navigate Python- or Scala-heavy pipelines.
The data organization quickly adopted dbt as a common foundation across domains, establishing shared practices while maintaining flexibility. As a result, Kaizen Gaming saw the following improvements:
- A unified modeling approach. Kaizen Gaming implemented a layered structure for sources, staging, intermediate models, and marts, giving its data engineers a predictable way to develop, review, and maintain analytics models.
- Comprehensive, automated testing. Data quality tests now run automatically with every pull request; they combine built-in dbt tests, dbt-expectations, and custom SQL checks. This proactive approach surfaces issues sooner in the development lifecycle, before changes reach production.
- CI/CD for data quality. The team introduced Slim CI on every pull request. They also adopted state-based execution, which validates only the models that were affected by a given change during releases. As a result, the team has reduced unnecessary runtime while enabling a more automated deployment process. They now have stronger confidence in data quality while working more efficiently.
- Code-linked documentation. Models and metadata are documented directly in YAML and kept in sync with code through CI enforcement — a major improvement for data governance. This shared understanding of data structures across domains has sped up onboarding for new team members.
- Infrastructure-as-code for dbt. Jobs, environments, variables, and connections are managed through Terraform. The team can ensure consistent environment replication and stronger governance across the analytics landscape.
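The article doesn’t show Kaizen Gaming’s actual Terraform configuration, but managing dbt jobs and environments as code, as described above, might look roughly like this sketch using the dbt Cloud Terraform provider. The project variable, resource names, and attribute values here are illustrative assumptions, not the company’s setup:

```hcl
terraform {
  required_providers {
    dbtcloud = {
      source = "dbt-labs/dbtcloud"
    }
  }
}

# Hypothetical production deployment environment for an analytics project
resource "dbtcloud_environment" "prod" {
  project_id  = var.analytics_project_id
  name        = "Production"
  type        = "deployment"
  dbt_version = "latest"
}

# Hypothetical scheduled job that builds and tests the whole project
resource "dbtcloud_job" "daily_build" {
  project_id     = var.analytics_project_id
  environment_id = dbtcloud_environment.prod.environment_id
  name           = "Daily build"
  execute_steps  = ["dbt build"]
  triggers = {
    github_webhook       = false
    git_provider_webhook = false
    schedule             = true
  }
}
```

Declaring jobs, environments, variables, and connections this way is what enables the consistent environment replication the team describes: a change to the configuration is reviewed like any other code change and applied identically everywhere.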
“With dbt, we can do data quality checks before a job is completed,” says Thomas Antonakis, Principal Analytics Engineer at Kaizen Gaming. “We have visibility into what goes wrong and can proactively stop a flow without interfering with our production tables.”
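In dbt, checks like the ones Antonakis describes are declared in YAML alongside the models themselves. A minimal sketch of a schema file combining built-in tests with dbt-expectations might look like the following; the model and column names are invented for illustration and do not come from Kaizen Gaming’s project:

```yaml
# models/staging/stg_transactions.yml (illustrative)
version: 2

models:
  - name: stg_transactions
    description: "One row per customer transaction."
    columns:
      - name: transaction_id
        description: "Primary key for the staging model."
        tests:
          - unique
          - not_null
      - name: amount
        tests:
          # Test provided by the dbt-expectations package
          - dbt_expectations.expect_column_values_to_be_between:
              min_value: 0
      - name: market
        tests:
          - accepted_values:
              values: ['EU', 'AMER', 'AFRICA']
```

On a pull request, a Slim CI job can then build and test only what a change touches, for example `dbt build --select state:modified+ --defer --state ./prod-artifacts`, so failing tests surface before anything reaches production tables.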
Reduced costs, faster data delivery, and increased productivity
With a standardized analytics framework in place, Kaizen Gaming began to observe clear, measurable improvements across its analytics workflows. These gains illustrate how reduced complexity, improved execution efficiency, and more consistent development practices can benefit an organization:
Workflow simplification
With dbt, the number of workflows required to support core use cases decreased by over 90%. The consolidation of analytics workflows has reduced operational complexity, making it easier to understand, operate, and maintain pipelines while reducing cognitive load for engineers.
Improved runtime and data availability
Pipeline execution times have decreased by about 60%. In fact, data is delivered more than an hour earlier than before. Downstream teams, such as product, marketing, and commercial functions, are accessing insights sooner and making faster decisions based on fresher data. “We’ve seen runtimes decrease from over two hours to around 40 minutes,” says Stefanos Nikolaou, Principal Analytics Engineer at Kaizen Gaming.
Lower operational costs
More efficient execution and reduced redundancy have led to a significant reduction in daily pipeline costs. Overall processing costs decreased by approximately 60% while supporting the same data volumes and analytical use cases as before.
Importantly, these results were achieved within a single team that rebuilt its workflows using dbt. As the broader analytics engineering and BI teams adopt similar shared practices, the data team anticipates further gains in efficiency, consistency, trust, and business value.
In the long term, the shift to dbt has also increased architectural flexibility. “dbt gives us flexibility. By adopting it, we can remain vendor-agnostic,” says Nikolaou. “If we decide to expand our data stack in the future, it will be easier to do so.”
Looking ahead
With a standardized framework, testing strategy, and governance in place, Kaizen Gaming has established dbt as a core pillar of its analytics capabilities. What began as an effort to improve data workflows has evolved into a stronger data foundation with meaningful improvements in cost efficiency.
Ever since the data team adopted dbt, other teams at Kaizen Gaming have seen clear advances in data quality and in how analytics workflows are developed, validated, and operated. As a result, interest in adopting the same framework has grown, including among teams that had not previously worked with dbt.
As the analytics organization continues to transform its workflows, dbt will be rolled out more broadly to support earlier data quality checks, greater consistency, and increased trust in data production. Kaizen Gaming is ready to keep growing and views dbt Labs as an instrumental partner in its journey.