Traditional data warehouses are well suited to consolidating data and enabling BI teams to explore and analyze it, but they are not a great solution for automating operational reactions in real time.
“Software is eating the world”, Marc Andreessen famously said, and data is clearly changing it. Awesome things happen when these two worlds come together.
With the rise of software and the cloud, every competitive company has evolved and reached a high level of digitalization. Now, data has become the fuel for most of the latest and biggest changes and disruptions.
Over the last few years, there have been major advances in technology that have changed the data and analytics space. Lots of companies are adopting traditional data warehouses as their single source of truth, deploying ML models to predict what will happen next, and building data streams that capture millions of events per minute.
However, despite all these advances in data tools, there is still a major challenge: knowing what is happening right now and taking immediate action.
The best companies out there are well known for being able to execute faster than their competition. Teams have adopted new tools and processes that let them implement changes and innovations quicker than ever.
However, their ability to see what is happening is still slower than you would expect. Some of the most advanced companies have, at best, a 20- to 30-minute delay on what is happening in their business. Others must wait hours, or a full day, to get an outdated view.
More often than not, it is not the “real” view either, but a pre-aggregated and summarised version of what is actually going on. And even that is only possible with a tremendous investment in time, tools, people and, ultimately, money.
For the last decade, the Big Data movement has been about capturing a lot of data and using it to make better decisions. What’s new about the Operational Analytics approach is that it focuses not just on finding the insights, but on enabling reactions to those insights in real-time.
Thankfully, data is becoming more and more operational.
Think, for example, about the explosion of e-commerce and the logistics behind stock management and drop-shipping, or about how car-sharing apps calculate their trips. But this has come at a cost: these first use cases required building complex data pipelines, deploying and maintaining lots of infrastructure, and recruiting big data teams, which often end up maintaining software and existing data infrastructure rather than enabling new use cases.
Solving this today boils down to using the right technology and enabling the right people to use it.
Firstly, solutions that are effective at providing a single source of truth (such as traditional data warehouses) are not effective at real-time data ingestion, real-time aggregation, or filtering on the fly.
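To make that concrete, here is a minimal sketch, in Python, of the kind of always-fresh rolling aggregation that operational use cases demand: revenue over the last 60 seconds, updated on every incoming event. The event stream and its fields are hypothetical; the point is that the answer must be current the instant you ask for it, not whenever the next batch load finishes.

```python
import time
from collections import deque

# Minimal sketch: maintain a rolling 60-second revenue total over a stream
# of purchase events. The event source and fields are hypothetical.
WINDOW_SECONDS = 60
window = deque()  # (timestamp, amount) pairs currently inside the window
total = 0.0

def on_event(timestamp: float, amount: float) -> None:
    """Ingest one event and evict anything older than the window."""
    global total
    window.append((timestamp, amount))
    total += amount
    cutoff = timestamp - WINDOW_SECONDS
    while window and window[0][0] < cutoff:
        _, old_amount = window.popleft()
        total -= old_amount

# Simulated stream: one purchase roughly every second.
for _ in range(5):
    on_event(time.time(), 9.99)
    print(f"revenue in last {WINDOW_SECONDS}s: {total:.2f}")
    time.sleep(1)
```

A warehouse can of course compute this sum, but only over the data it has already loaded; keeping it true to the last second is what requires a different engine.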
Secondly, those tools are not designed so that engineering teams, who are better suited than anyone to turn that data into business value, can build data-intensive applications on top of them; making data accessible through APIs takes a lot of extra development effort.
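To illustrate that extra effort, below is a hedged sketch of the glue code teams typically end up writing: a small HTTP endpoint that runs an aggregation query and returns JSON. SQLite stands in for the warehouse here, and the table and endpoint names are made up; a real deployment would still need authentication, rate limiting, caching and connection pooling on top of this.

```python
# Sketch of the glue layer teams hand-build to expose warehouse data as an
# API. SQLite stands in for the warehouse; table/endpoint names are made up.
import sqlite3
from fastapi import FastAPI

app = FastAPI()

@app.get("/events/summary")
def events_summary(product: str):
    conn = sqlite3.connect("warehouse.db")
    try:
        row = conn.execute(
            "SELECT COUNT(*), COALESCE(SUM(amount), 0) "
            "FROM events WHERE product = ?",
            (product,),
        ).fetchone()
        return {"product": product, "events": row[0], "revenue": row[1]}
    finally:
        conn.close()

# Run with: uvicorn app:app
# Note everything still missing: authentication, rate limiting, caching,
# pagination, connection pooling, monitoring...
```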
And last but not least, these solutions struggle to provide ultra-low-latency results over large quantities of data and under a large number of concurrent requests.
We have seen lots of different teams trying to deliver operational analytics using Amazon Athena, Redshift, Google BigQuery or Snowflake. While we believe those are great data warehousing tools, Tinybird excels at enabling developers to deliver Operational Analytics in real time and at scale, without all the hassle, through secure and dynamic APIs.
It truly enables a new mindset, in which Data Engineers and Developers can make Operational Analytics a reality with very little effort.
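As a sketch of how little effort that can be: once a pipe is published as an endpoint, consuming it is a single authenticated HTTP call. The pipe name, token and parameter below are placeholders, and the URL follows the `v0/pipes/<name>.json` pattern from the Tinybird docs.

```python
# Hedged sketch: consuming a Tinybird pipe as a dynamic API. The pipe name
# ("top_products"), token and parameter are placeholders for illustration.
import requests

TOKEN = "<your_read_token>"
resp = requests.get(
    "https://api.tinybird.co/v0/pipes/top_products.json",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"date_from": "2021-01-01"},  # dynamic query parameter
)
resp.raise_for_status()
for row in resp.json()["data"]:
    print(row)
```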
We are excited about the space and we can’t wait to see what’s next. Request access to Tinybird to understand how Operational Analytics can transform your business.