
3 reasons fast databases will never be fast enough

If you look at the data management market (e.g. the Data Platforms Map from the 451 Group), you are likely to be overwhelmed by the diversity of today’s technologies.

Many decision makers find this broad range of choices too complex to evaluate. As a consequence, they stick with the ‘same old, same old’ of big brands such as Oracle, Microsoft or IBM and reassure themselves that these vendors will adopt disruptive technologies sooner or later – in the long term they will probably integrate them, or simply acquire suitable smaller market players. Are you a decision maker who thinks that way? Read on to find out the pitfalls of this strategy.

I probably do not need to emphasize how important data management has become in recent years, nor that its strategic role will only increase in the future. Data-driven automated decisions are helping organizations across the globe, and artificial intelligence algorithms are revolutionizing the way we think about complicated problems. Magazines are full of exciting use cases that improve our lives and change the market rules for nearly every industry. So let’s take it as given that collecting, processing and analyzing data is one of the crucial factors for a company’s competitiveness.

Coming back to the large vendors: they tell their customer base that their broad software stacks ensure compatibility, that acquired technologies have been or will be integrated deeply into their huge platforms, and that buying everything from one vendor has lots of advantages. The majority of experienced decision makers seem tired of and disappointed by these promises. The new generation, the millennials, do not even consider the big players viable software suppliers. They prefer best-of-breed solutions that solve their challenges in an optimal way. They won’t believe the marketing until they have tested the software. And they want dedicated service providers who are approachable and deliver first-class support and enablement.

Don’t believe the fallacy that increasing hardware power will automatically fix your current performance limitations, or that ordinary solutions are ‘fast enough’ just because they can cope with your current needs. Here are the three most important reasons why this is a dangerous calculation, and why fast databases will never be fast enough:

1) First Never Follows

If you want to be number one in your market, then you should follow that ad slogan of our client Adidas. Settling for second-best puts you at a competitive disadvantage compared to those who choose the best. Leveraging data has become one of the most important competitive advantages, which means you cannot be satisfied with a fast database – you will need the fastest. Remember the complex landscape from above – there are different ‘fastest’ solutions for different use cases. So don’t believe the vendors (not even my own company) – test the different options in a thorough proof of concept.
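The mechanics of such a proof of concept can be kept simple: run the same representative analytical query against every candidate system and compare the timings. Below is a minimal sketch in Python, assuming DB-API-style connections; the in-memory sqlite3 databases and the placeholder query are stand-ins for your own candidates and workload, not a statement about any particular vendor.

```python
# Minimal PoC benchmark harness (sketch). The sqlite3 in-memory databases
# below are only runnable stand-ins; swap in your own drivers, connection
# strings and a query that reflects your real analytical workload.
import sqlite3
import time

def run_candidate(name, connect, query, repetitions=5):
    """Time the same analytical query several times against one candidate."""
    conn = connect()
    timings = []
    for _ in range(repetitions):
        start = time.perf_counter()
        conn.execute(query).fetchall()
        timings.append(time.perf_counter() - start)
    conn.close()
    print(f"{name}: best {min(timings):.4f}s, avg {sum(timings) / len(timings):.4f}s")

if __name__ == "__main__":
    # Hypothetical candidates; in a real PoC each entry would be a different system.
    candidates = {
        "candidate_a": lambda: sqlite3.connect(":memory:"),
        "candidate_b": lambda: sqlite3.connect(":memory:"),
    }
    query = "SELECT 1"  # replace with a representative query on your own data
    for name, connect in candidates.items():
        run_candidate(name, connect, query)
```

The important part is not the harness but the input: only a query and a data set that mirror your real workload will tell you which system is actually the fastest for you.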

2) Data will outgrow you

Perhaps you can predict the data growth of your current data sources. That may be true today, but there will probably be changes in your company that lead to a data explosion. IoT products, sensor logs, social media data or fine-grained supply chain monitoring from your suppliers to your customers – these are all reasons why data growth will never be linear again.
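To make that concrete, here is an illustrative back-of-the-envelope projection in Python. The starting volume and growth rates are assumed numbers, chosen only to show how quickly compounding growth outruns a linear plan; they are not a forecast for any real environment.

```python
# Illustrative projection (assumed numbers, not a forecast):
# linear growth of 2 TB/year versus 40 % compounding growth, starting at 10 TB.
start_tb = 10.0
linear_increment_tb = 2.0
compound_rate = 0.40

linear = start_tb
compound = start_tb
for year in range(1, 8):
    linear += linear_increment_tb
    compound *= 1 + compound_rate
    print(f"year {year}: linear {linear:5.1f} TB, compounding {compound:6.1f} TB")

# After 7 years the linear projection sits at 24 TB,
# while 40 % compounding already exceeds 100 TB.
```

A capacity plan sized for the linear line looks comfortable for the first year or two and hopeless shortly after.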

In the past few years, it has become feasible to process more and more data (e.g. via NoSQL solutions, data grids and streaming systems) and to store it cost-effectively (especially thanks to Hadoop). But in many cases, that data can no longer be analyzed appropriately.

More data means more to analyze, so even if you are happy with your database performance today, you might not be tomorrow.

3) Limitations and constraints stand in the way of your success

Have you ever been in a situation where your existing database held back your imagination? For instance, you would not even think of analyzing your entire customer data from the past 10 years because you are used to being limited to the past 3 months.

One of my close friends works for a company with a special committee that meets once a week to decide whether new analyses may be executed on their existing Oracle database. And a DBA at one of our clients told us that if he had thrown a certain query against their production Teradata database, he would probably have been fired immediately. Can you imagine how this constrains innovation?

Many questions go unanswered in analytics teams, many applications are never implemented due to such limitations, and I assume that lots of business models have never been created as a consequence.

Summary

Ultra-fast data analysis at the speed of thought leads to new questions, new analyses, new applications and eventually more success. Your requirements will change tremendously tomorrow. Accessible data volumes will explode. And your competitors will not hesitate to pass you by. So don’t take the easy decision to stick with your existing vendors. Don’t hesitate to invest in market research and testing projects. Make sure you are using state-of-the-art technology, and prepare yourself for the future.