Insights Blog

Interview with the Inventor of Our Analytic In-Memory Database

Looking back on the last 15 years with the founder of Exasol and inventor of the analytic in-memory database.

15 years have passed since Exasol was founded – 15 exciting and eventful years in a very dynamic and competitive market environment. So, it’s time to review its history in an interview with the software architect, co-founder and father of Exasol’s in-memory database technology, Falko Mattasch.

Mr. Mattasch, how did the idea behind Exasol really come about?
In the 1990s, we were researching intelligent algorithms in the parallel processing department of the University of Jena in order to get the maximum performance out of parallel hardware (MPP, HPC computing). A research project with IMS Health, a leading provider of information and technology services for customers in the health sector, became the starting point from which the idea for the company emerged. During the project, we were able to speed up a specific data-processing problem on a massively parallel processor (the MasPar with 1,024 cores) and thereby prove how data analytics can benefit from parallel processing.

Was the focus back then already on an analytic database?
No, not at all. The research work was centered on simple data aggregation with fixed data, on a single flat table as it were. If there were any changes to the structure of the data, the program had to be compiled again from scratch. Only afterwards did I implement a completely new prototype for a parallel database system, away from university and in private, which was subsequently to become the basis for the technology behind Exasol.

So, you eventually invented the in-memory database in 2000?
Well, back then you still couldn’t really call it a complete database. Because we developed everything ourselves rather than building on existing open source database cores, it took us a few years to get there. By the same token, however, this led to an extremely well-thought-out software system that offered optimum performance, scaled extremely well and was also tuning-free. Incidentally, our project was initially named DWA, short for Data Warehouse Accelerator. As a small development company with few developers, we started with just a manageable SQL language set. However, we can make the claim that our product was an in-memory-based data analysis tool right from the start – at a point when none of the other market players were even mentioning the concept.

How could you justify putting the focus on in-memory at that time, especially when RAM prices were still so high?
I have an interesting story to tell about that, especially because back then I was engaged in quite an argument with our co-founder. His assumption was that data analysis is always carried out on a fixed set of data which barely changes. Yet, as the main contact for the technology at the time, I knew that data is continuously changed and updated, so we ultimately designed the system so that not all data had to be stored in main memory. The main memory served – as in an operating system – as a large cache, whose content was swapped out to the hard drive whenever not everything fit into the cache. Through this method, you can analyze far more data much more rapidly than with pure in-memory databases like SAP HANA. Nowadays, everyone talks of an in-memory era. But back then, a big fall in the price of memory was not something we were banking on.
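The design described here – RAM acting as a large cache in front of the disk, with cold data evicted when memory fills up – can be illustrated with a minimal sketch. This is not Exasol code; it is a toy LRU buffer manager in Python, with the "disk" simulated by a plain dictionary, purely to show the principle:

```python
from collections import OrderedDict

class BufferManager:
    """Toy sketch of the cache-based design described above:
    RAM holds hot data blocks, and least-recently-used blocks
    are spilled to disk when memory is full. Not Exasol code."""

    def __init__(self, ram_blocks):
        self.ram_blocks = ram_blocks   # how many blocks fit in RAM
        self.ram = OrderedDict()       # block_id -> data, kept in LRU order
        self.disk = {}                 # simulated hard drive

    def read(self, block_id):
        if block_id in self.ram:              # hot: serve from memory
            self.ram.move_to_end(block_id)
            return self.ram[block_id]
        data = self.disk.pop(block_id)        # cold: reload from disk
        self._put(block_id, data)
        return data

    def write(self, block_id, data):
        self._put(block_id, data)

    def _put(self, block_id, data):
        self.ram[block_id] = data
        self.ram.move_to_end(block_id)
        while len(self.ram) > self.ram_blocks:
            victim, vdata = self.ram.popitem(last=False)  # evict LRU block
            self.disk[victim] = vdata                     # spill to disk

bm = BufferManager(ram_blocks=2)
bm.write("a", [1, 2])
bm.write("b", [3, 4])
bm.write("c", [5, 6])          # RAM full: "a" is spilled to disk
print("a" in bm.disk)          # True
print(bm.read("a"))            # [1, 2] -- transparently reloaded into RAM
```

The key point of the design is transparency: callers just read blocks, and whether a block comes from RAM or disk is an internal detail – exactly why such a system can handle datasets larger than main memory.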

Who was the first to use your technology?
We were lucky enough to collaborate very early on with one of the best data mining teams in Europe, from the leading catalogue retailer Quelle, which was running one of the largest Oracle RAC systems of its time. We got great feedback from the team and were able to develop the product in a very market-oriented way. Technologically, we were therefore the uncontested leader in the database market from very early on. However, I must admit that we made the typical mistake of a German firm by putting a lot of effort into engineering and not into marketing. Competitors like Netezza were much more adept in this area, even though I believe they were not on a par with us in terms of technology. So, Exasol unfortunately remained a ‘hidden champion’ for some time. We therefore count ourselves lucky that our main investor proved to have such staying power.

What was the breakthrough in the development of Exasol as a company?
It was only in 2008 that we officially entered the market for the first time and were able to prove our technological leadership to the world with our TPC-H results. We were certain that customers would be queuing up outside our door. However, without experienced marketing and sales departments this was not only a mistaken belief but also clearly naïve. Our breakthrough came only with the replacement of our CEO at the time and our restructuring into a commercially oriented organization. We worked on our market position, developed a solid sales pipeline and eventually managed to grow the company, adding new customers year after year. Even though it was still difficult back then to market forward-looking technology in a conservative market like Germany, we persevered and ultimately became an established vendor in the market. And thanks to the many millions of euros and dollars that competitors such as SAP or Oracle have spent on marketing, it is now considered ‘normal’ to have in-memory database technology. Therefore, we no longer have to work so hard to convince our customers of the merits of in-memory analytics.

Are you proud of what you have achieved?
It’s fair to say that our development team was already proud right at the start. It has always been something special to develop such a complicated software solution which was technologically capable of competing with others, despite being a small German company active in a competitive market. For us developers, the technology was of course always the most important thing and it was also very fulfilling. But I must also admit that it is something quite different when you realize how many firms throughout the world are using our software for so many different data-driven applications. And when you also hear positive feedback and how satisfied users are with our technology, then it naturally makes you very proud to contribute to such a project.

What does the future look like for Exasol technology?
The data management market has become extremely agile in the last few years. Due to global digitization, new data sources are constantly emerging and the number of different fields of application is increasing massively. There are countless possibilities for further development in this area. The trend is now moving towards specialized best-of-breed solutions, which provide the ideal answer to specific problems. These systems are then interconnected into complex data management ecosystems. For example, NoSQL databases are used for operational data-processing systems, whose data is saved in a kind of data lake based on Hadoop, subsequently analyzed in a structured relational data warehouse or data mart, and then passed on to the analysts. This is why we have developed our technology in the past few years in three key areas: performance, analytical capability and the capacity to integrate into complex system landscapes. As a result, we are convinced that we can build on our technological leadership and persuade more and more businesses to use Exasol’s fastest analytic in-memory database in order to build data-driven businesses and gain a strong competitive advantage.
