How to avoid common mistakes with your data migration


Deciding where you keep your data and the platform you use for your analytics is vital for your business. But with so many options and potential pitfalls, choosing the best route for data migration can seem daunting. In our latest research paper, “Accelerating business analytics performance in 2021”, we investigate the key challenges you’ll face – and importantly, assess how you can avoid them.

Predicting your data needs in the future

One of the main challenges with any data migration is knowing not only whether your chosen solution is suitable for your business today, but whether it will still meet your evolving needs in the future. As one data leader we interviewed put it, “nobody wants to re-platform in 2-3 years’ time.”

Focusing too much on cost savings

Under budget pressure and with an understandable focus on the bottom line, organisations often simply opt for the platform that saves them the most money – a cloud migration, for example. But this can be a false economy: you need to make sure you have enough compute power to deliver the analytics and insights you need. And there is a range of practical considerations that are often overlooked.

As Peter Jackson, Chief Data Officer at Exasol, says, as well as cost, “risk is something else to consider. Security is another huge consideration for a migration. And you also need to think about whether you have the right skills to carry it out.”

Forgetting about analytics speed and performance 

“The performance and analytics side is only going to get bigger,” says one data leader in the food industry. Many organisations overlook the speed and power of the database needed to do advanced data analytics. Yet this is vital to meet increasing business demand for insights – a trend that is only set to accelerate.

As Peter Jackson at Exasol says, “if you don’t have a high performing database, you won’t be able to cope with the variety of data. You won’t be able to cope with the volume of data. And you certainly won’t be able to cope with the velocity you need.”

Accelerating performance without re-engineering

There’s no one-size-fits-all approach to a migration, but we believe the key to success is shifting how you approach the challenge. What’s key is having the ability to add an acceleration layer with your analytics database, without needing to re-engineer your legacy systems.

To do this, we recommend looking at an analytics database with columnar storage, massively parallel processing, and in-memory analytics.

Columnar storage

Columnar storage makes analytical queries quicker by rotating traditional row-based data 90 degrees, so it is stored in columns. This means you can get straight to the data you need rather than trawling through the whole table. It’s what enables many organisations to do real-time analytics, running queries in a matter of seconds. One data leader says it’s given them the ability to “effectively integrate and aggregate data from lots of different sources in the same data lake.”
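To make the idea concrete, here’s a minimal, illustrative sketch in Python (not how Exasol itself is implemented): the same table held row-wise and column-wise, and an aggregate query that only needs one column. The table and values are invented for the example.

```python
# Row-oriented layout: to total "revenue", every whole row is touched.
rows = [
    {"order_id": 1, "region": "EMEA", "revenue": 120.0},
    {"order_id": 2, "region": "APAC", "revenue": 75.5},
    {"order_id": 3, "region": "EMEA", "revenue": 210.0},
]

# Column-oriented layout: each column is its own contiguous array,
# so the same aggregate scans only the "revenue" column.
columns = {
    "order_id": [1, 2, 3],
    "region": ["EMEA", "APAC", "EMEA"],
    "revenue": [120.0, 75.5, 210.0],
}

row_total = sum(r["revenue"] for r in rows)  # trawls the whole table
col_total = sum(columns["revenue"])          # reads one column only

assert row_total == col_total == 405.5
```

With three rows the difference is invisible, but at billions of rows reading one column instead of the full table is what makes the queries so much quicker.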

Massively parallel processing

Massively parallel processing can supercharge your analytics queries too. It does this by running many processes simultaneously, with each one focusing on a single job. In short, it means you’ll get your insights a lot faster and be able to do more with them. For example, you’ll have the ability to look at data at a more granular level – as one data team in the health and beauty sector is doing to better understand customer behaviour.
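The principle can be sketched with Python’s standard library (a toy stand-in, not an MPP database engine): split the data into partitions, let worker processes each handle one job at the same time, then combine the partial results.

```python
from multiprocessing import Pool


def partial_sum(partition):
    """Each worker focuses on a single job: summing its own partition."""
    return sum(partition)


if __name__ == "__main__":
    data = list(range(1_000_000))                 # stand-in for a large fact table
    partitions = [data[i::4] for i in range(4)]   # split work into 4 partitions

    # Four worker processes run simultaneously, one partition each.
    with Pool(processes=4) as pool:
        partials = pool.map(partial_sum, partitions)

    total = sum(partials)                         # combine the partial results
    assert total == 499_999_500_000
```

Real MPP databases apply the same divide-and-combine pattern across many CPU cores or nodes, which is why queries over huge tables come back so much faster.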

In-memory analytics

In-memory processing enables you to analyse data directly in the computer’s memory (RAM). This means you won’t be slowed down by the latency of data being read from disk or passed between other programs in your infrastructure. It’s how Piedmont Healthcare were able to cut the processing time for their self-service hospital billing from 6 hours to 4 minutes.
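As a small, hedged illustration of the idea (using SQLite from Python’s standard library, not Exasol, and with made-up billing figures), an in-memory database keeps its tables entirely in RAM, so queries never wait on disk reads:

```python
import sqlite3

# ":memory:" means the whole database lives in RAM, not on disk.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE billing (account TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO billing VALUES (?, ?)",
    [("A-100", 250.0), ("A-100", 125.0), ("B-200", 90.0)],
)

# The aggregation runs against data already held in memory.
total_by_account = conn.execute(
    "SELECT account, SUM(amount) FROM billing GROUP BY account ORDER BY account"
).fetchall()

print(total_by_account)  # [('A-100', 375.0), ('B-200', 90.0)]
conn.close()
```

At production scale, keeping the working set in RAM is what turns multi-hour batch jobs into queries that finish in minutes or seconds.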

Data migration – want to know more?

Read our full report, “Accelerating business analytics performance in 2021”, where we bring all our findings together, including:

  • an in-depth guide to accelerating your analytics without re-engineering your legacy systems.
  • a look ahead to the future of data infrastructure.
  • a spotlight on how in-memory processing is the secret sauce for Domino’s data team.