The story of the “invention” of Big Data and a lesson from History
I’m currently writing my script for the Dataconomy event in London in two weeks’ time.
Typically, I’ve assembled far more information than I need for a twenty-minute talk, so instead of talking really quickly and blitzing through all the slides, I’ve decided to make some cuts.
My topic is “Big Data – So What?”, a title I blatantly stole from our new Marketing chief, Sean Jackson. It immediately appealed to me, because so much of what is written about Big Data is Big Blah Blah – all about the zettabytes of web data that rocket scientists are putting to use at companies whose only product is more data. It can seem like something alien to people who work in traditional industries with real products and face-to-face customers.
This is a shame, because Big Data, as well as having the potential to create new industries and turn old ones inside out, can also help real-life companies reap tangible returns. Every company wants to serve its customers better, run more efficiently and find more customers. These goals have been around a long time; it’s just that now there’s a new tool available.
Anyway, I only have twenty minutes for my talk, and one of the topics I have chosen to drop is the story of the first use of the term “Big Data” – but I believe it’s a story worth telling.
The “invention” of Big Data
The earliest use, in my opinion, of the phrase “Big Data” with its current meaning dates back only to the 1990s and John Mashey, who was at the time Chief Scientist of Silicon Graphics.
Silicon Graphics was a very interesting company – between 1995 and 2002, every film even nominated for a Visual Effects Oscar was produced on SGI workstations. This is what they were best known for, but in fact a far bigger earner for them was government and defence contracts related to the processing of image files.
John Mashey wrote about the increase in data volumes that was going to occur once people started processing images and video and other types of non-text data. Such processing required a new way of storing the vast amounts of unstructured data and new ways of processing it, and this is what he described as “Big Data”.
One of his quotations, from a 1998 presentation, pretty much sums up the state of Big Data – and it is as true today as it was in the 1990s:
“(Big Data is) NOT technology for technology’s sake – IT’S WHAT YOU DO WITH IT. But if you don’t understand the trends, IT’S WHAT IT WILL DO TO YOU.”
A lesson from history
Ironically, Silicon Graphics were themselves undone by an inability to understand trends. They failed to react to advances in hardware and software which soon made general-purpose Windows and Linux computers capable of the kind of image processing that had previously required expensive specialist workstations. Image processing soon became a mainstream commodity, and by 2005 the company had to delist from the NYSE; by 2009 it was defunct.
I can’t help thinking that by using Big Data themselves, they could have seen what was coming and reacted to it in time. I think there’s a lesson there for us all.