Data is changing our lives
You may not realise it, but your life experiences are being curated at an ever-increasing pace.
Every day, your gadgets and club cards are collecting unprecedented amounts of data relating to your behaviour, preferences and needs. That data is processed and informs the types of gadgets being invented. The Internet of Things will see astronomical growth in data volumes; we will enter the world of zettabytes per year. This data is set to generate billions of pounds, and it is changing the way we live.
Here are just a few ways data is changing your life.
Gene-based precision healthcare.
Insurance companies are identifying ways to generate streams of data that help them make better pricing decisions and encourage their policyholders to make smarter decisions.
One such provider is Oscar, a US-based health insurer, which provides its clients with a free wearable fitness tracker. This device allows Oscar to see which policyholders prefer the couch to the gym, and enables it to offer monetary incentives to encourage customers to hit the treadmill.
Of course, you already know that your wearable device is tracking every part of your day: your health, workout habits, food, sleep patterns, music tastes, weight – and even who your family and friends are. This will only progress with mHealth: mobile technology, such as smartphones, that makes for a cheaper, more efficient service. Push Doctor, for instance, says you can be diagnosed in five minutes by a GP over your phone.
It’s not just your habits that are being recorded. The NHS is at the forefront of genomics, looking to build a massive database of our DNA in order to give us precision meds. Meanwhile, the Global Oncology Big Data Alliance (GOBDA) leverages big data analytics to optimise clinical trials, building a registry of data and helping to advance the understanding of cancer treatment globally.
Entertain Beyoncé in your living room.
Imagine interacting with your favourite celebrity in your sitting room. It’s a fantasy people have had for decades – and the vision is moving closer to reality. Algorithms are translating data, such as viewing figures, into more personalised entertainment experiences. For instance, when gaming became the most-watched category on live-streaming service Justin.tv, a new company was launched: Twitch.tv.
Fox Entertainment already uses data crunched by IBM’s supercomputer Watson to make the optimal film trailer, while start-ups such as Screen use real-time big-data analysis to map the user experience of every customer. And companies like Stremio provide an online personal entertainment library synced across all devices, incorporating services such as Netflix, Spotify and Apple.
Sensors on wearable devices will collect data on your mood. What you watch will be tailored to your location, friends, preferences and emotional state as the device records your heart rate. And, as you sit in your living room watching your favourite sport, you may even be drinking popcorn-flavoured Coke. Yes, the soft-drinks company’s analytics mean personalised flavours are in the pipeline.
Bringing the emotion back into shopping.
Data mining from loyalty cards is changing the way we shop. Supermarkets, for instance, now know when we shop, how we will pay, and even how many calories we consume. Research shows that many people now avoid the queues and do their Christmas shopping online. But for those who prefer to hit the shops, MIT Media Lab is researching affective computing, including projects that let shoppers give emotional feedback during the shopping experience, changing the music in each aisle based on each person’s mood.
Brands will send even more personalised advertising, from suggesting a sports drink after you finish your workout, to the perfect milk substitute to stir into your coffee when you wake up. Services like Apple Pay will ensure that, with a swipe of the finger, paying for a commodity is as simple as sending a message.
These alternatives will allow customers to more seamlessly link payment directly to their bank accounts and provide users with advice and insights on their spending patterns.
Responsibility for authorising the transaction itself will also shift, through greater use of biometrics such as fingerprints, vein recognition, voice or even selfies.
Always know when your partner’s in a huff.
Sites such as OkCupid and eHarmony are already analysing your data, so they know everything from what constitutes the perfect profile picture to the right language to use. Dating advice is informed by these sites – and tech is being created to improve the dating experience further.
For instance, lost for words on a first date? Computers could potentially feed conversation starters and live dating advice into our brains. Worried about meeting a stranger alone? Why not enjoy that date in the comfort and safety of your home, where you can see them – and even hold their hand – using haptic technology.
Real-time AI would then analyse video data at high speed, providing users with instant feedback about how their date is going. Sensory technology could even tell you if your date had bad breath. We may eventually rely on big data to make better long-term relationship decisions: who to date, when to get married, how many children to have.
One report imagines that ‘telepathic computers’ could one day also predict a partner’s behaviour before it happens by studying blood-flow patterns in the brain. That information could help people decide whether to ditch a partner or keep dating them.
Relax on a beach – in your car.
The future of all this is smart cities. By the 2020s, it is likely you will return each evening to a warm home, with your favourite movie playing and the lighting perfectly dimmed for a relaxing night in.
Assistants like Siri and Alexa will rapidly evolve beyond voice to include audio-visual and even holographic features. You may even have a virtual butler called Jarvis – Mark Zuckerberg already has one, an AI agent that controls his home via voice commands.
And while you could take a driverless car for a road trip, or even a commute, you could also put on a headset and be transported to a beach in Hawaii from anywhere. You would be able to smell the ocean, feel the light tropical breeze, and enjoy the sand between your toes.
Google, for instance, can analyse your data to know you’re using your mobile in the car, and uses neural networks to create virtual environments. Using vibrations, HapticWave can layer haptic feedback on top of VR experiences to simulate the weight, volume and direction of virtual objects.
And if you take the Hyperloop to travel long distances quickly, you could sit at your ‘window’, immersed in a virtual world, as the company is looking to transform its windows into video screens.
They say you get the media you deserve. It is hard to predict the future, but whatever transpires, it will be based on what the data defines as the needs and wants of the majority.
How do you feel about data collection? Which projects are you most excited about? Do you think data collection is going too far?
Rhian Morgan is a professional technology journalist and editor, who has worked with various media and business organizations. Her interests include AI, and the changing face of society in the wake of new tech. She is a tech writer for METRO.