Thu, 17 May 2018 11:17:48 GMT
Ninety per cent of all the data in existence was created in the last two years.
Worldwide, 2.5 billion gigabytes of data are created every day. In that same 24 hours, Twitter churns out 500 million messages and Netflix users stream 140 million hours of content.
It’s not news that we live in a world of Big Data. But our rush to ride the wave of technological innovation may presage a more sinister phenomenon: the tumble into a society of surveillance.
(Don’t worry – I won’t start dropping Orwellian clichés here…)
States don’t need to watch us covertly. We routinely, voluntarily surrender personal data in exchange for services that make our lives incrementally easier. We haven’t made it hard for them. And it’s easy to lose track of who knows everything about you – and how they can use it.
The recent popularity of data-dystopia dramas, such as Netflix’s Black Mirror, shows that many people are starting to feel uneasy about the consequences. And when we have the likes of Elon Musk and the late Stephen Hawking warning us about the potentially hazardous consequences of AI, those concerns feel justified.
The most recent series of Black Mirror tapped into this collective anxiety in characteristically disturbing style.
The episode ‘Crocodile’ depicts a world in which the latest form of surveillance occurs directly inside people’s minds. A device placed on the temple grants unfiltered access to a person’s memories. This enables police and insurance adjusters to construct pixel-perfect accounts of crime scenes from a composite of witnesses’ memories.
Great, right? Questionable…
(Spoiler alert) This supposedly benign invention gives rise to significant abuses of privacy, prolific criminality and personal ruin.
And yet the device in ‘Crocodile’ is merely the logical extension of many current digital services. What is Facebook if not your digital memory bank? Let’s do away with the interface and tap directly into the source!
So, at what point is this a breach of privacy?
The problem is that it’s fairly subjective. Which is what makes it so hard to hit the right note when our industry begins mining data for commercial purposes.
Broadly, it comes down to a sense of agency, feeling in control. I permit Facebook to use my data because I curate it. The moment you remove my ability to choose, you rob me of my privacy. And that’s creepy.
Even a handful of ‘likes’ can be used to automatically predict a range of highly sensitive personal attributes: your sexual orientation, political views, even your intelligence and happiness levels.
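The mechanics behind that claim are surprisingly simple. The toy sketch below fits a logistic regression that scores a made-up binary attribute from binary ‘like’ vectors – everything here (the pages, the users, the attribute) is invented purely to show how such inference works, not how any real company’s model does:

```python
import math

# Toy illustration: each user is a binary vector over 6 hypothetical
# pages (1 = liked). We fit a logistic regression to predict an
# entirely invented binary attribute from those likes.
users = [
    [1, 1, 0, 0, 1, 0],
    [1, 0, 1, 0, 1, 0],
    [0, 0, 1, 1, 0, 1],
    [0, 1, 0, 1, 0, 1],
    [1, 1, 1, 0, 0, 0],
    [0, 0, 0, 1, 1, 1],
]
labels = [1, 1, 0, 0, 1, 0]  # the made-up attribute

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def predict(w, b, x):
    # Probability that the attribute is 1, given like-vector x
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

# Train with plain stochastic gradient descent on log loss
w, b, lr = [0.0] * 6, 0.0, 0.5
for _ in range(2000):
    for x, y in zip(users, labels):
        err = predict(w, b, x) - y
        w = [wi - lr * err * xi for wi, xi in zip(w, x)]
        b -= lr * err

# The fitted model now scores like-vectors it has never seen
new_user = [1, 1, 0, 0, 0, 0]
print(round(predict(w, b, new_user), 2))
```

The point is not the algorithm’s sophistication but that likes you thought were trivia become features, and the attribute being predicted is something you never chose to disclose.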
It doesn’t take Black Mirror levels of pessimism to see the risks.
For a digital agency, data is intrinsic to creative output. But with GDPR coming into effect in May 2018, and the penalty for non-compliance being up to €20 million or 4% of a company’s annual turnover – whichever is greater – there are some challenges ahead.
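To make that penalty concrete: the cap is the greater of the two figures, so for large businesses the 4% rule dominates. A minimal sketch (the function name and turnover figure are illustrative, not from any regulatory text):

```python
def max_gdpr_fine(annual_turnover_eur: float) -> float:
    """Upper bound of a GDPR fine: the greater of EUR 20 million
    or 4% of annual turnover (illustrative helper)."""
    return max(20_000_000, 0.04 * annual_turnover_eur)

# For a hypothetical company turning over EUR 1 billion a year,
# the cap is 4% of turnover, i.e. EUR 40 million:
print(max_gdpr_fine(1_000_000_000))  # → 40000000.0
```

A company with €100 million turnover would instead hit the flat €20 million ceiling, since 4% of its turnover is only €4 million.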
Multinational media companies need to work out how to positively position data capture. And, in the wake of the Cambridge Analytica scandal, with trust at an all-time low, they’re working hard to reassure an increasingly sceptical customer base.
Our agency approach to GDPR has been to help clients embrace its parameters rather than fear them, as a way to build genuine customer trust and achieve more meaningful engagement.
That seems like common sense in a world where media giants increasingly use sophisticated inference-based algorithms to mine information we might not believe we have consented to.
After all, we may have become used to carrying a tracking device in the form of our smartphones, but most of us still draw our curtains when we get home and wouldn’t live-stream the contents of our bedroom. We need control.
It’s time to think further than just next week’s KPIs. Businesses need to approach data with a moral compass. Let’s adopt a golden rule for data: treat other people’s data as you would like yours to be treated.
The onus to protect customer data now sits squarely with service providers, with a high penalty for failure. That aligns business interests with protecting the individual. And with people increasingly security-savvy, businesses that lose trust lose their most important asset: the customer.
We should view the new data protection laws not as a constraint but as an opportunity – a new brief for customer-centricity.
Our industry pivots on Big Data. But in the words of Uncle Ben Parker: with great power comes great responsibility. Nowadays, we can add ‘great scrutiny’, too.
Alec Barr is User Experience Architect at Proximity Worldwide