Article by: Robert Brown, AVP, Cognizant’s Center of the Future of Work
Welcome to the brave new world of GDPR, which came into effect on May 25, 2018. For weeks now, inboxes have been brimming with notices from companies that, like a spurned lover, beg people: ‘please come back! We miss you!’ News reporting of the great ‘privacy watershed moment’ even varied its perspective by country.
Media outlets in the UK largely decried the ‘spamming by companies to get people to accept new terms and conditions’, whereas in France, companies were portrayed as simply sending e-mails with privacy policies that had been updated with attendant links to ‘learn more about it’. Meanwhile, tech giants like Facebook and Google faced immediate legal filings over perceptions of ‘forced consent’.
As GDPR comes into effect, the first phase of the Internet – the Wild West days – comes to an end, and it is the perfect backdrop to discuss the power of disruption, with ALL of its positive and negative consequences.
At the root of all this is of course the even braver new world of the future of our privacy – in Europe, China, America and elsewhere, the new machines of the digital age reign. There is no question that the age of algorithms, automation and AI has resulted in great leaps forward for humanity in terms of personal recommendations, customised experiences and lightning-fast convenience. All at the cost of sharing our personal information.
Today’s digital age is the proverbial double-edged sword and our privacy is increasingly the hilt of that blade. With every click, like and swipe we make online, our interests, preferences and intent are revealed and captured. The ubiquity of location-based sensors, facial recognition and social and mobile computing has made consumers the subject of vast and lucrative analysis for companies every day. As The Police sang in the ‘80s, ‘every move you make’ in the online world is visible not only to those we trust, but also to those we do not even know exist.
The restoration of the sovereignty of data privacy
Revelations like Cambridge Analytica’s exploitation of Facebook users’ data – without those users’ direct consent – have brought this under a white-hot heat lamp of scrutiny. Questions continue to be asked in Brussels, the House of Commons and on Capitol Hill. After 25 years of regulation-light experimentation with the ‘information superhighway’, policy makers are beginning to lay down serious ‘rules of the road’ pertaining to data privacy, largely spurred by what the EU has put in place over the past few years as a policy pacesetter, culminating with GDPR.
Regulation for privacy is a subplot to the great story of our time that is AI, but it runs the risk of side-tracking the excitement of possibilities in the digital age.
When it comes to bolstering privacy and trust in the Age of the Algorithm, here is how we begin the restoration. The following is a strategic list for all companies of six critical actions – three things to start doing and three things to stop doing – to help data privacy flourish in the digital age.
Start:
1. Innovating new roles like the Chief Trust Officer at the executive level. Trust is an amorphous concept for which every employee of an organisation has implicit – but not explicit – responsibility. This must change. Trust is now a competitive factor for every business. A Chief Trust Officer (reporting directly to the CEO and a peer to the CFO and general counsel) should work closely with data protection officers (now mandated by GDPR) to oversee privacy and customer advocacy, thus ensuring digital innovations thrive. They’ll certify that monetisation of data conforms to ethical guidelines and key performance indicators.
2. Promoting public policy that rewards good privacy ethics. The closer you are to the debate – even if it means squirming through testimony in Brussels, Bern, Berlin or Westminster – the more influence you can have on the future.
3. Ensuring privacy protection initiatives for metadata. Submitted customer data (e.g., comments, pictures, etc.) – and the ability to edit or delete it – is one thing. But it’s customers’ metadata (or ‘contextual data’ in the PII parlance of GDPR) that’s the bigger deal. We’re already seeing moves from players like Facebook to establish a ‘clear history’ feature – somewhat like an angioplasty for customers’ digital footprints.
Stop:
1. Taking things like ethics for granted. While ‘move fast and break things’ sounded great a few years ago, the tide has undeniably turned. The days of the ‘data debutantes’ are over, since the consequence of betting the brand on questionable use of data is the disappearance of customers. As the backlash grows, there’s a very real possibility that new jobs of the future, like personal data brokers, will emerge to help customers manage the monetisation of their own data.
2. Thinking of GDPR as the enemy. The absence of trust is antitrust and your mindset needs to embrace one simple fact: love it or hate it, GDPR regulation is your new best friend. Legislative sea changes of this type could be the raw fuel that impels business success in the future.
3. Overreacting. Of course, corrections and pivots on the road to the future of privacy will be natural. That does not mean innovation is over; let ethics (and the law) help your organisation walk the line between leading edge and bleeding edge. Capitulating to fear and shutting down digital innovation is the worst thing any organisation can do.
While the fundamentals of these questions have always been with us, the future now rests on how we treat and manage data. The long view of the future of privacy is that corporate leaders, companies and countries that do this successfully – through ethics, responsible practices and yes, healthy regulation like GDPR – will participate in a new golden age of digital practice.