‘We have a collective responsibility to get privacy right from day one’

Co-founder of the Privacy Compliance Hub, Karima Noren, reflects on the ethical obligation to prioritise privacy right at the beginning of a product or service lifecycle.

By Karima Noren

Co-founder of the Privacy Compliance Hub

May 2021

When Tristan Harris worked at Google in Silicon Valley, he became disillusioned enough to write a 144-page manifesto: A Call to Minimize Distraction and Respect Users’ Attention. “Never before in history have the decisions of a handful of designers (mostly men, white, living in San Francisco, aged 25-35) working at three companies [Google, Apple and Facebook] had so much impact on how millions of people around the world spend their attention,” he wrote. “We should feel an enormous responsibility to get this right.”

Most entrepreneurs start with an idea or a problem they think they can solve. They draft in engineers to build a product. They build a significant base of customers and clients and make plans for future growth, investment, and new features. Data has been collected at every stage. But no one has sat down and had a conversation about privacy. And that conversation should be happening right at the very beginning.

Why? Because privacy really matters. It matters to each and every one of us.

Data flows like water

In her book Privacy is Power, Oxford University philosopher Carissa Véliz explores the philosophical underpinnings of privacy and why it matters not only for individuals but also for society. She describes it as “absurd” that technology companies have the potential to know our political leanings, who our family and friends are, what we eat, what we drink, our sexual orientation, and more, and then to sell that data to brokers or to companies that use it to decide whether we get a loan, a house, insurance, or a job.

The dystopian potential for this kind of data sprawl is very real. “It’s very hard to stay in control of data. Data flows like water,” she said at a recent Wired event. “We need to ban the sale and sharing of personal data by companies. The advantages are few and the disadvantages are huge.” 

Recent scandals involving Cambridge Analytica and the A level exam results debacle, along with debates around contact tracing via the Covid-19 app, have forced some of these conversations into the public domain. People are more aware than they’ve ever been of the importance of privacy, and are increasingly looking to support companies that take the issue seriously.

It’s not about having something to hide, or about well-crafted terms and conditions promising that data will be used responsibly today. It’s about what that information could be used for in the future. You might think you are happy to share your DNA with 23andMe, for example, but by sharing your DNA you are also sharing information about your children. And the reality is that you have no idea how that information might be used in 10 years’ time.

The need to build well

So how can we ensure privacy is protected? 

At a basic level, we need a framework, a checklist, and people from all departments who are focused on this issue. Someone needs to look ahead and ask: what are the implications of what we’re building, in the short, medium and long term? If Facebook had put the right people in a room together to think this through, I’m sure the risk of fake news and misinformation would have been identified. Perhaps that would have led to protections being put in place ahead of time.

Innovators have a responsibility to take these factors into consideration. To build well in the first instance, rather than reactively try to mend the dam after it has sprung a leak. To incorporate ethics assessments into every strategy, continuously, and ask: what is fair, and what is the right thing to do? Rather than proceed with a “collect first, ask questions later” approach whose consequences cannot be undone.

Privacy by design demands up-front rigour and regular reviews – from research and conception to design, development, testing, and implementation. Tools such as the Data Ethics Canvas from the Open Data Institute give us a place to start having this conversation. 

A climate change-sized problem

Apple CEO Tim Cook has likened privacy to climate change, calling them the top two issues of the century. “We’ve got climate change – that is huge. We’ve got privacy – that is huge … [we need] to decide how we can make these things better and how we can leave something for the next generation that is a lot better than our current situation,” he said in a recent interview with Fast Company.

“I try to get somebody to think about what happens in a world where you know you’re being surveilled all the time,” he added about people who don’t care about sharing data. “What changes do you then make in your own behaviour? What do you do less of? What do you not do anymore? What are you not as curious about anymore? … that kind of world is not a world that any of us should aspire to.” 

Legislation is the first step in encouraging companies to adopt ethical data practices, but it can’t be the whole journey. Just as climate change can’t be fought by anyone acting alone, the erosion of privacy can only be reversed if we work together. We can preserve privacy if we choose to act now.

If we all did our part in building well and caring about data privacy, the sum of all of those efforts would be incredibly powerful. This isn’t about numbers on a spreadsheet, or analytics on a screen. It’s about people. And the kind of future we want to create.
