Imagine you stumble across an auction. There’s an auctioneer with a gavel, standing on a stage. There are bidders, holding up their paddles to indicate their interest as the price jumps higher and higher. And there are lots for sale – the emails you’ve opened and read, your location data, all of your contacts, your recent transactions and browsing history. “It’s not creepy, it’s commerce!” the delighted auctioneer shouts.   

This week I was asked to speak to a group of computer science students at Imperial College London about the importance of privacy. Many started from the belief that they didn’t mind data tracking because they had nothing to hide. But the above scenario, as dramatised by Apple in its latest advertising campaign, shows just how invasive technology can be if we don’t keep privacy front of mind.

I’ve written before about why everyone working in technology has an ethical obligation to prioritise privacy. But sadly that responsibility isn’t always recognised. As Tim Cook, CEO of Apple, pointed out last year on Data Privacy Day: “An interconnected ecosystem of companies and data brokers, purveyors of fake news and peddlers of division, of trackers and hucksters just looking to make a quick buck is more present in our lives than it has ever been. [And] it has never been so clear how it degrades our fundamental right to privacy first and our social fabric by consequence.”

“If we accept as normal and unavoidable that everything in our lives can be aggregated and sold, then we lose so much more than data,” he added. “We lose the freedom to be human.” 

More thoughtful systems

Apple is one technology giant that has moved to champion privacy, via its new enhanced policies of data minimisation, user control and on-device processing. But it’s not the only one banging the privacy drum – Mozilla has led on this issue for years, DuckDuckGo markets itself as a privacy-first alternative to Google, and Twitter has also revamped its privacy policy and launched the Twitter Data Dash game to teach users how to take charge of their personal information. It should, however, be noted that this came after the US Federal Trade Commission fined the social media company $150m for using millions of users’ phone numbers and email addresses to target them with ads. Twitter had originally obtained that information for security purposes.

In speaking to Imperial’s second-year students, I hoped to share with the next generation of software engineers, UX designers and startup co-founders why building well matters – why the systems they will come to create must achieve privacy by design and by default, not just because it’s the law under the GDPR but because it’s the right thing to do. During the lecture, I was heartened by their interest in privacy as a subject. They had a lot of insightful questions about what’s defined as personal information under the GDPR, and whether scraping data already in the public domain is permissible in order to build machine learning systems.

This notion of building more thoughtful systems is increasingly becoming a core part of technology-focused degrees in the UK and in the US. At the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS), for example, one programme teaches the disciplines of computer science alongside philosophy. Students learn to identify and think through ethical issues, explain their reasoning for taking a specific action, and design systems that reflect human values. In other words, they learn to ask not ‘can I build it?’ but rather ‘should I build it, and if so, how?’

Confidence to challenge the status quo

Barbara Grosz, Higgins Research Professor of Natural Sciences at SEAS and the creator of the programme, said when it was launched in 2019: “What we need is for enough students to learn to use ethical thinking during design to make a difference in the world and to start changing the way computing technology company leaders, systems designers and programmers think about what they’re doing.”

It’s a different way of looking at the world for technologists who are more used to finding solutions to solvable technical problems, she added. Instead, they have to handle technical and ethical challenges – such as moral responsibility and respect for human rights – hand in hand. But hopefully, having these sorts of conversations will give students the confidence to stand up once they’re in the workplace and say, ‘this isn’t right, and here’s why’. 

Empowering computer science students in this way makes a lot of sense, because what is needed is a cultural change. If the next generation of software engineers understand privacy, they will care about it. And if they care about it, they will build well. As one of the co-founders of the Privacy Compliance Hub, I’m passionate about building technology that enhances humanity. And I’m proud to do my bit to enhance the discussions students are having about privacy today, in anticipation of the technology problems they will solve tomorrow.

Karima is the co-founder of the Privacy Compliance Hub and a former Head of Legal for Google in Emerging Markets. Karima is a multilingual technology lawyer, a renowned and engaging speaker and a serial entrepreneur with a talent for client relationships and making the complicated appear simple.