It’s been a bad few weeks for TikTok. First, the Irish regulator announced it was submitting a draft decision to Europe’s other data protection watchdogs after investigating how the social media platform has been processing children’s information. Under the GDPR, those regulators will now jointly decide whether to fine TikTok and, if so, how large the fine should be.
TikTok didn’t have long to wait to hear it would be fined, and it was another regulator bearing the news. Last week, the UK’s Information Commissioner’s Office (ICO) independently sent a notice of its intent to fine TikTok up to £27m. The ICO believes the company may have processed the data of children under the age of 13 without appropriate parental consent, among other breaches of data protection law.
“We all want children to be able to learn and experience the digital world, but with proper data privacy protections,” Information Commissioner John Edwards explained on LinkedIn. “Companies providing digital services have a legal duty to put those protections in place, but our provisional view is that TikTok fell short of meeting that requirement.”
TikTok isn’t the only company in hot water over how it treats children’s privacy. Meta was fined €405m by the Irish data protection authority because Instagram allowed teenagers to set up accounts that publicly displayed their email addresses and phone numbers. It’s the second-largest fine ever handed down under the GDPR.
An estimated one in five internet users in the UK are children. According to Ofcom, almost all (98%) children aged three to 15 go online, spending just over two hours online per day. And though technology companies are legally obliged to treat underage users’ personal data with particular care, some fail to do so. Indeed, Ofcom found that more than half (58%) of 12- to 15-year-olds expressed concern about data and privacy. After conducting her own research, the previous Information Commissioner, Elizabeth Denham, told The Guardian: “they are using an internet that was not designed for them … we hear children describing data practices as ‘nosy’, ‘rude’ and a ‘bit freaky’.”
Denham was interviewed to mark the introduction of the Age Appropriate Design Code, which came into effect in the UK in September 2021. Also known as the Children’s Code, it translates the UK GDPR into 15 design standards for children’s online services and requires those services to take the “best interests” of the child into account. Services that fail to comply risk fines of up to 4% of their annual global turnover. The code is considered the international gold standard for the protection of children’s personal information, and served as inspiration for California’s Age Appropriate Design Code Act, which will come into effect on 1 July 2024. A senator in New York hopes to introduce similar legislation there.
To comply with the code (and the GDPR) in the UK, technology companies must either prove their service is not likely to be used by under-18s at all, or make their entire offering compatible with the code. Alternatively, they can take steps to identify younger users and treat them appropriately depending on which age bracket they fall into. The understanding of a 17-year-old is very different from that of a six-year-old, for example. The GDPR states: “children merit specific protection with regard to their personal data, as they may be less aware of the risks, consequences and safeguards concerned and their rights in relation to the processing of personal data”.