It’s been a long time coming, but TikTok has finally been found in breach of the European Union’s General Data Protection Regulation (GDPR) in relation to its handling of children’s data. Under the decision issued today by the Irish Data Protection Commission (DPC), the video sharing platform has been reprimanded and fined €345 million (~$379M). It has also been ordered to bring its offending data processing into compliance within three months.
In all, TikTok has been found to have violated eight articles of the GDPR: 5(1)(a); 5(1)(c); 5(1)(f); 24(1); 25(1); 25(2); 12(1); and 13(1)(e). That amounts to breaches of lawfulness, fairness and transparency of data processing; data minimization; data security; responsibility of the controller; data protection by design and default; and the rights of data subjects (including minors) to receive clear communications about data processing and information on the recipients of their personal data. So it’s quite the laundry list of failings.
The decision did not find a breach in relation to the methods TikTok used for age verification, which have been a flash point for the platform with a number of regional regulators. But the Irish watchdog notes the decision does record a violation of Article 24(1) of the GDPR, as it found TikTok did not implement appropriate technical and organisational measures: the company failed to properly consider certain risks posed to under-13s who gained access to the platform, since the default account setting allowed anyone (on or off TikTok) to view social media content posted by those users.
Settings TikTok had implemented at the time were found to have enabled child users to progress through the sign-up process in such a way that their accounts were set to public by default. “This also meant that, for example, videos that were posted to child users’ account were public-by-default, comments were enabled publicly by default, the ‘Duet’ and ‘Stitch’ features were enabled by default,” the DPC notes.
A child’s account could also be “paired” with an unverified non-child user — via a so-called “Family Pairing” feature — but TikTok did not verify whether the user was actually the child user’s parent or guardian. The non-child user could use the feature to enable direct messages for child users above the age of 16 — “thereby making this feature less strict for the child user”, per the DPC’s findings.
Responding to the decision, a TikTok spokesperson sent us this statement:
We respectfully disagree with the decision, particularly the level of the fine imposed. The DPC’s criticisms are focused on features and settings that were in place three years ago, and that we made changes to well before the investigation even began, such as setting all under 16 accounts to private by default.
TikTok also told us it is considering its next steps in light of the sanction. So the platform could seek to file a legal appeal in Ireland.
In a longer response posted to its website, Elaine Fox, TikTok’s head of privacy in Europe, elaborated on measures she said the company took to address safety concerns prior to the DPC’s investigation beginning, such as setting accounts of users aged 13-15 private by default.
She also claimed that in 2021 TikTok became the first (“and remain[s] the only”) major platform to publicly disclose the number of suspected underage accounts it removes. “We publish this in our quarterly Community Guideline Enforcement Reports and during the first three months of 2023, we removed nearly 17 million such accounts globally,” she wrote, adding: “Age assurance is an industry-wide challenge. We will continue to engage with regulators and other experts to identify new solutions that further enhance our efforts to keep underage users off the platform.”
Per the blog post, TikTok has more than 134 million monthly active users across the European Union.
Unsafe by default
The DPC’s enquiry into TikTok’s handling of children’s data focused on a five-month period (July 31, 2020 to December 31, 2020). It looked at whether TikTok complied with its obligations under the GDPR when processing the personal data of child users in the context of certain platform settings (including public-by-default settings and settings associated with the aforementioned “Family Pairing” feature), as well as examining age verification as part of the registration process.
The DPC also looked at “certain” transparency obligations, including how information was provided to child users in relation to default settings.
Its preliminary findings (draft decision) identified slightly fewer breaches of the GDPR than have been confirmed in today’s final decision. But objections were raised to the draft decision by two other authorities (Italy’s DPA and the Berlin authority) and the disagreement was passed to the European Data Protection Board (EDPB) for a binding decision, which agreed there should also be a finding of a breach of the GDPR’s fairness principle. The Board also ordered Ireland to extend the scope of the order to bring processing into compliance so that it covers the remedial work required to address the fairness breach.
The DPC’s final decision was adopted on September 1, 2023 — suggesting TikTok has until the start of December to rectify its GDPR compliance or risk further sanction.
The company’s contention, though, is that it has already fixed the bulk of the issues it’s being sanctioned for today, hence its “particular” objection to the level of the fine.
The UK’s privacy regulator, the ICO, issued its own penalty on TikTok earlier this year — also in relation to its handling of children’s data — handing down a fine of ~$15.7M for breaching the UK’s data protection regime between May 2018 and July 2020, including for failing to prevent an estimated 1.4 million underage users from accessing its platform.
A more sizeable GDPR fine was handed down in the EU on Meta-owned Instagram last year also in relation to data protection violations affecting children. In that case the tech giant was sanctioned €405 million at the end of a DPC enquiry that started back in October 2020.
Sanctions relating to child protection concerns continue to account for some of the biggest penalties handed down by European privacy regulators in recent years. Although the sums involved still remain a ways off the largest GDPR sanction to date: A €1.2BN penalty for Meta’s illegal data transfers.
That may not be much comfort to TikTok, however, given its own data exports remain under investigation in the EU. The DPC’s deputy commissioner, Graham Doyle, told TechCrunch it hopes to be able to submit a draft decision on this second TikTok probe, focused on data transfers, to other regional data protection authorities for review by the end of the year. (A final decision, therefore, should come in 2024 — with the exact timing depending on whether other authorities disagree with Ireland’s preliminary findings.)
The EDPB has been called on to take binding decisions on a number of Ireland-led GDPR investigations into Big Tech since the regulation came into force. In every case the resulting sanctions have been stepped up via the Board’s intervention, sometimes substantially, and often in terms of both the size of the financial penalties issued and the scope of the breach findings.
Pressure to act
The Irish regulator opened the two aforementioned TikTok probes (one into data transfers, the other culminating in today’s decision on the processing of minors’ data) two years ago. The move followed pressure from other EU data protection authorities and consumer protection groups, which had raised concerns about how the platform handles user data generally and children’s information specifically.
Earlier the same year Italy’s data protection authority took emergency action against TikTok over child safety concerns. Its intervention led to the platform rechecking the age of every user in the country and purging over half a million accounts whose holders it could not confirm were aged 13 or over.
Around this time EU consumer protection authorities also raised a series of red flags over privacy and child safety concerns. But it still took several more months before the Irish regulator announced its enquiry.
The sluggish response to child safety concerns arising from kids’ use of TikTok contributed to the DPC’s commissioner, Helen Dixon, being on the receiving end of some hostile questioning by MEPs during a hearing in the European Parliament earlier this year. EU lawmakers also raised wider concerns about the regulator’s approach, wondering whether the Irish regulator is up to the job of enforcing the GDPR against major tech platforms.
Dixon responded with a robust defence of what she described as “busy GDPR enforcement” by the Irish authority. On TikTok specifically, she claimed the DPC is working as fast as it can given the large volume of material being examined.