TikTok hit with consumer, child safety and privacy complaints in Europe
TikTok is facing a fresh round of regulatory scrutiny in Europe, where consumer protection groups have filed a series of coordinated complaints alleging multiple breaches of EU law.
The European Consumer Organisation (BEUC) has lodged a complaint against the video sharing site with the European Commission and the bloc's network of consumer protection authorities, while consumer organisations in 15 countries have alerted their national authorities and urged them to investigate the social media giant's conduct, BEUC said today.
The complaints include claims of unfair terms, including in relation to copyright and TikTok's virtual currency; concerns around the type of content children are being exposed to on the platform; and accusations of misleading data processing and privacy practices.
Details of the alleged breaches are set out in two reports associated with the complaints: one covering issues with TikTok's approach to consumer protection, and another focused on data protection and privacy.
Child safety

Thanks to the amazing @AppCensusInc @Jausl00s & @ValeVerdo, we could assess #TikTok's privacy policy and technical data processing aspects. @beuc notified @EU_EDPB and urges data protection authorities that are investigating #TikTok to act on the findings: https://t.co/4OUeYCnjwU
- Maryant Fernandez (@maryantfp) February 16, 2021
On child safety, the report accuses TikTok of failing to protect children and teenagers from hidden advertising and "potentially harmful" content on its platform.
"TikTok's marketing offers to companies who want to advertise on the app contributes to the proliferation of hidden marketing. Users are for instance triggered to participate in branded hashtag challenges where they are encouraged to create content of specific products. As popular influencers are often the starting point of such challenges the commercial intent is usually masked for users. TikTok is also potentially failing to conduct due diligence when it comes to protecting children from inappropriate content such as videos showing suggestive content which are just a few scrolls away," the BEUC writes in a press release.
TikTok has already faced a regulatory intervention in Italy this year in response to child safety concerns - in that instance following the death of a ten-year-old girl in the country. Local media had reported that the child died of asphyxiation after participating in a 'black out' challenge on TikTok - triggering the emergency intervention by Italy's data protection authority (DPA).
Soon afterwards TikTok agreed to reissue an age gate to verify the age of every user in Italy - although the check merely asks the user to input a date to confirm their age, so it seems trivially easy to circumvent.
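To see why a self-declaratory date check is so weak, consider a minimal sketch (purely illustrative Python, not TikTok's actual implementation) of an age gate that trusts whatever birth date the user enters - the hypothetical passes_age_gate function below can be satisfied by any user simply by typing an earlier date:

```python
from datetime import date
from typing import Optional

MINIMUM_AGE = 13  # the threshold TikTok's privacy policy cites for its main service

def passes_age_gate(claimed_birth_date: date, today: Optional[date] = None) -> bool:
    """Self-declaratory age check: trusts whatever date the user types in.

    Nothing here verifies the claim against an ID document or any other
    signal, so an underage user passes simply by entering an earlier date.
    """
    today = today or date.today()
    had_birthday_this_year = (today.month, today.day) >= (
        claimed_birth_date.month,
        claimed_birth_date.day,
    )
    age = today.year - claimed_birth_date.year - (0 if had_birthday_this_year else 1)
    return age >= MINIMUM_AGE

# A 10-year-old who simply claims a 2000 birth date sails straight through:
print(passes_age_gate(date(2000, 1, 1)))  # True
```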
In the BEUC's report, the consumer rights group draws attention to TikTok's flimsy age gate, writing that: "In practice, it is very easy for underage users to register on the platform as the age verification process is very loose and only self-declaratory."
And while it notes TikTok's privacy policy claims the service is "not directed at children under the age of 13", the report cites a number of studies that found heavy use of TikTok by children under 13 - with BEUC suggesting that children in fact make up a "very big part" of TikTok's user base.
From the report:
In France, 45% of children below 13 have indicated using the app. In the United Kingdom, a 2020 study from the Office for Telecommunications (OFCOM) revealed that 50% of children between eight and 15 upload videos on TikTok at least weekly. In Czech Republic, a 2019 study found out that TikTok is very popular among children aged 11-12. In Norway, a news article reported that 32% of children aged 10-11 used TikTok in 2019. In the United States, The New York Times revealed that more than one-third of daily TikTok users are 14 or younger, and many videos seem to come from children who are below 13. The fact that many underage users are active on the platform does not come as a surprise as recent studies have shown that, on average, a majority of children owns mobile phones earlier and earlier (for example, by the age of seven in the UK).
A recent EU-backed study also found that age checks on popular social media platforms are "basically ineffective" as they can be circumvented by children of all ages simply by lying about their age.
Terms of use

Another issue raised by the complaints centers on a claim of unfair terms of use - including in relation to copyright, with BEUC noting that TikTok's T&Cs give it "an irrevocable right to use, distribute and reproduce the videos published by users, without remuneration".
A virtual currency feature it offers is also highlighted as problematic in consumer rights terms.
TikTok lets users purchase digital coins which they can use to buy virtual gifts for other users (which can in turn be converted by the user back to fiat). But BEUC says its 'Virtual Item Policy' contains "unfair terms and misleading practices" - pointing to how it claims "an absolute right" to modify the exchange rate between the coins and the gifts, "thereby potentially skewing the financial transaction in its own favour".
While TikTok displays the price to buy packs of its virtual coins, there is no clarity over the process it applies for the conversion of these gifts into in-app diamonds (which the gift-receiving user can choose to redeem for actual money, remitted to them via PayPal or another third-party payment processing tool).
"The amount of the final monetary compensation that is ultimately earned by the content provider remains obscure," BEUC writes in the report, adding: "According to TikTok, the compensation is calculated based on various factors including 'the number of diamonds that the user has accrued'... TikTok does not indicate how much the app retains when content providers decide to convert their diamonds into cash."
"Playful at a first glance, TikTok's Virtual Item Policy is highly problematic from the point of view of consumer rights," it adds.
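To make concrete why that opacity matters, here is a minimal, purely hypothetical sketch of the coin-to-gift-to-diamond-to-cash flow described above. Every name and number in it (the creator_payout function, the coins-per-diamond rate, the cash value of a diamond, the share the platform retains) is an illustrative placeholder rather than a figure TikTok publishes - which is precisely the complaint: the creator's final payout depends entirely on parameters only the platform controls and can change.

```python
# Hypothetical sketch of the coin -> gift -> diamond -> cash flow described above.
# All rates are illustrative placeholders: TikTok does not publish its conversion
# rates or the share it retains on cash-out, which is the point of the complaint.

def creator_payout(gift_value_in_coins: int,
                   coins_per_diamond: float,
                   cash_per_diamond: float,
                   platform_retention: float) -> float:
    """Cash a creator would receive for one gift, under assumed parameters.

    gift_value_in_coins: coins the sender spent on the gift
    coins_per_diamond:   conversion rate the platform sets (and may change at will)
    cash_per_diamond:    redemption value of a diamond when paid out (e.g. via PayPal)
    platform_retention:  undisclosed fraction the platform keeps on cash-out
    """
    diamonds = gift_value_in_coins / coins_per_diamond
    gross_cash = diamonds * cash_per_diamond
    return gross_cash * (1.0 - platform_retention)

# The same 100-coin gift yields very different payouts depending on three
# numbers only the platform controls (placeholder values shown):
print(creator_payout(100, coins_per_diamond=2.0, cash_per_diamond=0.01,
                     platform_retention=0.5))   # 0.25
print(creator_payout(100, coins_per_diamond=2.0, cash_per_diamond=0.01,
                     platform_retention=0.75))  # 0.125
```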
Privacy
On data protection and privacy, the social media platform is also accused of a whole litany of "misleading" practices - including (again) in relation to children. Here the complaint accuses TikTok of failing to clearly inform users about what personal data is collected, for what purpose and on what legal basis - as is required under Europe's General Data Protection Regulation (GDPR).
Other issues flagged in the report include the lack of any opt-out from personal data being processed for advertising (aka 'forced consent' - something tech giants like Facebook and Google have also been accused of); the lack of explicit consent for processing sensitive personal data (which has special protections under GDPR); and an absence of security and data protection by design.
We've reached out to the Irish Data Protection Commission (DPC), which is TikTok's lead supervisor for data protection issues in the EU, about the complaint and will update this report with any response.
France's data watchdog, the CNIL, already opened an investigation into TikTok last year - prior to the company shifting its regional legal base to Ireland (meaning data protection complaints must now be funnelled through the Irish DPC via the GDPR's one-stop-shop mechanism, adding to the regulatory backlog).
Jef Ausloos, a postdoc researcher who worked on the legal analysis of TikTok's privacy policy for the data protection complaints, told TechCrunch that researchers had been ready to file data protection complaints a year ago - at a time when the platform had no age check at all - but that TikTok suddenly made major changes to how it operates.
Ausloos suggests such sudden, massive shifts are a deliberate tactic to evade regulatory scrutiny of data-exploiting practices, as "constant flux" can have the effect of derailing and/or resetting research work being undertaken to build a case for enforcement. He also points out that resource-strapped regulators may be reluctant to bring cases against companies 'after the fact' (i.e. if they've since changed a practice).
The upshot is that constantly iterating breaches may escape enforcement altogether, even when violations of the law are repeated.
It's a clear strategy of big tech companies that built their business model on data exploitation, to remain in constant flux, rendering it hard/impossible for cases against them to crystallise and gain momentum
- Jef Ausloos (@Jausl00s) February 16, 2021
It's also true that a frequent refrain of platforms at the point of being called out (or called up) on specific business practices is to claim they've since changed how they operate - seeking to use that as a defence to limit the impact of regulatory enforcement or indeed a legal ruling. (Aka: 'Move fast and break regulatory accountability'.)
Nonetheless, Ausloos says the complainants' hope now is that the two years of documentation undertaken on the TikTok case will help DPAs build cases.
Commenting on the complaints in a statement, Monique Goyens, DG of BEUC, said: "In just a few years, TikTok has become one of the most popular social media apps with millions of users across Europe. But TikTok is letting its users down by breaching their rights on a massive scale. We have discovered a whole series of consumer rights infringements and therefore filed a complaint against TikTok.

"Children love TikTok but the company fails to keep them protected. We do not want our youngest ones to be exposed to pervasive hidden advertising and unknowingly turned into billboards when they are just trying to have fun.

"Together with our members - consumer groups from across Europe - we urge authorities to take swift action. They must act now to make sure TikTok is a place where consumers, especially children, can enjoy themselves without being deprived of their rights."
Reached for comment on the complaints, a TikTok spokesperson told us:
"Keeping our community safe, especially our younger users, and complying with the laws where we operate are responsibilities we take incredibly seriously. Every day we work hard to protect our community which is why we have taken a range of major steps, including making all accounts belonging to users under 16 private by default. We've also developed an in-app summary of our Privacy Policy with vocabulary and a tone of voice that makes it easier for teens to understand our approach to privacy. We're always open to hearing how we can improve, and we have contacted BEUC as we would welcome a meeting to listen to their concerns."