Image: A TikTok logo in the foreground; green coding text in the background.

EU/Global: European Commission’s TikTok probe aims to help protect young users

Responding to the European Commission’s decision to investigate TikTok over concerns that the social media platform may be failing to comply with the bloc’s Digital Services Act (DSA) by not doing enough to protect young users, Damini Satija, Programme Director at Amnesty Tech, said:

“We welcome the European Commission’s decision to investigate TikTok over the possibility that it breached the DSA by failing to protect children and young people. The mental health consequences being inflicted on children and young people by the social media giant remain a longstanding concern.

“In 2023, Amnesty International’s research showed that TikTok can draw children’s accounts into dangerous rabbit holes of content that romanticizes self-harm and suicide within an hour of signing up on the platform. Children and young people also felt their TikTok use affected their schoolwork and social time with friends and led them to scroll through their feeds late at night instead of catching enough sleep.

“By design, TikTok aims to maximize engagement, which systemically undermines children’s rights. It is essential that TikTok takes urgent action to address these systemic risks. Children and young users have the right to safe platforms, and protecting that right cannot wait any longer.”

Damini Satija, Programme Director at Amnesty Tech

Background

In 2023, Amnesty International released two reports highlighting abuses suffered by children and young people using TikTok.

One report, Driven into the Darkness: How TikTok Encourages Self-harm and Suicidal Ideation, shows how TikTok’s pursuit of young users’ attention risks exacerbating their mental health concerns, such as depression, anxiety and self-harm.

Another report, “I Feel Exposed”: Caught in TikTok’s Surveillance Web, reveals that TikTok’s rights-abusing data collection practices are sustained by harmful user engagement practices.

Both reports form part of Amnesty International’s broader body of work exposing the business models of big tech firms that prioritize profits over human rights.