Brussels opens probe into TikTok algorithms after MEPs call for investigation
In 2023, MEPs highlighted the risks to privacy from the Chinese-developed social media platform and recommended banning TikTok at all levels of national government and in the EU institutions
The European Commission has opened formal proceedings to assess whether TikTok may have breached the Digital Services Act (DSA) in areas linked to the protection of minors, advertising transparency, data access for researchers, as well as the risk management of addictive design and harmful content.
In 2023, MEPs highlighted the risks of economic dependence, espionage and sabotage when foreign companies acquire influence over EU critical infrastructure. Besides raising concerns about Chinese shipping companies acquiring majority or sizeable stakes in over 20 European ports, MEPs recommended banning TikTok at all levels of national government and in the EU institutions.
On the basis of the preliminary investigation conducted so far, including an analysis of the risk assessment report TikTok submitted in September 2023 and its replies to the Commission’s formal requests for information, the Commission has decided to open formal proceedings against TikTok under the Digital Services Act.
The proceedings will focus on DSA compliance in relation to actual or foreseeable negative effects stemming from the design of TikTok’s system, including algorithmic systems that may stimulate behavioural addictions and/or create so-called ‘rabbit hole effects’. Such an assessment is required to counter potential risks to a person’s physical and mental well-being and to the rights of the child, as well as the system’s impact on radicalisation processes.
Other concerns include compliance with DSA obligations to put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security for minors; a searchable and reliable repository for advertisements presented on TikTok; and measures to increase transparency of its platform.
After the formal opening of proceedings, the Commission will continue to gather evidence, for example by sending additional requests for information, conducting interviews or inspections.
The opening of formal proceedings empowers the Commission to take further enforcement steps, such as interim measures and non-compliance decisions. The Commission is also empowered to accept any commitments made by TikTok to remedy the matters subject to the proceedings.
TikTok was designated as a Very Large Online Platform (VLOP) on 25 April 2023 under the EU’s Digital Services Act, following its declaration of having 135.9 million monthly active users in the EU. As a VLOP, TikTok had four months from its designation to start complying with a series of obligations set out in the DSA.
In 2023, Amnesty International’s research showed that TikTok can draw children’s accounts into dangerous rabbit holes of content romanticising self-harm and suicide within an hour of signing up to the platform.
“Children and young people also felt their TikTok use affected their schoolwork and social time with friends and led them to scroll through their feeds late at night instead of catching enough sleep,” Amnesty said, welcoming the investigation. “By design, TikTok aims to maximize engagement, which systemically undermines children's rights. It is essential that TikTok takes urgent action to address these systemic risks. Children and young users should be offered the right to access safe platforms, and the protection of these rights cannot wait any longer.”
This article is part of a content series called Ewropej. This is a multi-newsroom initiative part-funded by the European Parliament to bring the work of the EP closer to the citizens of Malta and keep them informed about matters that affect their daily lives. This article reflects only the author’s view. The action was co-financed by the European Union in the frame of the European Parliament's grant programme in the field of communication. The European Parliament was not involved in its preparation and is, in no case, responsible for or bound by the information or opinions expressed in the context of this action. In accordance with applicable law, the authors, interviewed people, publishers or programme broadcasters are solely responsible. The European Parliament can also not be held liable for direct or indirect damage that may result from the implementation of the action.