On 24 October 2025, the European Commission announced preliminary findings that TikTok and Meta had breached the Digital Services Act (DSA) by blocking researchers from studying their algorithms across the EU. Regulators fear addictive design, mental-health harms and “rabbit hole effects”. With multibillion-dollar fines looming, the question remains: will anything actually change on your screen?
Three hours of TikTok later, you look up and wonder where your evening went. Brussels says Big Tech designed it that way and wants to stop it.
The Data They Won’t Share
On 24 October 2025, the Commission announced preliminary findings that TikTok and Meta violated DSA transparency rules by blocking researchers from accessing publicly available data. Requests stall, and what data does arrive is incomplete or unreliable, making independent study of algorithmic harms all but impossible. The irony: platforms accused of causing harm now stand accused of hiding the evidence. Potential fines could reach 6% of global annual revenue, about $9.87 billion for Meta and $1.38 billion for TikTok.
What the Law Says Should Happen
Under the DSA and the newer AI Act, platforms must offer non-personalised, chronological feeds and avoid algorithms designed to fuel addiction or push users down “rabbit holes”. They are also required to explain how their recommendation systems work. As Commission Executive Vice-President Henna Virkkunen puts it: “Our democracies depend on trust. That means platforms must empower users, respect their rights, and open their systems to scrutiny. The DSA makes this a duty, not a choice.” The urgency is real: youth suicide rates have risen 62% since 2007, and platforms earn $11 billion annually from users under 18.
When Enforcement Moves Slower Than Algorithms
The problem? Enforcement crawls while algorithms sprint. Fourteen DSA cases are open; none has been resolved. TikTok’s probe began in February 2024; the probe into X started in 2023. TikTok is even challenging the DSA in court. As activist Jan Penfrat warns: “Users need to be protected now and not in a year or two or three from now.” Meanwhile, platforms update their algorithms constantly, so by the time Brussels finishes one investigation, the system under scrutiny has already evolved.
The rules are on paper and the investigations are live, yet users still face feeds shaped by systems that no one outside the companies can inspect.
