The Digital Services Act Is Protecting Children Less Than the Lawyer-Industrial Complex
Let's talk about it.
Child protection is the new black in the EU's digital enforcement strategy. Examples from 2026:
In February, the EU Commission preliminarily found TikTok in breach of the Digital Services Act (DSA) for its addictive design. TikTok had failed to conduct an adequate risk assessment and take proper risk mitigation measures to protect minors (link).
In March, the EU Commission preliminarily found PornHub, Stripchat, XNXX and XVideos in breach of the DSA for allowing minors to access their services (link).
On the same day, the EU Commission announced it would investigate whether Snapchat is complying with the DSA's child protection rules (link).
In April, the EU Commission preliminarily found Meta in breach of the Digital Services Act for failing to prevent minors under 13 from using Instagram and Facebook (link).
How many working hours and how much in resources has the EU spent on each of these investigations? I don't have the courage to imagine it.
What I do know is that each time preliminary findings such as these are announced on LinkedIn, the digital rights community goes wild with applause and self-congratulatory praise. Finally, we are standing up to the evils of Big Tech!
On one hand, it makes sense. Protecting children from addiction, manipulation, and exposure to violence is an honorable pursuit. On the other hand, is that in fact what the EU Commission is doing with these preliminary findings?
I don’t care to investigate it. But I am fairly sure that minors can still access all of these services in the EU, and that the companies have not overhauled the design of their platforms. Each of these preliminary findings will be contested, likely challenged in court, likely appealed, and it will take years and millions of euros in lawyer fees before they are resolved. When all avenues of appeal are finally exhausted and the court issues a final decision, the companies will do the absolute minimum to comply and will likely find some workaround that keeps children on the platforms, so they don’t miss out on ad revenue and social engineering opportunities. Then the process can start over.
Let’s try to understand what is really happening here with an analogy. Your underage son or daughter routinely goes to a friend’s house, where the friend’s parents serve them alcohol. You know this for a fact. You talk to the friend’s parents about it, and they either deny it or promise to stop, yet your child comes home drunk every time they stay at that house.
Strictly speaking, you have two choices. The first is to accept it and not care. The second is to forbid your child from ever going to that friend’s house again. The EU opts for a third option: it sues your child’s friend and the friend’s parents in court. It asks them to fill out a bunch of documentation and paperwork, demands risk assessments about the dangers of serving children alcohol, and blames them for failing to take the right precautions.
The EU has this idea that your child’s friend and the friend’s parents should act responsibly, and expects that they will with enough encouragement and legal pressure. The problem is that they don’t. The friend’s parents strongly believe that children should drink alcohol, and there is nothing you can do about it. By suing them in court and insisting that they act in accordance with your values, you may eventually force the parents to change their behavior out of fear of the police and calls from child protection services. But by that point, your child is of legal drinking age, and the whole debacle won’t matter.
The same principle is true for the EU Commission’s enforcement of the DSA against foreign platforms. The EU doesn’t think children should watch porn or spend hours on TikTok each day. It tries to impose its values on the platforms because it believes, deep down, that the platforms share those values. But they don’t! The platforms want to show your child porn, and they want your child to spend hours every day in infinite scrolling loops. Their whole business model is undermined if children are kept out.
All the paperwork the EU expects the platforms to produce, the risk assessments, safety measures, disclosures of information, etc., is essentially a waste of time. The people who benefit from it are lawyers, who are paid handsomely to conduct years-long, comprehensive investigations that finally confirm what we already knew in the first place. In the end, these investigations serve children and society less than they serve the EU’s lawyer-industrial complex.