This week, the European Commission proposed a new regulation to combat child sexual abuse.

For some, this proposal is an attack on privacy and our democratic values; for others, it merely lays down moderate rules to protect children. Unfortunately, the proposal is not trivial to understand, and estimating its effectiveness isn't easy either.

In this article, we break the proposal down into plain language. We will look at the rules it lays down and how they would affect us.

Summary of the proposal

This is a short summary that omits a lot of details that are less relevant for understanding the overall idea. If you want to learn more about the proposal, you can find the original text here.

The goal

The goal of the proposal is to protect children from sexual abuse on the internet and to prevent the distribution of child sexual abuse material (CSAM). In particular, this means protecting children from grooming and preventing the distribution of child pornography.

Risk assessment

The first obligation for providers is to perform a risk assessment. Providers must estimate how likely it is that their service will be used for online child sexual abuse.

Under the proposal, a provider is essentially any online service that stores and makes information available. In other words, the regulation targets everything from online forums to email and messenger services.

Risk mitigation

Based on the risks identified during the risk assessment, a provider should take reasonable measures to reduce the risk of online child sexual abuse. The proposal places few requirements on such measures, but they must be effective and proportionate to the risk, the size of the provider and the number of its users.

For communication services (e.g. messengers), age verification is suggested as a measure if a risk of solicitation of children has been identified on the platform.

The measures taken must themselves be factored into the risk assessment. Both the assessment and a description of the measures must be transmitted to the responsible authority.

Detection orders

The authorities of the EU and its member states review the assessment and decide whether the remaining risk is high enough to issue a detection order. A detection order forces a provider to use technologies to detect CSAM on its platform. Such technologies must be effective and must not extract more information than necessary. The provider can choose any sufficiently effective technology and must inform users that it is being used.

A detection order is limited in time, and without one, services are not allowed to perform CSAM detection at all.

Reporting

If a provider detects potential online child sexual abuse, it must inform the user concerned and submit a report to the EU Centre, an EU agency established to combat child sexual abuse. The report must include the content as well as information identifying the user, such as the IP address.

How effective is the proposal?

We fully agree that the protection of children on the internet is a longstanding issue that must be addressed in some form. However, having the right goals doesn't automatically make the proposal a good one.

To really understand the impact of the proposal, we have to look at two separate categories: public networks and private communication.

Forums and public networks

In a democracy, it's crucial that we can express our opinions publicly. On platforms like Twitter or Reddit, we can discuss any topic in the open, and everyone can read our posts and comments.

Just like in a public square or marketplace, illegal activities should be reported and addressed quickly. For example, if someone sold hard drives storing child pornography at Trafalgar Square in London, a slow response by the police would be unacceptable.

The same applies to public networks online. It is not acceptable for illegal content to be available on platforms like Reddit or Twitter, where everyone, including children, has access. In most cases, however, good moderation and user reporting will be effective enough. Detection orders that require scanning content before it is published inevitably lead to upload filters, and upload filters produce a lot of false positives.
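To see why, consider how such filters typically match uploads against databases of known material. A minimal Python sketch (the blocklist here is purely hypothetical) shows that exact cryptographic hashes are evaded by changing a single byte:

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of known illegal files.
blocklist = {hashlib.sha256(b"known illegal file").hexdigest()}

original = b"known illegal file"
modified = b"known illegal file."  # the same file with one added byte

print(hashlib.sha256(original).hexdigest() in blocklist)  # True  -> blocked
print(hashlib.sha256(modified).hexdigest() in blocklist)  # False -> slips through
```

Because exact matching is this easy to defeat, deployed filters have to rely on fuzzy perceptual fingerprints instead, and that fuzziness is precisely where false positives on perfectly legal content come from.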

The risk assessments and the encouragement of services to take proactive measures are certainly a good way to improve the current situation. Detection orders, on the other hand, should be treated with caution. It remains to be seen whether the other measures are enough to fight CSAM, and the vague definition of when a detection order may be issued only heightens the concern that these rules could be misused at the discretion of the responsible authority.

Private communication

Privacy is a crucial value in our democracies. It's good that we can keep our thoughts to ourselves and don't need to be transparent about our opinions in every situation. We can discuss personal topics with only our friends and family, certain that nobody else is following the conversation.

The digital equivalent is end-to-end encryption (E2EE): a powerful mechanism to exchange information privately, even if others are listening on the wire. Without encryption, many online services such as online banking wouldn't be possible, because you need an encrypted connection before entering your password. Otherwise, any exchange point between you and the banking server (e.g. your ISP) could read your password.
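To make this concrete, here is a minimal Python sketch, using the `cryptography` package, of what an exchange point on the path sees with and without encryption (the key handling is simplified; a real connection negotiates its keys via a TLS handshake):

```python
from cryptography.fernet import Fernet

# Without encryption, anyone on the path sees the raw bytes:
print(b"password=hunter2")

# With an encrypted channel, the same bytes are opaque to onlookers.
key = Fernet.generate_key()  # in practice, negotiated during the TLS handshake
channel = Fernet(key)
print(channel.encrypt(b"password=hunter2"))  # unreadable ciphertext on the wire
```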

In its current state, the proposal makes it very likely that messenger services such as WhatsApp, Signal or Matrix will be considered risky enough to receive a detection order. Most messengers, however, use E2EE to protect their messages, and a detection order for such a messenger would be devastating.

By definition, E2EE means that only the sender and the recipient can read a message. Any service provider in between can, of course, store and relay the messages in their encrypted state, but it cannot decrypt them. It is simply impossible for the provider to scan end-to-end encrypted messages for CSAM or any other form of illegal content.
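As a rough sketch of how this works in practice, here is public-key E2EE built with the PyNaCl library (names and message are, of course, illustrative). The provider can relay and store the ciphertext, but without a private key from either endpoint it can read nothing:

```python
from nacl.public import PrivateKey, Box

# Each endpoint generates its own key pair; private keys never leave the device.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts for Bob with her private key and Bob's public key.
sender_box = Box(alice_key, bob_key.public_key)
ciphertext = sender_box.encrypt(b"meet at noon")

# The provider relays `ciphertext` and may store it, but it holds no private
# key, so recovering (or scanning) the plaintext is cryptographically impossible.

# Bob decrypts with his private key and Alice's public key.
receiver_box = Box(bob_key, alice_key.public_key)
assert receiver_box.decrypt(ciphertext) == b"meet at noon"
```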

Disabling E2EE would only harm the users of such services, while contributing very little to the fight against child sexual abuse. E2EE doesn't disappear just because the EU wants messengers to disable it: plenty of widely available tools let you manually encrypt your data before sending it through a messenger, as the sketch below shows. Criminals will put in the extra effort to encrypt manually, whereas ordinary users will simply lose their privacy.
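For example, a few lines of Python with the widely used `cryptography` package are enough to encrypt a message with a password shared offline before it ever touches a messenger (the parameters here are illustrative):

```python
import base64, os
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC

def encrypt_with_password(message: bytes, password: bytes) -> tuple[bytes, bytes]:
    """Derive a key from a shared password, encrypt, and return (salt, token)."""
    salt = os.urandom(16)
    kdf = PBKDF2HMAC(algorithm=hashes.SHA256(), length=32, salt=salt,
                     iterations=600_000)
    key = base64.urlsafe_b64encode(kdf.derive(password))
    return salt, Fernet(key).encrypt(message)

salt, token = encrypt_with_password(b"the actual message",
                                    b"a password shared offline")
# `salt` and `token` can travel through any messenger: without the password,
# the service (or any scanner inside it) sees only opaque bytes.
```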

Coming back to our real-world comparison, detection orders for E2EE messengers are like bugging every house and performing regular house searches just because any citizen might do something illegal at home.

Although some in the EU believe that CSAM detection and E2EE can be combined, the two remain contradictory. The proposal in its current form is extremely dangerous for our democracies.

Alternative solutions

It is not enough to complain about the EU's proposal and point out its dangerous side effects. We want to offer alternatives that are less intrusive yet still effective.

Regarding public networks, we think it is reasonable to demand moderation of uploaded content. Users should be able to report illegal content, and platforms should act on reports quickly. Only if that turns out to be insufficient should additional measures be considered.

E2EE messengers, on the other hand, shouldn't be a target of this proposal. Private conversations should remain private, just as they do face to face. It must, however, be possible to report users who share illegal content. Educating both parents and children about the risks of online services is also crucial to addressing grooming: it not only protects children but also helps them report those who groom.

Furthermore, CSAM and grooming are not the only problems. Children face a wide range of inappropriate content when browsing the internet, and we believe they must be protected from violence, pornography and other harmful material. That is why we keep working on our specification as a solution that protects both children and privacy.

Sources

  • The proposal
  • Definition of "hosting provider": Regulation on a Single Market For Digital Services (Digital Services Act) and amending Directive 2000/31/EC, Article 2

We've done our best to summarize the proposal as accurately as possible. If we got something wrong, do not hesitate to contact us!