Basic Law for the digital space? – What will change with the new Digital Services Act?

The new Digital Services Act (DSA) is intended to make the digital space legally more secure and to regulate it in a user-friendly way in line with international standards. Negotiators from the European Parliament and the EU member states reached agreement on this after lengthy negotiations. In essence, the EU is now obliging internet companies to take more effective action against incitement to hatred, violence, and disinformation, so that the digital space can be supervised more strictly and consumers protected more effectively. EU Commission President Ursula von der Leyen predicts, “Our new rules will protect online users, ensure freedom of expression and open up new opportunities for businesses.”

Who does the DSA’s regulations affect?

In principle, the DSA applies to all digital service providers operating in the EU. How strictly a provider is regulated depends on what it uses its users’ personal data for and how great its reach is. Providers are categorized as intermediary services (data is transmitted), hosting services (data is transmitted and stored), and platforms (data is transmitted, stored, processed, and selected). The strictest rules apply to platforms, including social networks such as Facebook or Twitter. Platforms are in turn subdivided by reach: if a platform reaches more than 10% of the people living in the EU (i.e., at least 45 million users), it is subject to particularly strict rules and supervision. This will particularly affect Google (with YouTube), Microsoft, Meta (with Facebook and Instagram), Amazon, LinkedIn, Twitter, and Apple.

What does the Digital Services Act regulate?

Removing illegal content

In accordance with the principle “What is illegal offline should also be illegal online,” the EU Commission wants to curb false information and violent statements while respecting freedom of expression. To this end, digital providers are now to take responsibility for offending content (terrorist propaganda, hate speech, counterfeit products, etc.) and, in the first instance, delete it as soon as they are notified of it. Both users and EU member states can bindingly demand the deletion of certain content; the EU Commission has set 24 hours as a benchmark. In the event of repeated misconduct, digital service providers are also to be allowed to block users. If, on the other hand, a provider deletes content inadmissibly or without sufficient justification, users can challenge the deletion decision and may be entitled to compensation.

Transparency and freedom of choice

Transparency is also set to improve in the digital space. In the future, minors will no longer be shown personalized advertising, and sensitive data such as health information, political views, or sexual orientation may no longer be used for personalized advertising. Subtle digital manipulation through so-called “dark patterns” is also to be restricted; these are design practices that steer users toward certain decisions by emphasizing one of several options. Misleading user interfaces, such as manipulative cookie selection dialogs on websites, are likewise to be largely prohibited.

Provider review

For online marketplaces (e.g., eBay), the new digital law makes it mandatory for operators to vet the sellers offering products on their marketplace in order to prevent the sale of counterfeit products.

Risk analyses

At least once a year, digital providers are also to investigate how their services affect the public. At the request of the EU Commission, companies can be required to conduct such risk analyses outside the annual cycle as well. The analyses are then to be reviewed and evaluated by experts, and measures to improve the security of the digital space can be decided on the basis of the risk assessment. The background to this is how the largest digital corporations have handled such investigations so far: Facebook, for example, found evidence that Instagram was promoting eating disorders among young people, but neither published these findings nor took measures to improve the situation.

Crisis mechanism

Completely new, and controversial until the end of the negotiations, is the introduction of a crisis mechanism that would enable the EU Commission to counter the effects of online manipulation in crisis situations such as war, pandemics, or terrorism. Very large services (such as Facebook, Twitter & Co.) may then be obliged to pass on information to supervisory authorities and experts.

What happens in the event of a violation of the Digital Services Act?

If a company violates the Digital Services Act, for example by failing to delete illegal content, it faces fines of up to 6% of its global annual turnover. What sounds small at first turns out to be a considerable sum: for a company like Facebook, whose parent company Meta generated roughly 118 billion US dollars in revenue in 2021, such a fine could amount to around 7 billion dollars.

How will the new digital law be implemented?

First of all, it will have to be clarified what counts as illegal content. The EU emphasizes that only illegal content should be deleted, not content that is harmful but still covered by the protective umbrella of freedom of expression. The Digital Services Act does not specify how this line is to be drawn in concrete terms. According to general rules, freedom of expression does not protect insults, libel, or slander.

In practice, the following has been established so far: to monitor the implementation of the Digital Services Act, the large digital companies are to grant the EU Commission and researchers access to their data. For smaller companies, the authorities of the country in which the company has its headquarters are to be responsible. For Germany, it has not yet been determined which authority will take on this task; the Federal Network Agency and the state media authorities are possible candidates. In addition, each EU member state is to appoint a Digital Services Coordinator to monitor enforcement of the rules.

Shaping a legally secure digital space is thus likely to be a gradual process of convergence: users and member states flag illegal content and thereby delineate ever more sharply what is meant by “illegal content”, and digital companies will have to respond and adapt accordingly.

A turning point in the history of Internet regulation?

It is unlikely that illegal content will disappear from the digital space. However, with the Digital Services Act, a supranational framework has now been put in place that makes digital corporations more accountable and requires them to take measures to curb incitement to violence and disinformation. One fundamental problem will remain, however: there is still no clear definition of what constitutes illegal content. Until the courts ultimately decide on this, it will initially be up to companies to interpret the boundaries of freedom of expression. The DSA is expected to come into force by January 1, 2024, at the latest, once both the EU Parliament and the Council of EU member states have given their consent. It will then also be necessary to evaluate the law’s practical significance and how effectively it makes the digital space safer.