On 26 October 2023, the UK adopted the Online Safety Act 2023, which introduces new obligations for online platforms to improve user safety by ensuring that illegal and harmful content is monitored and removed. We previously compared the Act in its draft form with the EU Digital Services Act here and will be updating the table soon.

There is news for social media network providers operating in the European Union regarding the prevention of hate speech and hate crime: Austria has enacted a law against hate and crime on social networks, the Communication Platform Act (KoPl-G). Following the model of the German Network Enforcement Act (NetzDG), both laws are intended to make the deletion procedure simpler and more transparent, and to shift responsibility to the social network provider. A unified European law, the Digital Services Act (DSA), could soon replace these national rules.

1. The German Network Enforcement Act

The German Parliament recently passed a law amending the NetzDG, which brings some changes for social network providers. The NetzDG, enacted in Germany in 2017, was the first law in Europe to tackle hate speech and hate crime on social networks (more about the provisions of the NetzDG on our previous blog).

The newest amendment, which was first proposed in April 2020 (more on our previous blog), simplifies the reporting channels for the complaints procedure and adds information obligations for platform operators' half-yearly transparency reports. A direct right to information against the platform operator is to be created in the Telemedia Act (TMG) for victims of illegal content on networks. The amendment to the NetzDG provides that users may request a review of the platform provider's decision to remove or retain reported content and have a right to have removed content restored. This is intended to prevent so-called "overblocking", i.e. the removal of legal content, and to strengthen users' freedom of expression. The network provider is now obligated to obtain comments from the parties concerned and to give individual reasons for each decision. Video-sharing platforms are also subject to the NetzDG according to the new Sec. 3 (e) NetzDG, but only in the case of user-generated videos and broadcasts.

Addressing the detection and removal of illegal content from online platforms is an urgent challenge for today's digital society. So far, however, there is no harmonised and coherent approach across the European Union. On 28 September 2017, the European Commission ("Commission") published a communication titled "Tackling Illegal Content Online – Towards an enhanced responsibility of online platforms" ("Communication"). The Commission calls for a more aligned approach, as this would make the fight against illegal content more effective and would also benefit the development of the Digital Single Market. The Commission stresses that online platforms carry a significant societal responsibility and should therefore decisively step up their efforts to address this problem.

Scope of the Communication

The Communication does not as such change the existing legal framework. Rather, it lays down a set of non-binding guidelines and principles for online platforms to step up the fight against illegal content online in cooperation with national authorities, Member States, and other relevant stakeholders: "It aims to facilitate and intensify the implementation of good practices for preventing, detecting, removing and disabling access to illegal content so as to ensure the effective removal of illegal content, increased transparency and the protection of fundamental rights online. It also aims to provide clarifications to platforms on their liability when they take proactive steps to detect, remove or disable access to illegal content (the so-called 'Good Samaritan' actions)."

The Communication not only targets the detection and removal of illegal content; it also addresses issues arising from the removal of legal content ("Over-Removal"), which may affect freedom of expression and media pluralism. The Commission therefore calls for adequate safeguards to properly prevent Over-Removal.