There is news for social network providers operating in the European Union regarding the prevention of hate speech and crime: Austria has enacted a law against hate and crime on social networks, the Communication Platform Act (KoPl-G). Following the German Network Enforcement Act (NetzDG), both laws are intended to make the deletion procedure simpler and more transparent and to shift responsibility to the social network provider. A unified European law, the Digital Services Act (DSA), could soon replace these country-specific rules.
1. The German Network Enforcement Act
The German Parliament recently passed a law amending the NetzDG, which involves some changes for social network providers. The NetzDG, enacted in Germany in 2017, was the first law in Europe to target hate speech and crime on social networks (more about the provisions of the NetzDG on our previous blog).
The newest amendment, first proposed in April 2020 (more on our previous blog), simplifies the reporting channels for the complaints procedure and adds information obligations for the half-yearly transparency reports of platform operators. A direct right to information against the platform operator shall be created in the Telemedia Act (TMG) for victims of illegal content on networks. The amendment to the NetzDG provides that the user may request a review of the platform provider’s decision to remove or retain reported content and has a right to have the content restored. This shall prevent so-called “overblocking”, i.e. the removal of legal content, and strengthen users’ freedom of opinion. The network provider is now obligated to obtain comments from the parties concerned and to give individual reasons for each decision. Video-sharing platforms are also subject to the NetzDG according to the new Sec. 3 (e) NetzDG, but only with regard to user-generated videos and broadcasts.
2. The Austrian Communication Platform Act
On 1 April 2021, the KoPl-G came into force in Austria. The provisions of the KoPl-G appear to be inspired by the German NetzDG.
Domestic and foreign communication platforms that had more than 100,000 users in the previous year and a turnover of more than EUR 500,000 fall within the scope of the Act. Excluded are platforms that only provide sales or brokerage services for goods or real estate, platforms operated by media entities in connection with their journalistic content, educational platforms, and online encyclopedias. An exception has also been made for video platforms with regard to broadcasts and user-generated videos.
New obligation: Review & remove
Like the NetzDG, the KoPl-G requires easy-to-find, permanently available and easy-to-use functionalities for reporting illegal content. The content shall be deleted within 24 hours if the illegality is obvious to a legal layperson. Where the illegality of the content can only be identified by a detailed review, the provider of the social network shall remove the content no later than seven days after completing the review. Content is unlawful if it constitutes a criminal offence under the Austrian Criminal Code, including stalking, persistent harassment, unauthorized image recording and pornographic depictions of minors.
Complaint procedure against “overblocking”
To ensure the transparency of the process, both the user who reported the content and the reported user shall be informed of the decisive reasons for removing content. The KoPl-G further requires platform providers to offer a transparent review procedure for decisions to remove or retain content. In order not to excessively restrict users’ freedom of opinion, Sec. 3 (4) KoPl-G gives the reported and the reporting user the possibility to have the platform’s decision on (non-)deletion reviewed again.
Duty to appoint a local representative
Social networks are obligated to appoint a representative for official and judicial service. Going beyond the German NetzDG, the KoPl-G also requires the appointment of a German-speaking responsible officer residing in Austria who has the competences and resources required to ensure compliance with the Act. This officer must be a natural person, whereas the representative can also be a legal person. The same person may hold both positions.
To dos for social media operators
Social network providers should act soon to implement the required functionalities on their websites and to appoint a service agent as well as a representative. The supervisory authority is entitled to review compliance with these obligations and may impose fines for violations.
3. The Digital Services Act
To date, Germany and Austria are the only EU member states that have adopted provisions against hate and crime on social media platforms. France also passed an act against hate and crime on social media platforms, but the Conseil Constitutionnel deemed it unconstitutional due to its incompatibility with freedom of opinion. However, a European act, the Digital Services Act (DSA), could soon replace all local rules in the member states. In December 2020, the European Commission published a proposal that addresses the procedure for social networks in cases of illegal content and intends to prevent a fragmentation of laws across Europe. Once the proposal is adopted, the DSA would be primarily applicable and apply in all member states.
4. Regulation on combating terrorist content online
Another new development in the legal situation for hosting service providers in Europe concerns combating terrorist content. Recently, the European Parliament adopted a regulation for the quick and smooth removal of terrorist content from online platforms and websites. Hosting service providers must remove or disable access to content that incites criminal offences under Directive (EU) 2017/541 on combating terrorism. The procedure is as follows: the competent authority of the member state informs the service provider of any terrorist content. Within one hour of receiving the notice, the content must be deleted or made inaccessible in every EU member state. Providers can have the orders reviewed. If no investigation is in place, the service provider is obligated to inform the user about the removal. Monitoring or reviewing content in general is not required; however, online platforms must act quickly after receiving a removal order. The regulation applies to hosting service providers whose users can publish posts on a platform or website. This means that not only social media platforms are subject to it, but also websites with a comment functionality. Hosting service providers have less than a year to prepare for this new regulation; it will come into force on 7 June 2022.
5. Code of Conduct on countering illegal hate speech online
The Code of Conduct on countering illegal hate speech online pursues the same goals as the rules mentioned above. Five years ago, major IT companies signed this official Code of Conduct, committing to review reported illegal content and to delete it within 24 hours. The European Commission publishes an annual evaluation of the Code of Conduct, which shows that 90 percent of notifications were reviewed within 24 hours and 71 percent of the content was removed. This is a step forward in combating illegal content on online platforms.