Harmful and potentially illegal content available on digital platforms after the attack

A report on the prevalence of harmful or potentially illegal content on digital platforms following the Bratislava terrorist attack

Considering the findings of previous reports on the role of digital platforms in the Zámocká street shooting, it is clear that the platforms’ neglect of content moderation has led to a significant proliferation of harmful content online. To investigate this issue further, the Council for Media Services (CMS), in collaboration with Trust Lab, conducted a study on the prevalence of harmful and potentially illegal content available on four major digital platforms (Facebook, Instagram, YouTube and TikTok) following the attack. The study also serves as a preliminary attempt to assess the platforms’ compliance with due diligence obligations concerning the management of systemic risks.

The study measures the findability of content featuring hate speech, harassment based on protected categories, or violent extremism, as well as the frequency and timing of the platforms’ actions on that content. In just 120 hours, Trust Lab identified 253 unique links to harmful and potentially illegal content, 123 of which were viral posts garnering thousands of views. The identified content was subsequently analyzed in light of the Slovak Media Services Act and the relevant EU legislation.

The findings of the report reveal a blatant neglect of content moderation: the platforms took action on only 12 pieces of content reported by Trust Lab through the user content reporting mechanisms. However, when notified of 26 pieces of potentially illegal content by the CMS, the platforms reacted swiftly and removed all the content reported by the national regulatory authority. Beyond the absence of any meaningful reaction on the platforms’ part, the report highlights the increasing prevalence of borderline content, which poses a formidable challenge to ongoing regulatory efforts.

To improve the situation, the CMS puts forward a number of recommendations for the analyzed platforms:

  1. Improve the effectiveness and transparency of the user content reporting mechanisms. 
  2. Develop user-friendly interfaces fostering data portability and compliance monitoring. 
  3. Improve cooperation with the research community regarding policy reviews and develop guidelines for borderline content.
  4. Develop guidelines for responding to crisis scenarios.
