It's not too late for Members of European Parliament to vote no on the disastrous "Terrorist content" regulation

by Dia Kayyali

Mnemonic proudly joins 61 human rights organisations, journalists’ associations and research institutes in urging Members of European Parliament to vote no on the European Union’s proposal for a Regulation on addressing the dissemination of terrorist content online (TCO regulation). Essential content uploaded by documenters of human rights abuses and by nontraditional media houses in conflict zones is teetering on the edge of destruction. Policies like the TCO regulation threaten to push these huge collections of human rights documentation over the edge into oblivion. That’s hundreds of thousands of videos that could be used as evidence, for example, in prosecutions of members of the Syrian government for human rights violations.

The TCO regulation will have its final vote in the plenary of the EU Parliament in April. We’re making sure lawmakers know that they can still stop this disastrous legislation with a NO vote.

TCO regulation

The TCO regulation is meant to address “the misuse of hosting services for terrorist purposes and [to] contribut[e] to public security.” Instead of making sure it’s actually contributing to public security by funding research into the relationship between online content and offline violence, the regulation forces companies to deploy the blunt tool of content moderation for an extremely complicated societal problem.

Some improvements have been made to the final regulation during negotiations, thanks in part to the incredible efforts of civil society groups that have spent several years pushing back. The letter highlights three outstanding issues:

1. The proposal continues to incentivise online platforms to use automated content moderation tools, such as upload filters.
2. There is a severe lack of independent judicial oversight.
3. Member States will be able to issue cross-border removal orders without any checks.

The regulation creates new requirements for “hosting service providers,” which of course includes social media platforms such as Facebook, as well as storage sites like Dropbox and countless small platforms. It requires them to “remove terrorist content or disable access to terrorist content in all Member States as soon as possible and in any event within one hour of receipt” of a removal order from a “competent authority” of an EU Member State.

These orders can be issued across borders, meaning a robust democracy like Germany could be subject to removal orders from an ailing democracy like Hungary, in what one political cartoonist has called the “Orbanisation of the Internet.” As MEP Patrick Breyer points out, “[t]he fact that Viktor Orbán will be able to have digital content deleted throughout the EU opens the door to politically motivated internet censorship – especially since the definition of terrorism is alarmingly broad and susceptible to abuse.”

Of particular interest to us at Mnemonic is the regulation’s requirement that every hosting service provider “take specific measures to protect its services against the dissemination to the public of terrorist content.” The regulation lists some examples, including technical means and user-flagging. These “specific measures” will play out in content moderation tools and policies, and of course companies aren’t willing to devote unlimited resources to terrorist content. They’re going to use those “technical means”, such as machine learning algorithms and upload filters, wherever possible, because they are more cost-effective. They’re also particularly error-prone, and those errors have a real cost.
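
To make that failure mode concrete, here is a minimal sketch of the kind of hash-based upload filter platforms commonly deploy. The hash database, threshold value and function names are our own illustrative assumptions, not any platform’s actual code:

```python
# A minimal sketch of a perceptual-hash upload filter. Real systems work on
# the same principle: compare a fingerprint of the upload against
# fingerprints of already-flagged content and block anything close enough.

# Hypothetical database of 64-bit fingerprints of flagged videos.
BANNED_HASHES = {0xB2F0B2F0B2F0B2F0}

# Max number of differing bits still treated as a "match" (assumed value).
MATCH_THRESHOLD = 10

def hamming_distance(a: int, b: int) -> int:
    """Count the bits in which two 64-bit fingerprints differ."""
    return bin(a ^ b).count("1")

def is_blocked(upload_hash: int) -> bool:
    """Block an upload if it is 'close enough' to any banned fingerprint."""
    return any(hamming_distance(upload_hash, h) <= MATCH_THRESHOLD
               for h in BANNED_HASHES)

# A re-encoded copy of a flagged video (a few bits of drift) is caught:
print(is_blocked(0xB2F0B2F0B2F0B2F3))  # True
```

The fuzziness that lets such a filter catch re-encoded copies of banned footage is the same fuzziness that sweeps up distinct material, and the filter has no notion of context: it cannot distinguish a video posted to glorify violence from the same footage posted to document it.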

A human face

We’ve had our eye on this regulation since we signed a January 2019 letter highlighting the real impact of efforts to eradicate poorly defined “extremist content.” As we noted in that letter:

after Google instituted a machine-learning algorithm to “more quickly identify and remove extremist and terrorism-related content” in June of 2017, hundreds of thousands of videos went missing. This included not only videos created by perpetrators of human rights abuses, but also documentation of shellings by victims, and even videos of demonstrations.

In the nearly four years since Google started using machine learning for content moderation, myriad other platforms have followed suit. At the start of the COVID-19 pandemic, platforms also greatly increased their use of automation, sometimes with nonsensical results (for a brief period, for example, Facebook’s AI was removing pandemic-related fundraisers).

All of this has resulted in erroneous removals at an ever-increasing pace. A steady trickle of community members reaches out to us asking for help reinstating accounts or content, and civil society in the “Middle East/North Africa” region now views content removal as an endemic issue. Even with our work and that of others on this issue, large chunks of digital memory have been irrevocably deleted.

More than the EU at stake

Mnemonic is based in the EU. That means we have the opportunity to lobby for regulations that help, instead of harm, human rights and human rights documentation in the “MENA” region. Of course, many of the human rights defenders we work with are based outside of the EU. We are painfully aware that regulation of the Internet in the EU impacts all of them, as well as people all over the world, from Myanmar to Mexico. We know that MEPs are not charged with ensuring the rights of non-Europeans, but we hope that they would take those rights into consideration. Unfortunately, as today’s coalition letter states:

[T]he final text of the proposed Regulation still contains dangerous measures that will ultimately weaken the protection of fundamental rights in the EU. It also has the potential to set a dangerous precedent for online content regulation worldwide.

This can’t be overstated. When the EU or EU member states regulate the Internet, tech companies do what they need to do in order to comply. Sometimes these changes are localised and relatively minor: for example, the German NetzDG law necessitates a different reporting flow on Twitter for users in Germany, which can be applied using IP addresses or self-reported location. But structural changes, such as investing in increased automated detection of certain types of content, are unlikely to be localised. In other words, EU requirements could impact the technical side of content moderation for the whole globe.
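
As a rough illustration of that difference, consider the sketch below. The country lookup, flow names and classifier are hypothetical stand-ins assumed for the example, not drawn from any platform’s actual systems:

```python
# Sketch: per-jurisdiction UI changes are cheap to localise, while changes to
# automated detection are not. All names here are illustrative assumptions.

def lookup_country(ip_address: str) -> str:
    """Stand-in for a real IP-geolocation lookup (e.g. a GeoIP database)."""
    return "DE" if ip_address.startswith("85.") else "US"  # toy logic only

def reporting_flow(ip_address: str, self_reported_country: str = "") -> str:
    """Localised compliance: branch on the user's claimed or inferred location."""
    country = self_reported_country or lookup_country(ip_address)
    if country == "DE":
        return "netzdg_report_form"  # Germany-only flow for NetzDG compliance
    return "standard_report_form"

def score_upload(video_bytes: bytes) -> float:
    """Automated detection runs on the content itself, before any viewer's
    location is known, so there is no per-country branch to take: one model
    scores every upload, and tightening it for the EU tightens it worldwide."""
    return 0.0  # stand-in for a real classifier's 'terrorist content' score
```

The reporting flow can branch per user, but the classifier is a single global gate, which is why detection requirements tend to propagate far beyond the legislating jurisdiction.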

Furthermore, aside from any platform response, the fact is that when the EU passes legislation, it gives authoritarian governments cover to pass their own: for example, Turkey recently passed a draconian new regulation of social media that critics have called out for enabling censorship in the increasingly authoritarian country. As Jillian York of EFF points out, “[w]hen introducing the new law, Turkish lawmakers explicitly referred to the controversial German NetzDG law.”

We hope that all of these issues will give MEPs pause when the TCO Regulation comes up for vote next month. We hope they can understand that security doesn’t come from silencing human rights defenders. Regardless of how the vote plays out, we will keep fighting for laws that ensure digital memory and evidentiary content is not erased without a trace.

You can read the full civil society letter here.


Mnemonic is an NGO dedicated to archiving, investigating and memorialising digital information documenting human rights violations and international crimes. Mnemonic also provides trainings, conducts research, engages in content moderation advocacy, and develops tools to support advocacy, justice and accountability.
