Mnemonic to EU legislators: step up and safeguard human rights online
By Dia Kayyali
Last week, the Digital Services Act Human Rights Alliance (DSA Alliance) published a statement on the European Union’s pending Digital Services Act. Mnemonic is a member of the DSA Alliance along with 10 other organisations. We’re pleased to be a part of this effort to bring a global perspective to ongoing EU policy making because, as the statement notes: “Although these standards will influence platforms’ operations far beyond the Union, no attempt has been made by the EU Commission to consider the needs of—and risks to—vulnerable groups or marginalised communities around the world.”
As an EU-based organisation led by people from the Southwest Asia/North Africa (aka MENA) region, we are in a unique position to understand and contextualise these risks. We’ve seen directly the impact of content moderation policies from the United States and EU on documentation of human rights abuses, and it’s not good. Improper takedowns of human rights-related content have long been an issue, but we first started seeing the rapid destruction of vast swathes of human rights documentation on YouTube in 2017. That was the same year the European Commission issued its communication on “Tackling Illegal Content Online – Towards an enhanced responsibility of online platforms,” which encouraged the use of machine learning and upload filters to ensure rapid detection and removal of content — without requiring a decision by a court that the content is in fact illegal.
We strongly suspect that the threat of legislation factored into Google’s decision to implement machine learning. Ironically, the voluntary adoption of this flawed tool by Google, as well as by Facebook and other major platforms, wasn’t enough to stop the Terrorist Content Online Regulation, a deeply problematic regulation formally adopted by the EU in April 2021.
Furthermore, as a German organisation, we are pleased that the DSA Alliance statement reflects how the German NetzDG law has had negative ripple effects throughout the Internet ecosystem, encouraging dangerous copycat laws (most recently in Canada). By the end of 2019, at least 13 countries had passed such copycat laws — the number now is almost certainly higher. Given Germany’s stature in the EU, German politicians in particular must heed the statement’s recommendations:
- Avoid legally mandated strict and short time frames for content removals due to their detrimental impact on the right to freedom of expression and opinion.
- Do not impose legally mandated automated content moderation tools on online platforms, as this will lead to over-removals of legitimate speech.
- Avoid shifting states’ obligations to protect individuals’ rights onto privately owned online platforms, which allows them to act as quasi-judicial bodies in the online ecosystem without any public scrutiny.
- Do not impose mandatory reporting obligations to Law Enforcement Agencies (LEAs), especially without appropriate safeguards and transparency requirements.
- Prevent public authorities, including LEAs, from becoming trusted flaggers. Conditions for becoming trusted flaggers need to be subject to regular reviews and proper public oversight.
- Preserve the current conditional model of intermediary liability, as established by the E-Commerce Directive.
- Consider mandatory human rights impact assessments as the primary mechanism for examining and mitigating systemic risks stemming from platforms’ operations. Risk mitigation measures should play only a secondary role.
We urge the EU to take its role in Internet legislation extremely seriously. To the European Commission and Parliament we ask: where exactly are you leading us? We fear the DSA may be leading us away from the Internet as a free, open, human rights-respecting space in service of massive — yet ineffective — censorship of all apparently objectionable material.
For more information, please read the full statement.