One Year of Policy Advocacy
By Dia Kayyali
2021 has been quite a year for the whole world, and here at Mnemonic it was no different. Content moderation was in the news constantly, and so were we! It was our first full year of having a dedicated policy advocacy staff person, and we increased capacity across the areas we work in: advocating with tech companies, facilitating and contributing to coalitions and multi-stakeholder forums on content moderation, policy advocacy with lawmakers, and advocacy and research on the issue of human rights documentation and content takedowns on social media platforms.
As we said in October, “as an EU-based organisation led by people from the Southwest Asia/North Africa (aka MENA) region, we are in a unique position to understand and contextualise” the impact of content moderation-related policies, because we’ve seen the impacts of content moderation policy firsthand on our own collections of human rights documentation. Importantly, we can also provide perspective on the global impacts of policies made in the “west”. The Internet is like the environment: what one powerful corporation or country does impacts everyone. We believe that big companies like Google and Facebook and wealthy countries like Germany and the United States have a moral obligation to take this into account in policymaking, and that’s what we’re pushing for at the national and international levels.
We’re looking back on this year and seeing how much all the disparate areas we work in really overlap, and looking forward with an eye to how we can be more strategic next year!
The learning curve for how to engage in European Union policy-making processes is steep, but as 2021 comes to a close we’ve come a very long way. Thanks to our friends at Access Now and EFF, who led the Digital Services Act Human Rights Alliance, we’ve gotten involved in the Digital Services Act, which has important provisions about content moderation and transparency. We contributed to a DSA Human Rights Alliance statement, and followed it up with a letter directly to policymakers before the final committee vote in the European Parliament. You can read more about our DSA position here. We will devote much of our time in 2022 to making sure that the DSA doesn’t get watered down in trilogue negotiations, and to pushing for improvements.
The DSA isn’t the only legislation moving forward. We haven’t been able to get involved in advocacy around the Digital Markets Act, but we are watching it closely. We also provided input to the Council of Europe’s Ad Hoc Committee on AI (CAHAI), and are hoping to get more involved in the European AI Act as it moves forward.
We have been working and will continue to work to build our European Union network in 2022. We’ve been able to support statements from allies this year on the AI Act, and have had the opportunity to strategize with our fellow European AI Fund grantees, but we’re looking to join more formal and informal coalitions.
We’ve taken the lead in two multistakeholder forums focused on the issue of so-called “terrorist and violent extremist content” (TVEC): the Christchurch Call to Eradicate Terrorist and Violent Extremist Content and the Global Internet Forum to Counter Terrorism (GIFCT). Unfortunately, the vast majority of human rights documentation that is removed from social media platforms is taken down under the auspices of TVEC. These forums heavily influence corporate policies and cross-border agreements on data sharing and emergency response. Representatives of myriad governments participate, from the European Commission to the US State Department, consolidating necessary policy discussions. Increasingly, law enforcement agencies also participate in these forums as key players in the regulation, and at times policing, of digital content. As an Arab-led organisation with regional, EU, and US experience, we want to ensure that content moderation does not become an unquestioning tool of broken domestic and international counterterrorism policies that criminalize us and our communities.
The Christchurch Call was initiated by the governments of New Zealand and France after the livestreaming of a horrific attack on a New Zealand mosque that left 51 people dead. It is meant to address the online component of terrorism, but also includes specific commitments to human rights and a free, open, and secure internet. We served as the Christchurch Call co-chair until September and are still engaged as members. You can read about the Call at its two-year anniversary here, and our take on that anniversary here. The Call has been a fascinating experiment in bringing together heads of state with companies and civil society organisations to try to deal with a very real issue. While there are still issues with civil society participation, we’re excited to note that the Call just hired its first staff person, and we will be pushing for ways to make the consultative function of the Network more meaningful.
The GIFCT started as an industry initiative nearly five years ago, but has since reorganized as its own NGO that works largely through multistakeholder working groups. Much of the Christchurch Call is being carried out through these GIFCT working groups. We’ve been pushing for the GIFCT to become more accountable and transparent, and to take seriously the human rights abuses committed in the name of fighting terrorism. We’ve had some success, but the GIFCT remains mostly unaccountable to civil society and even to governments. We played an instrumental role in the GIFCT Human Rights Impact Assessment conducted by BSR earlier this year. We pushed BSR to make strong recommendations, and were pleased to see many of our specific suggestions included in the final product: for example, a system where the Independent Advisory Committee of the GIFCT (which is currently largely powerless) can make formal recommendations to the Operating Board (which can actually make decisions), and the Operating Board must respond.
We’re now the civil society sector co-facilitator for the GIFCT’s Legal Frameworks working group, where we are exploring the issue of how to preserve and access human rights documentation that gets removed in the rush to eradicate “TVEC,” while also respecting privacy. This is a particularly meaningful place to have this discussion, since it brings together so many governments and companies in one place and avoids the problem of having to balance multiple interests in separate conversations. You can read the Working Groups’ 2020-2021 report, which we contributed to, here.
We’re also participating in the Crisis Response Working Group this year, because it is in a crisis that rights so often fall by the wayside, and we want to ensure that this does not happen. This group also has an overrepresentation of law enforcement, and very much needs civil society voices to push back on the replication of broken “national security” frameworks.
Mnemonic has been consistently engaging with companies since 2017, right after Google started using machine-learning algorithms to detect “TVEC.” We have consistently found that when companies get bad press, they’re more willing to engage with civil society and make at least small changes. Unfortunately, YouTube no longer engages with civil society organisations in a meaningful way, most likely because it is no longer the focus of negative press attention. Over the past year, we’ve engaged the most with Facebook, as it scrambled to respond to press about its negligence in conflict areas, its willful failure to properly moderate Arabic language and especially Palestinian content, and of course the massive dump of documents from former employee Frances Haugen (the #FacebookPapers).
We helped lead the #StopSilencingPalestine campaign around the massive takedowns in Palestine in May and June 2021. Since that campaign, we’ve been meeting regularly with Facebook to discuss policies like the Dangerous Individuals and Organizations list that, though raised in relation to Palestine specifically, impact the entire region. We also submitted comments in several Facebook Oversight Board cases, and in every case saw our suggestions integrated into the Board’s case decision. For example, we commented on a case about removal of Palestinian content, reiterating our ask from the #StopSilencingPalestine campaign that Facebook conduct an audit of its content moderation practices in Palestine. The Oversight Board agreed, and recommended that Facebook contract with an independent organisation to conduct a review. That review is currently in progress.
Coalitions and networks
As noted, we played a leading role in multistakeholder forums. That extended to regional coalitions and international networks this year. We continue working with an informal coalition to try to think through principles for preservation of, and access to, human rights documentation on social media (an “evidence locker”). Finally, we’ve been helping to coordinate a monthly content moderation call that brings together advocates from dozens of countries and organisations to strategize and share information.
We believe 2022 is going to be a big year. We’re going to increase our focus on legislative work and coalitions, while maintaining our company relationships and multistakeholder forum participation. We’ve learned a lot about how to function in this space as a small but forceful organisation, and it’s just going to get better!