
Mnemonic Joins Open Letter Calling on Social Media Platforms to Improve Practices Globally

By Dia Kayyali

Mnemonic is proud to join 31 organizations in an open letter calling on platforms to improve their crisis preparations, not just in the US and Western Europe, and not just when politicians and media are paying attention. We coordinated this statement as part of our ongoing efforts to ensure that impacted communities and people are center stage in platform accountability and content moderation policy advocacy. Last week we sent the letter to Google, Meta, Telegram, TikTok, and Twitter.

The letter outlines seven steps platforms could take to improve:

  1. Real human rights due diligence
  2. Equitable investment
  3. Meaningful engagement
  4. Linguistic equity in content moderation
  5. Increased transparency
  6. Clarity about so-called “Terrorist and Violent Extremist Content” (TVEC)
  7. Multi-stakeholder debriefs

We invested time in this open letter because it brings together in one place many of the issues raised by global civil society in recent years. And of course, Mnemonic has experience struggling with broken platform policies that erase our digital memory while allowing government propaganda to flood social media.

The open letter describes the difficulties civil society groups have faced when engaging with platforms. From mistaken takedowns of important political expression and human rights documentation to dangerous coordinated and state-sponsored disinformation campaigns, signatories of this letter have dealt with a wide range of platform failures. Ukrainian civil society groups (including signatories to our statement) also sent their own statement to Meta last week, calling on the company to enhance “cooperation with the local civil society and media organizations possessing necessary expertise.” Civil society is often able to get traction only when our problems are highlighted by the media. The letter also points to the problem of extractive relationships, in which platforms get valuable information from civil society without reciprocation.

One detail the statement doesn’t include, but that must be pointed out, is how much civil society engagement varies from platform to platform. Google, Meta, Telegram, TikTok, and Twitter each respond to civil society very differently. Platforms that get a lot of media attention, including Meta and Twitter, dedicate significant resources to engaging a range of civil society groups. Both platforms have dedicated staff and set structures for engagement, such as the Oversight Board or Twitter’s Trust & Safety Council. This has led to incremental policy improvements and made it easier to understand how these platforms make their decisions. TikTok is hard to compare, as it is such a new company. It has the chance to learn from the experiences of the platforms that came before it, and, perhaps following Twitter’s example, TikTok established a safety advisory council last year.

Finally, that leaves Google and Telegram. Both companies deserve some (negative) attention. Google is a huge company compared to the others addressed in the letter, and as YouTube’s parent company it should direct some of its resources to YouTube. Yet, unlike Meta, neither Google nor YouTube has readily available regional staff or a structure for civil society engagement. YouTube has a “Trusted Flagger” program for civil society, which it claims includes “Ongoing discussion and feedback” on content, but these sessions appear to be few and far between, and the program itself doesn’t actually create an open channel for communication. In fact, Google, and by extension YouTube, is hard to reach and seemingly impervious to occasional negative media coverage. In Mnemonic’s case, this was driven home by the consistent removal of Syrian content from YouTube. For several years, this massive problem received significant mainstream media attention; we even wrote a New York Times op-ed about it. Yet YouTube took few of our recommendations into account. In fact, not a single staff person at YouTube communicates with us or engages regularly with civil society. Telegram similarly deserves attention for not having a single dedicated staff person to engage with civil society. The platform simply ignores civil society’s attempts at discussion or dialogue, despite the increasingly visible human rights impacts of Telegram as a tool.

This statement isn’t meant to convey a consensus on the finer details of content moderation. Some advocates believe platforms should leave up as much content as possible to preserve important free expression, especially from marginalized communities; others believe platforms should remove more content, especially to protect vulnerable communities. Some of us believe advocacy should focus entirely on decentralization or on the business models of social media companies, while others focus on immediately reducing the negative impact of poor content moderation policies on affected communities. Nevertheless, there is broad consensus that these steps can help platforms be fairer and address some of the offline impacts of their decisions. And while the steps won’t change platforms’ business models, they can move us closer to the root of the problem rather than the surface-level policy changes platforms often make in reaction to civil society pressure.

published 22 April 2022


Mnemonic is an NGO dedicated to archiving, investigating and memorialising digital information documenting human rights violations and international crimes. Mnemonic also provides trainings, conducts research, engages in content moderation advocacy, and develops tools to support advocacy, justice and accountability.
