Comment on "Sudan’s RSF Video Captive" case
Submission to the Oversight Board
23 January 2024
Mnemonic is the umbrella organization for the Sudanese Archive, Ukrainian Archive, Syrian Archive, and Yemeni Archive. We create searchable databases of open source information related to human rights violations to help memorialize conflicts, raise awareness of situations, and investigate human rights violations and atrocity crimes. To date, we have preserved over 15 million items related to alleged human rights violations from social media platforms. Our comments on this case are based on 10 years of experience preserving human rights content from social media platforms and conducting open source investigations. Established in 2019, our Sudanese Archive is run by Sudanese staff and informed through our work with civil society organizations, activists, and journalists in Sudan.
This submission responds to the Oversight Board’s call for information on (1) how the Rapid Support Forces (RSF) and the Sudanese Armed Forces (SAF) are using social media to shape the narratives around the conflict, and whether Meta’s designation of the RSF as a dangerous organization has impacted access to information and the safety of people in Sudan; (2) Meta’s enforcement of its content policies for Arabic-language content about the conflict in Sudan, in particular video posts; and (3) Meta’s prioritization of content for automated and human review in conflict situations, and the principles and factors that should guide the design of operations to ensure the most harmful content is reviewed and actioned.
- How the RSF and SAF are using social media to shape the narratives around the conflict, and whether Meta’s designation of the RSF as a dangerous organization has impacted access to information and the safety of people in Sudan.
Facebook is the most widely used social media platform in Sudan, seen by many citizens as an important tool for social organization and mobilization, human rights reporting, and staying informed of political developments. It is the main platform on which citizens in Sudan upload photos and videos documenting alleged human rights abuses. Because the platform is also used by the RSF and SAF themselves, many people monitor these official accounts to gauge public opinion, learn of military actions and tactics in particular locations, and seek shelter or refuge in response to military movements. Our understanding is that people in Sudan and its diaspora also use social media to find out whether loved ones are safe or whether their homes have been looted, for example through videos of perpetrators boasting about their crimes. Such content also helps people understand how hostages are treated and, in some cases, where they are held.
Particularly important is content posted by the RSF or SAF themselves that discloses potential targeted locations. For example, one such video, which contains misinformation or serves propaganda purposes, shows approximately 200 fighters, some of whom appear to be underage, in military uniforms and cultural costumes announcing support for the RSF and readiness to fight in specific areas. The video provides important identifying information, such as weaponry and command structures, and its references to identified areas for attack make the availability of this content crucial for the security of the affected population, allowing them time to prepare and seek refuge.
Other important examples of likely intended propaganda include content that helps identify troop movements or the treatment and location of hostages and prisoners of war. While content that identifies and shows the mistreatment of hostages or prisoners of war needs to be carefully balanced against international humanitarian law, such content can bring relief to families in Sudan trying to understand what happened to their loved ones. It also serves as a warning of occurrences and violations in specific areas. One example is a video of 21 citizens lying facedown on the ground, allegedly detained by armed RSF members in what the narrator claims is the Beka area, west of the city of Madani in the state of Al-Jazirah, with the apparent sound of car engines in the background. This video not only shows RSF treatment of civilian hostages, but also provides information about their locations and movements.
Other videos (e.g. 1, 2, 3, 4, and 5) allegedly filmed by the RSF themselves show the transfer, interrogation, treatment (including threats of torture), and/or location of hostages or prisoners of war, often by identifiable alleged members of the RSF. If balanced correctly against international humanitarian law to ensure no additional harm comes to individuals due to the public nature of the content, these videos can help raise public awareness, help civilians minimize risks, and give the public a better understanding of the war in Sudan.
This information is under constant threat of removal through Meta’s content moderation policies, especially its opaque and vague Dangerous Organizations and Individuals (DOI) policy and related takedown algorithms. For example, the now-removed official Rapid Support Forces account included posts that were taken down despite seemingly not violating any of the community policies, and despite providing citizens with important information about specific events and incidents affecting the security and humanitarian situation in Sudan. As Mnemonic’s Sudanese Archive rushes to preserve and verify that information for investigations, we see a significant amount of content removed by the platform, often through automation, long before anyone can see it, let alone archive it.
Of the now taken-down content that we were able to archive, one example posted by the RSF on their main Facebook page is a video of individuals in RSF uniform showing the alleged destruction of vital facilities and the burning of factories, which they claim were bombed by SAF warplanes. With little other footage available from Sudan, such videos are crucial to understanding the level of infrastructure damage in affected areas.
As highlighted in a recent joint campaign supported by 20 civil society organizations, we urge the Board to recommend that Meta overhaul its DOI policy, and introduce transparency regarding any content guidelines or rules related to the classification and moderation of terrorist content under this policy. Specifically:
- Publish the full list of individuals and organizations designated under the DOI policy, as requested repeatedly by the Oversight Board, so that users can understand how the policies are applied to their content.
- Develop and publish a clear policy on how Meta designates and de-lists individuals and organizations, and publicly disclose government requests for additions to or removals from the current DOI list.
- Make clear exceptions to the existing policy when information or communication about a designated group or individual is in the public interest, as per the Oversight Board’s recommendations in the “Mention of the Taliban in news reporting” case.
- Meta’s enforcement of its content policies for Arabic-language content about the conflict in Sudan, in particular video posts.
As noted in previous submissions to the Oversight Board, in Arabic-speaking countries like Sudan, where Meta’s poor understanding of dialects is compounded by the use of automation, Meta’s DOI policy contributes to over-removal. As previously submitted to the Oversight Board in cases 2022-002-FB-MR and 2023-028-FB-UA on Sudan, and confirmed by POLITICO at the time, Meta tolerates an incredibly high failure rate in the Arabic-speaking world. According to Meta’s own Community Standards, only clear “praise, support, and representation” should be removed. That does not mean that any mention of an organization or individual on the list is grounds for removal. In fact, as the company itself made clear in response to Oversight Board case 2021-006-IG-UA, political discussion about banned individuals and organizations that is not praise, support, or representation is allowed under the policy. As we have documented over many years, DOI enforcement, especially when done by automated means, is a major threat to human rights documentation and to Meta’s commitment to human rights.
We urge the Board to recommend Meta to:
- Address its failures in moderating Arabic-language content by hiring more Sudanese dialect experts, as there are regional differences even within Sudan. Without clear, high-quality training data, problems are “baked in” to machine learning processes, leading to further over-takedowns. Such takedowns are contrary to Meta’s human rights responsibility to protect freedom of speech and access to information, and have been the subject of litigation against Meta.
- Meta’s prioritization of content for automated and human review in conflict situations, and the principles and factors that should guide the design of operations to ensure the most harmful content is reviewed and actioned.
All human rights content should be reviewed by human content moderators, given the sensitivity, nuance, and importance of such content. From a policy perspective, there are several areas in need of improvement.
As mentioned above, Meta’s DOI policy contributes to over-removal, but high error rates also apply to the application of Meta’s newsworthy allowance. Balancing the potential harm in each individual case and context against potential awareness and security benefits is imperative in Sudan. In its Violent and Graphic Content policy, Meta references its newsworthy allowance in the human rights context, stating: “In the context of discussions about important issues such as human rights abuses, armed conflicts or acts of terrorism, we allow graphic content (with some limitations) to help people to condemn and raise awareness about these situations.” This allowance is crucial in the human rights context, both to ensure that courageous activists and citizens collecting information in difficult and risky situations are not silenced and can report to the world the injustices occurring around them, and to ensure that the information they collect can be used for justice. As the Board determined in case 2022-002-FB-MR, the newsworthy allowance does not always work: vast amounts of content are removed, while disproportionately few newsworthy allowances are applied.
Because of the importance of this type of content and the error rates of automation, we urge the Board to recommend that Meta:
- Monitor crisis development and flag for human review all content from specific crisis areas for the duration of the conflict. This will ensure increased consistency in applying Meta’s newsworthy allowance.
Content retention policy
Human rights content, especially that posted by perpetrators, could aid international justice mechanisms and domestic prosecutors. It can constitute linkage evidence or otherwise support legal investigations, such as by providing leads so that investigators know where to start and what to look for. This type of information can also help build cases for sanctions against specific perpetrators.
Given the role of civil society in generating public interest research in situations of human rights crises and conflict and providing key information to authorities, we urge the Board to recommend Meta to:
- Publicly clarify which actors beyond law enforcement can request data retention, what types of data are retained, how Meta handles situations in which content has been algorithmically removed before anyone has seen it, and the specific criteria for retention.
- Consider creating an expert program to consult local and international civil society and communities that can help identify the accounts that are most relevant for the purposes of international justice and accountability.
Failing to address these issues can severely hamper human rights investigations, as every piece of human rights content can be important for bringing perpetrators of grave crimes to account.
Mnemonic remains available for further consultation at firstname.lastname@example.org.