Why Meta needs to implement these Oversight Board recommendations

By Maria Mingo

3 April 2024

On 26 March 2024, the Oversight Board published a policy advisory opinion that, if properly implemented by Meta, would be a significant step towards freedom of speech for Arabic-speaking and Muslim communities. The change would increase the transparency of a vague content moderation policy and reduce over-removals of Arabic-language content, a move that would facilitate human rights investigations and memorialisation.

Mnemonic welcomes the opinion and supports the Board’s recommendation that Meta end its blanket ban on the Arabic word “shaheed”, commonly translated as “martyr”, and increase the transparency of its Dangerous Organisations and Individuals (DOI) policy. Given that the term likely accounts for more content removals under Meta’s community standards than any other single word or phrase on its platforms, we hope Meta’s response will help counter over-removals and increase the availability of real-time conflict documentation – critical for human rights investigation and memorialisation. This is particularly urgent for Arabic-speaking countries, where Meta’s underinvestment in local staff and language expertise is compounded by its reliance on automation.

The Board’s conclusion is that Meta’s current approach of removing all content using the term is overbroad and disproportionately restricts freedom of expression. This is in line with calls from civil society organisations (CSOs), including Mnemonic, which have long criticised Meta’s DOI policy. Most recently, 19 CSOs re-launched the Stop Silencing Palestine campaign, calling on Meta to, among several requests, overhaul the opaque and vague policy and increase transparency regarding any content guidelines or rules related to the classification and moderation of terrorist content under it.

Mnemonic sees the Oversight Board’s opinion as a step in the right direction, and we hope for full implementation as well as similar evaluation of other terms beyond “shaheed”.

Over-removals are a problem for human rights investigations

With vast volumes of images uploaded daily from conflict zones to social media platforms, vast volumes of potential evidence are also removed by those same platforms. Some content may be rightfully removed; however, a large amount is over-removed because of opaque and vague content moderation policies such as the DOI policy, automation coupled with insufficient language or contextual expertise, and inconsistent application of the newsworthiness allowance.

Over-removals violate freedom of speech by censoring activists and affected communities. They also reduce the availability of real-time information emerging from a conflict. Research that CSOs conduct on content from Meta’s platforms is crucial to supporting accountability efforts by law enforcement agencies. If content is wrongfully removed – often by automation, before anyone can even see it – open-source human rights investigations are jeopardised.

Anticipated outcomes of the Oversight Board recommendations

1. Policy clarification

Particularly important for reducing over-removals of content including the word “shaheed” is the Board’s recommendation that Meta clarify its DOI policy and internal guidelines to include examples of violating content, and remove content referring to a designated individual as “shaheed” only when it:

  • Is coupled with one or more of three signals of violence: an image of an armament/weapon, a statement of intent or advocacy to use or carry an armament/weapon, or a reference to a designated event; or
  • Otherwise violates Meta’s policies (e.g. for glorification or because the reference to a designated individual remains unclear for reasons other than use of “shaheed”).

While this new guideline still leaves room for over-removals (especially of potential command-structure evidence, since armed groups may refer to their members as martyrs), it nevertheless marks a notable improvement on the status quo.

2. Designation procedure transparency

In line with long-standing CSO demands, the Board further recommends that Meta increase transparency by providing more detail on the procedure for designating entities and events under its DOI policy, and by publishing aggregated information on the total number of entities on its designation list and the number added and removed over the past year. Additionally, Meta is asked to introduce an effective process for regularly auditing designations, ensuring the list is kept up to date and does not include organisations, individuals or events that no longer meet Meta’s designation definition. These changes would finally provide insight into an opaque policy as well as help reduce discrimination and censorship of online voices.

3. Moderation and automation accuracy

Another important recommended change is for Meta to explain clearly in its Transparency Center how it uses classifiers to generate predictions of policy violations, and how it sets the thresholds for taking no action, queueing content for human review, or removing content. In the opinion, the Board further asked Meta to explain its methods for assessing the accuracy of human review and the performance of automated systems in enforcing its DOI policy, and to publish the outcomes periodically in a way that can be compared across languages and/or regions. This recommendation is crucial because automation is at the core of over-removals, causing key content that could potentially serve as evidence of human rights violations to disappear before anyone can see it.
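To make the threshold mechanism concrete: the Board describes a pipeline in which a classifier produces a violation score, and two thresholds divide outcomes into no action, human review, or removal. The sketch below is purely illustrative – it is not Meta’s actual system, and the function name and threshold values are hypothetical – but it shows why transparency about where those thresholds are set matters: a small shift in either number changes how much content is removed automatically without human eyes on it.

```python
# Illustrative sketch only (NOT Meta's actual system): routing a
# classifier's predicted violation probability to one of the three
# outcomes described in the Board's recommendation. The function name
# and threshold values are hypothetical assumptions.

def route_content(violation_score: float,
                  review_threshold: float = 0.6,
                  removal_threshold: float = 0.95) -> str:
    """Map a predicted violation probability to an enforcement action."""
    if violation_score >= removal_threshold:
        return "remove"        # high confidence: automated removal
    if violation_score >= review_threshold:
        return "human_review"  # uncertain: queue for a human moderator
    return "no_action"         # low confidence: leave content up

# Lowering removal_threshold widens fully automated removals, which is
# why the Board asks Meta to disclose how thresholds are set.
print(route_content(0.97))  # remove
print(route_content(0.70))  # human_review
print(route_content(0.10))  # no_action
```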

We hope Meta will implement all of the Oversight Board’s recommendations to reduce over-removals that risk violating online freedom of speech, limiting the availability of real-time information emerging from conflict, and hampering open-source human rights investigations and conflict memorialisation.


About Mnemonic

Mnemonic is the umbrella organisation for the Syrian Archive, Yemeni Archive, Sudanese Archive, Ukrainian Archive, and Rapid Response projects including the Iranian Archive. We create searchable databases of open source information related to human rights violations to help memorialise conflicts, raise awareness of situations, and investigate human rights violations and atrocity crimes. To date, we have preserved over 15 million items related to alleged human rights violations from social media platforms. Our comments on this case are based on 10 years of experience preserving human rights content from social media platforms and conducting open source investigations. Our Archives are run by staff from the relevant countries and informed through our work with CSOs, activists, and journalists from those countries.


Mnemonic is an NGO dedicated to archiving, investigating and memorialising digital information documenting human rights violations and international crimes. Mnemonic also provides trainings, conducts research, engages in content moderation advocacy, and develops tools to support advocacy, justice and accountability.
