The Digital Services Act - we care and you should too

The Digital Services Act (DSA) is a landmark piece of legislation, and we want to make sure the European Union gets it right. On Monday, the European Parliament’s Committee on the Internal Market and Consumer Protection (IMCO) voted to move the DSA forward, nearly one year after the European Commission published its initial proposal. Though this version comes with a number of good amendments, it also retains some concerning provisions.

The DSA is meant to update the twenty-year-old E-Commerce Directive and address myriad aspects of doing business online, especially with regard to intermediaries, including the social media platforms that much of Mnemonic’s advocacy work focuses on. In 2022 the European Commission, the Council of the European Union, and the European Parliament will move into trilogue negotiations over the DSA. Mnemonic, along with other civil society organisations, will continue to monitor developments closely and work to ensure that the final version of the DSA safeguards the freedoms, rights and safety of users.

In this article we highlight some of the most interesting parts of the DSA and their impacts:

Increased transparency, due diligence, and automated moderation

The DSA lays out specific transparency requirements that go beyond what many companies currently share in transparency reports. These added requirements include detailed transparency reporting, but also standards for how platforms communicate with users, both in notices and in their Terms of Service.

Under these new standards, when companies take content down, they would have to provide a clear “statement of reasons” that includes important information companies do not provide at the moment. In particular, companies will have to provide “a reference to the legal ground relied on.” Currently, when content is taken down in relation to terrorism, users are never given a specific legal reference, and many who we work with are mystified as to how their content was deemed “terrorist.”

The DSA would require companies to explain in their Terms of Service, in clear and unambiguous language, the “policies, procedures, measures and tools used for the purpose of content moderation, including algorithmic decision-making and human review.” Currently, this information is not easily accessible or understandable on most social media platforms.

The IMCO version of the DSA expands on this: it also specifically requires that transparency reporting cover “the use of automated tools.” Social media platforms will have to report on their use of automation for content moderation, specify the precise purposes of that automation, and explain how they determine whether it is accurate. This provision is ripe for intervention by civil society once such reports are being made. We’ve asked many times for information about how the faulty machine-learning processes companies use to detect “terrorist and violent extremist content” are corrected when civil society groups such as Mnemonic work to get content restored. This added element of DSA reporting, if it survives trilogues, will give us much more leverage.

The IMCO version includes another important addition: it requires platforms to notify users when content has been “demoted” or subjected to other measures. This would punch a hole in the problem of so-called shadow banning, the demotion or suppression of content from specific users. This version would also require explanations for other measures, such as “interstitials”, screens a user has to click through before accessing content. A common example is a “graphic content” warning that a user has to click through to see an image or video. While these measures may avoid total removal of content, they still impact reach. This would be the first time companies would be forced to explain themselves on this level. Currently, most platforms deal with content in ways that fall short of takedowns without giving users any notice.

Special requirements for “VLOPs”

The DSA creates a separate set of rules for “Very large online platforms” (VLOPs), like YouTube, that have 45 million or more monthly active users in the EU. Under these rules VLOPs will be obliged to conduct risk assessments (Article 26), to mitigate those risks (Article 27), and, excitingly, to undergo independent audits (Article 28). The risk assessments under Article 26 focus on threats to “private and family life, freedom of expression and information, the prohibition of discrimination and the rights of the child.”

Under Article 28, external and independent audits will assess compliance with the DSA’s due diligence obligations. How this provision plays out will be particularly interesting because civil society organisations, including Mnemonic, have been pushing for this kind of oversight for years. Notably, Facebook conducted a “civil rights audit” at the behest of US civil society organisations. But audits require a very specific rubric, including clear standards against which practices are being audited, and such standards have not always existed. This provision may help push for the creation of clearer professional auditing standards for platforms.

The IMCO version again strengthened these provisions significantly. It encourages companies to consult “representatives of groups potentially impacted by their services, independent experts and civil society organisations” and makes clear that the risk assessment must address many specific issues (which have been brought up by civil society over the years) including enforcement of community standards, use of algorithms, and more. It fleshes out the risk mitigation and audit requirements, making them more specific and providing guidelines for determining whether an organisation is an appropriate auditor. The committee version also requires platforms to “ensure auditors have access to all relevant data necessary to perform the audit properly.” Considering Facebook’s lack of candor with its own Oversight Board, this is an important provision to include.

Other standout provisions

The IMCO version protects anonymity: “Providers of intermediary services shall not require recipients of the service other than traders to make their legal identity public in order to use the service.” It also includes an important requirement to include in transparency reporting “the complete number of content moderators allocated for each official language per Member State, and a qualitative description of whether and how automated tools for content moderation are used in each official language.”

How these provisions could be interpreted outside the EU

Mnemonic is part of the DSA Human Rights Alliance because “these standards will influence platforms’ operations far beyond the Union.” That doesn’t have to be a bad thing. Unlike some of the particularly harmful ideas to come out of the EU (NetzDG continues to be the standout there), the provisions of the DSA outlined above are likely to have positive rather than negative ripple effects. For example, the GDPR inspired the subsequent California Consumer Privacy Act. Perhaps these provisions will inspire state-level legislation in the US. They could also encourage countries in the Global Majority to push companies to conduct appropriate assessments of their impact in those markets and to operate in all the languages of those markets.

It’s not all good

We share many of the concerns expressed by our allies at Access Now and EFF. In particular, we’re concerned about ongoing pushes to remove content as quickly as possible without judicial intervention, potential increased retention of people’s personal data and law enforcement access to that data, the unclear “trusted flagger” provision, and the requirement to appoint a legal representative in country. The last of these directly replicates requirements that repressive governments in India, Turkey, and elsewhere have passed and used to threaten platform representatives with jail time for not complying with unlawful takedown requests.

What’s next

This week’s vote finalised the IMCO position. Plenary will vote on this proposal in January, and the approved text will then become Parliament’s mandate for negotiations. Although it may seem like the DSA is almost a done deal, that’s far from the case. The text could still change in trilogues, and unfortunately not only could we lose the progress we have made, the text could actually get worse.

That is why we will be active every step of the way. Our focus is on ensuring that human rights are respected, pushing for meaningful oversight of companies, and preventing provisions that would lead to increased surveillance of human rights defenders or increased removal of human rights documentation.


Mnemonic is an NGO dedicated to archiving, investigating and memorialising digital information documenting human rights violations and international crimes. Mnemonic also provides trainings, conducts research, engages in content moderation advocacy, and develops tools to support advocacy, justice and accountability.
