When victims face disinformation or online hate speech today, they have little hope of redress. They are more likely to look away – closing their browser windows or leaving online spaces altogether – than to seek justice. Even when confronted with illegal content such as defamation or harassment – situations in which legal grounds to go to court exist – few have the financial or emotional resources to hire a lawyer and take on a powerful platform.
Europe’s Digital Services Act (DSA) changes this equation.
It represents the first sustained attempt to modernize rules for online content in a democratic and rule-of-law environment. While preserving free speech, the DSA reinforces access to justice in content moderation decisions, imposing on platforms a variety of scaffolded transparency and accountability requirements.
Among other changes, platforms must open an “internal complaint handling system” where users can submit complaints about content moderation decisions, both in cases of over-moderation (takedowns) and under-moderation (when platforms decide not to act on reported content). When a complaint is well founded, the platform must reverse its decision.
While many platforms already have procedures for reporting content and appealing content moderation decisions, their formulation, scope, and accountability vary, and our research shows that all have ample room for improvement. Disinformation and hate speech proliferate, while it is prohibitively challenging for people to appeal wrongful content moderation decisions.
Consider LGBTQ+ hate. In its most recent report on ‘LGBTI-phobies’, the French NGO SOS Homophobie found once again that online environments remain the most prevalent spaces for anti-LGBTQ+ sentiment. At the same time, the NGO notes a decline in reporting, which it attributes to victims’ and witnesses’ frustration with ineffective moderation and the insufficient resources devoted to answering their complaints.
Content moderation is particularly lacking in languages other than English. During her testimony before the US Congress, Facebook whistle-blower Frances Haugen revealed that 87% of Facebook’s misinformation spending goes to English-language content, even though only about 9% of its users are English speakers. Even widely spoken languages like Spanish are neglected. In March 2021, a coalition of organizations launched the campaign #YaBastaFacebook, calling on Facebook to better protect Spanish speakers.
Facebook isn’t alone. Spanish YouTube and TikTok influencer Naim Darrechi recently claimed in a viral video that he had legally become a woman by filling out a simple form and that he could change his legal sex every six months to increase his welfare payments. This is false: Spanish law offers no such right. Even now, this misogynistic and transphobic disinformation remains available online.
In such cases – or when initial reporting and appeals processes fail – the DSA will provide the option of “out-of-court dispute settlement.” Independent bodies will be established in different European countries to handle content moderation disputes. This option will be especially useful for ‘lawful but awful’ content such as LGBTQ+ hate, much of which could not be brought to court under national laws. At the same time, these bodies’ decisions will not be legally binding, so the justice system can still have the final say.
Another DSA innovation allows representation. Specialized organizations will be able to exercise individuals’ rights under the DSA on their behalf. This will be critical, since victims and witnesses of hateful content may hesitate to appeal content moderation decisions themselves, given the emotional toll or trauma they have already suffered.
The DSA is not just about Europe. It’s reasonable to expect that it will raise the bar globally: once a platform improves redress mechanisms in Europe, it would be logical to harmonize these policies and implement them for all their users.
The DSA is set to apply to all online platforms in 2024 – and even earlier to so-called Very Large Online Platforms, those with more than 45 million European users. This gives platforms plenty of time to design and implement accessible, intuitive appeals systems – and to begin offering Europeans real access to justice online.
Claire Pershan is Policy Coordinator for EU DisinfoLab, an independent non-profit organization focused on tackling disinformation.