Q4 2025 marked a significant increase in the number of cases handled by Vigilia, across many categories: reporting content that violates the terms & conditions that platforms (pretend to) enforce, reporting content that is illegal in the EU, and fighting to restore users’ posts taken down without valid reason.
Q4 2025 statistics
- Number of reports made to the platforms: 2,182
- Number of times the platforms agreed with our assessment upon first review: 208
- Number of cases taken to arbitration: 1,781
- Number of times the platform recognized its mistake upon notification of the opening of arbitration proceedings: 296
- Number of cases ruled in Vigilia’s favor: 356
- Number of cases ruled in the platform’s favor: 67
- Number of cases still pending: 1,057
- Overall ‘win’ rate for Vigilia (on cases for which a decision was finalized): 89.2%
- Overall ‘good flag’ rate for Vigilia (cases won in arbitration + notices directly actioned by the platforms before arbitration): 92.8%
- Platforms with which disputes were opened (by descending number of cases): TikTok, Instagram, LinkedIn, Facebook, Meta Ads, YouTube, Pinterest, X/Twitter
- Number of certified out-of-court dispute settlement bodies with which Vigilia opened disputes: 5
- Languages of disputed content: French, English, German, Spanish, Italian, Romanian, Slovak, Russian, Dutch, Arabic
- Top 3 dispute categories (descending order): scams and impersonation, hateful content, illegal content
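As a sanity check, the ‘good flag’ rate can be reproduced from the figures above. Note that the denominator used here (all cases with a final outcome) is our assumption about how the rate is defined, not something stated explicitly in the report:

```python
# Reproducing the Q4 2025 'good flag' rate from the published figures.
# Denominator definition is an assumption: all cases with a final outcome.

agreed_on_first_review = 208   # platforms agreed with our assessment upon first review
fixed_on_notification = 296    # platforms conceded when arbitration proceedings opened
won_in_arbitration = 356       # cases ruled in Vigilia's favor
lost_in_arbitration = 67       # cases ruled in the platform's favor

# 'Good flag': cases won in arbitration plus notices directly actioned
# by the platforms before arbitration.
good_flags = agreed_on_first_review + fixed_on_notification + won_in_arbitration
good_flag_rate = 100 * good_flags / (good_flags + lost_in_arbitration)
print(f"{good_flag_rate:.1f}%")  # → 92.8%
```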
See this companion post for a selection of cases we found to be particularly interesting.
Ecosystem recommendations
Platforms
- Vigilia’s high success rate suggests that something is fundamentally broken with platforms’ systems for handling violative content.
- The ease with which violative content is found by a small civil society organization like ours is worrisome in the context of platforms’ obligations to mitigate systemic risks. If we can do it, we presume that they should be able to as well.
- Most (but not all) platforms make it difficult to report content (caps on the number of reports made by an account, absence of unique report identifiers, no possibility to bulk-export reports, many overlapping steps in the reporting flow…), resulting in human-intensive reporting and therefore costs for reporting organizations. Because those costs will ultimately be borne by the platforms in cases where the user wins the arbitration, it appears to be in every party’s interest to streamline reporting processes.
- Some platforms, chief among them X/Twitter, do not currently accept cases from out-of-court dispute settlement bodies, effectively leaving users without recourse when they disagree with a platform’s decision. Given the many well-documented moderation issues with X/Twitter (the latest being the Grok nudifier scandal, which includes child sexual abuse material), this is particularly worrisome. Vigilia encourages all platforms to cooperate with out-of-court dispute settlement bodies in good faith.
Out-of-court dispute settlement bodies
- Standardization of processes and evidence requirements would be welcome. There are currently small variations in the information each ODS body requires to open a case (e.g. whether a username on the platform must be provided, whether a screenshot of the platform’s decision is required…). Taken together, these small differences across ODS bodies amount to significant friction that makes scaling the ODS mechanism as a whole more difficult. Because the DSA applies equally to all ODS bodies, we would expect the operational requirements to be the same across ODS bodies, and we encourage ODS bodies to converge towards a common set of procedures.
- Further, we encourage ODS bodies to explore the possibility of creating a common technical infrastructure (or, at a minimum, a common set of technical standards) to facilitate the sending and processing of cases.
- We encourage ODS bodies to create a shared venue for discussing edge cases. Vigilia’s experience is that, in the vast majority of cases, different ODS bodies interpret the applicable rules in the same way when the specifics of a case are similar. A mechanism for discussing edge cases would, over time, help ODS bodies converge on a harmonized interpretation of the applicable rules, avoiding forum shopping and ensuring fairness for all parties (platforms, users, ODS bodies).
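To make the standardization suggestion above concrete, a common technical standard could start as an agreed case-submission schema. The sketch below is purely illustrative: the field names are our assumptions and are not taken from any existing ODS body’s intake form or API.

```python
from dataclasses import dataclass, asdict
import json

# Hypothetical minimal schema for an ODS case submission.
# All field names are illustrative assumptions, not an existing standard.
@dataclass
class CaseSubmission:
    platform: str               # e.g. "TikTok"
    content_url: str            # link to the disputed content
    platform_report_id: str     # unique report identifier, where the platform provides one
    platform_decision: str      # e.g. "kept_up" or "taken_down"
    decision_screenshot: str    # reference to evidence of the platform's decision
    complainant_statement: str  # why the user disputes the decision

case = CaseSubmission(
    platform="ExamplePlatform",
    content_url="https://example.com/post/123",
    platform_report_id="r-0001",
    platform_decision="kept_up",
    decision_screenshot="evidence/decision.png",
    complainant_statement="The post violates the platform's scam policy.",
)
print(json.dumps(asdict(case), indent=2))
```

A schema like this would let reporting organizations export cases once and submit them to any ODS body, addressing both the bulk-export and the divergent-evidence-requirements friction noted above.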