Appeals Centre Europe: Dublin-headquartered social media watchdog to begin operations

A new industry-funded, Dublin-headquartered body to oversee social media content starts operations today.


The Appeals Centre Europe (ACE), staffed by 25 people, will make decisions on appeals from European users over major social media platforms' decisions to take down or leave up pieces of content, including posts, videos, photos and comments. Lodging an appeal – at appealscentre.eu – costs €5, which is refundable if the watchdog agrees with the complainant. However, the service will initially apply only to Facebook, TikTok and YouTube – not Instagram or X. It will handle disputes ranging from bullying and harassment to misinformation, hate speech and altered images and videos.

Decisions will be made within 90 days, although a spokesperson for the organisation said that “in most cases, it will be quicker than that”. There will be no hierarchy for expedited decisions, the spokesperson said. “Once a dispute is submitted, it will be evaluated by an expert team who will then seek further information from the social media platform in question,” the spokesperson added.

The organisation, which has been certified by Ireland’s Coimisiún na Meán, says it will accept submissions in six languages from the start – English, French, German, Dutch, Italian and Spanish. Platforms are not obliged to accept the decisions, but must engage with them “in good faith”, according to the provisions of the EU’s Digital Services Act. The Appeals Centre will be mostly funded through a €95 fee charged to social media companies for each case.

If the centre decides in the platform's favour, it keeps the user's €5 fee, reducing the platform's cost for that case to €90. "New EU laws are bringing new powers to users of social media, and that's what the Appeals Centre provides," Thomas Hughes, CEO of the Appeals Centre, said. "While platforms often have the right rules in place, the sheer number of posts means they don't always apply them correctly.

And when platforms make mistakes, users pay the price. Journalists’ reports are removed just for naming terrorist groups. Posts showing breast cancer symptoms are taken down, despite exceptions for raising awareness in this area.

And in other cases, policy-violating hate speech is left up, as people game the system to avoid detection." The organisation will have a board of seven non-executive directors, including four "with no relationship to any social media company or entity funded by a social media company".