28/5/2024

Trouble in the Fediverse: a look at the moderation limitations of decentralized social networks.

Some time ago, when the Twitter ship began to sink, a new group of platforms appeared on the scene and seemed, at the time, like the promised land: federated or decentralized social networks. Mastodon welcomed many early defectors from the social network Elon Musk had just bought, and for a moment it was believed that this movement would be the first step toward a new phase of online interaction.

A lot has happened since then: the initial enthusiasm for Mastodon waned, and other major players appeared, Bluesky and especially Threads, the first bet by an industry heavyweight on this kind of social network. Now, with a little more distance, we can take a closer look at this new type of platform and the moderation and governance challenges that lie ahead.

Federated or decentralized social networks bring together platforms connected through shared protocols. One of the most successful so far is ActivityPub, the protocol on which Mastodon is built and which Meta is already testing in order to integrate Threads into the Fediverse.
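
To make the idea of a shared protocol concrete, the sketch below shows, in rough form, the kind of JSON object ActivityPub servers exchange when a user publishes a post: a "Create" activity wrapping a "Note". The instance domain and account name are hypothetical, and real implementations add more fields (identifiers, timestamps, cryptographic signatures).

```python
# Minimal sketch (not a full implementation) of the kind of payload
# ActivityPub servers exchange: a "Create" activity wrapping a "Note".
# The instance domain and account name below are hypothetical.
import json

activity = {
    "@context": "https://www.w3.org/ns/activitystreams",
    "type": "Create",
    "actor": "https://example-instance.social/users/alice",
    "to": ["https://www.w3.org/ns/activitystreams#Public"],
    "object": {
        "type": "Note",
        "attributedTo": "https://example-instance.social/users/alice",
        "content": "Hello, Fediverse!",
    },
}

# In practice, the sending server delivers this payload with a signed
# HTTP POST to the inbox of each recipient's server; any server that
# speaks the same protocol can receive and display the post.
print(json.dumps(activity, indent=2))
```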

Many of them were born as non-profit projects, promising alternative governance structures in which users have greater autonomy over the content they access and how it is curated. Posts, and the rules that moderate them, depend more on the will and interests of users than on algorithms and guidelines defined by a company.

Unlike traditional social networks, these platforms have no single space that manages and distributes all content. Instead, content is spread across different servers which, because they share the same protocol, can interact with one another.

In federated networks, users can host their accounts on their own servers or on those of third parties, which gives them the ability to set moderation rules according to their preferences.
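
To illustrate what those per-server rules amount to, here is a minimal, hypothetical sketch of how one instance might filter incoming activities against its own blocklist; another instance speaking the same protocol could apply entirely different rules. The function and domain names are invented for illustration.

```python
# Minimal sketch of server-level moderation in a federated network:
# each instance keeps its own rules and applies them to incoming
# activities. Domain names and policy labels are hypothetical.
from urllib.parse import urlparse

# Rules chosen by this particular server's administrators.
BLOCKED_DOMAINS = {"spam-instance.example"}    # full defederation
LIMITED_DOMAINS = {"noisy-instance.example"}   # hidden unless followed

def accept_incoming(activity: dict) -> str:
    """Decide what to do with an activity received from another server."""
    sender_domain = urlparse(activity["actor"]).hostname
    if sender_domain in BLOCKED_DOMAINS:
        return "reject"   # never shown to local users
    if sender_domain in LIMITED_DOMAINS:
        return "limit"    # shown only to users who follow the author
    return "accept"

# A different instance running the same code could load a completely
# different set of rules: the protocol is shared, the policy is not.
print(accept_incoming({"actor": "https://spam-instance.example/users/bot"}))
```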

In a recent report, researchers Yoel Roth, former head of Trust & Safety at Twitter, and Samantha Lai analyzed the main limitations of systems such as Mastodon or Bluesky in responding to threats and controlling influence or spam operations.

According to the research, these platforms face considerable obstacles to strengthening their governance schemes, such as insufficient moderation technology and the lack of a sustainable financial model that would allow them to build up trust and safety teams.

The decentralized spirit of these platforms is also a limitation when it comes to acting against harmful content or behavior that aims to manipulate the networks, since there is usually no channel through which the administrators of one server can learn about threats detected on another.

In the case of Mastodon and Bluesky, for example, moderators do not have the ability to block URLs that point to harmful sites, which limits their capacity to cut off traffic to spam or scam pages.
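
For contrast, the hypothetical sketch below shows what a URL denylist check of that kind might look like on a platform that does support it; the domains and function are invented for illustration.

```python
# Hypothetical sketch of the URL-blocking step the researchers describe
# as missing: before a post is accepted, its links are checked against
# a denylist of known spam or scam domains. Names and domains are invented.
import re
from urllib.parse import urlparse

KNOWN_SCAM_DOMAINS = {"crypto-giveaway.example", "fake-login.example"}
URL_PATTERN = re.compile(r"https?://\S+")

def contains_blocked_link(post_text: str) -> bool:
    """Return True if the post links to a domain on the denylist."""
    for url in URL_PATTERN.findall(post_text):
        domain = urlparse(url).hostname or ""
        if domain in KNOWN_SCAM_DOMAINS:
            return True
    return False

print(contains_blocked_link("Free coins at https://crypto-giveaway.example/win"))  # True
```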

According to a survey conducted as part of the research, Fediverse moderators reported that they lack formal guidelines or training for their work. Some even reported experiencing burnout from managing the content on their servers, with no way to seek help or guidance from higher up.

Federated platforms also lack well-established transparency practices: there is little information available on how they apply their own moderation standards. Although regulations such as the European Union's Digital Services Act already impose such obligations on technology companies, the small size of many of these networks still exempts them.

It is possible that, as these platforms grow in number of users, they will land squarely on regulators' radar and face more pressure from civil society to keep these spaces safe. It is also possible that such growth will give them the resources to cover their trust and safety needs.

For now, the researchers suggest a series of measures that could improve the quality of moderation in these spaces, among them systems that enable coordinated, institutional responses to spam and platform manipulation, and investment in open-access tools.

Given the financial model of these platforms, Roth and Lai suggest that the latter could be addressed through a central entity that manages resources, maintains the tools, and connects developers with funders willing to invest in these spaces.

Since March of this year, Meta has been running pilot tests that let some Threads users share their posts on other Fediverse servers. There is still no date for a possible full integration, but given the company's size, this could be the starting signal for a different digital ecosystem. For that same reason, it will be key from now on to track the trust and safety measures that these and other networks adopt to make the Fediverse a safe space.