1/3/2024

Collective verification or online truth arbitration? A look at X Community Notes

In recent years, social media companies have found an important ally in fact-checking organizations and agencies to mitigate misinformation on their platforms. However, this strategy has limitations, including these actors' limited capacity to keep up with the volume and reach of misleading content online.

Moreover, the design of these programs has, in some cases, exposed the journalists and fact-checkers who perform this work to online harassment and threats. For example, during the Covid-19 pandemic, Meta's protocols for working hand in hand with fact-checking organizations led to fact-checkers being subjected to harassment campaigns and threats.

In search of an alternative system, X (formerly Twitter) recently introduced "Community Notes," a collaborative mechanism that departs from traditional verification methods. The initiative, originally conceived by Jack Dorsey, founder and former CEO of the company, was launched in 2021 as a pilot project in the United States under the name "Birdwatch." Its purpose was to allow users to flag potentially misleading content and add notes clarifying a post's context. Under Elon Musk's tenure, the program was expanded globally and renamed "Community Notes."

How do Community Notes work?

X's system relies on volunteer users, called "contributors," to add context to posts they believe may be misleading. Requirements to become a contributor include not having recently violated X's rules, having an account at least six months old, and having a verified phone number. All contributors start with the ability to rate how helpful they find existing notes and, over time, may earn the ability to write notes themselves.
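As a rough illustration only, the following Python sketch models the entry requirements and the rating-before-writing progression described above; the class name, the numeric threshold, and the rating_impact field are assumptions made for the example, not X's actual implementation.

# Minimal sketch of the contributor rules described above (assumed names and thresholds).
from dataclasses import dataclass
from datetime import date, timedelta

@dataclass
class Contributor:
    joined_on: date
    phone_verified: bool
    recent_rule_violation: bool
    rating_impact: float = 0.0  # earned by rating notes that later prove (un)helpful

    def is_eligible(self, today: date) -> bool:
        """Entry requirements: account age, verified phone, no recent rule violations."""
        return (
            not self.recent_rule_violation
            and self.phone_verified
            and today - self.joined_on >= timedelta(days=182)  # roughly six months
        )

    def can_write_notes(self) -> bool:
        """Contributors start by rating notes; writing unlocks after enough rating impact."""
        WRITE_THRESHOLD = 5.0  # assumed value, for illustration only
        return self.rating_impact >= WRITE_THRESHOLD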

A note is not published simply because many contributors rate it helpful; those who rate it must also hold different points of view. "If people who do not usually agree on the usefulness of notes agree that a given note is useful, this is probably a good indicator that the note will be useful to people with different points of view," the platform states.
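X's production system infers contributors' viewpoints from their rating history using a matrix-factorization model; the toy function below, offered only as an illustration, captures the core idea quoted above: a note counts as helpful only when raters from groups that usually disagree both find it helpful. The explicit group labels and the thresholds are assumptions made for this example.

# Toy sketch of the "bridging" idea described above (assumed labels and thresholds).
from collections import defaultdict

def note_is_helpful(ratings, min_ratings_per_group=2, min_helpful_share=0.7):
    """ratings: list of (rater_group, rated_helpful) pairs, e.g. ("A", True)."""
    by_group = defaultdict(list)
    for group, helpful in ratings:
        by_group[group].append(helpful)
    # Require support from at least two distinct viewpoint groups.
    if len(by_group) < 2:
        return False
    # Every participating group must rate enough times and mostly find the note helpful.
    for votes in by_group.values():
        if len(votes) < min_ratings_per_group or sum(votes) / len(votes) < min_helpful_share:
            return False
    return True

# Helpful across both groups -> surfaced; helpful in only one group -> not surfaced.
print(note_is_helpful([("A", True), ("A", True), ("B", True), ("B", True)]))    # True
print(note_is_helpful([("A", True), ("A", True), ("B", False), ("B", False)]))  # False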

It is important to note that "Community Notes" are themselves subject to reporting if they contravene the platform's policies. X also allows the authors of posts to request additional reviews from contributors, particularly if they believe that a note marked as "useful" on their post does not provide relevant context or should not appear there.

How well does this strategy work?

Although Elon Musk, the owner of X, described this mechanism as a radical change to improve reliability on the platform, a recent study by Valerie Wirtschafter and Sharanya Majumder, researchers at the Stanford Internet Observatory, casts doubt on whether the model would remain effective at scale if other social networks sought to emulate it.

According to the study, more than 80% of "Community Notes" contributors have never written a note that was rated "useful." Furthermore, of the 52,000 notes evaluated in the research, only 7% met the criterion of being useful. The researchers identify two main obstacles: advances in artificial intelligence, which make false or manipulated content harder to detect, and the difficulty of reaching consensus among contributors in a polarized information environment.

Likewise, this collaborative model is not free of vulnerabilities that can lead to the publication of biased notes, as a recent incident shows. On February 15, U.S. Senator Richard Blumenthal posted a message on X highlighting the strong bipartisan support for the Kids Online Safety Act, a bill to protect minors online that digital rights organizations have criticized for its potential risks to users' privacy and freedom of expression.

Three days later, "Community Notes" contributors added a note to Senator Blumenthal's post describing the bill as a "Trojan horse for Internet censorship." Beyond the diverse opinions that the discussion of a bill may provoke, the incident highlights these models' capacity to speak with a certain degree of authority and take sides in political debates.

Although X's "Community Notes" aim to democratize data verification, their effectiveness is still limited by the propensity for bias and the difficulty in reaching consensus among collaborating users. While the system's ability to verify information in real time is plausible, additional strategies are needed to ensure its objectivity and prevent the model from becoming a form of online truth arbitration.