Facebook’s parent company Meta has launched a new initiative designed to combat the spread of misinformation online: a fact-checking mentorship program, developed in partnership with the Poynter Institute’s International Fact-Checking Network (IFCN), which aims to help fact-checking organizations scale their efforts and increase their impact.

As explained by Meta:

“Reducing the spread of misinformation is a challenge that no single organization can tackle alone. Strong partnerships with subject matter experts and sharing information on best practices plays a big role in effectively addressing misinformation. That’s why, as part of this global mentorship program, the nonpartisan IFCN will select up to 6 experts from the fact-checking industry to serve as mentors for up to 30 organizations in Meta’s third-party fact-checking program.”

Meta will allocate $450,000 in funding to the initiative, which will help improve fact-checking processes through shared education, with a specific focus on helping more organizations in more regions to tackle harmful trends.

Facebook, in particular, has come under intense scrutiny over the role that it plays in facilitating the spread of misinformation, with the recent Facebook Files leak once again underlining the impact that its News Feed algorithm can have on content amplification, through varied incentives and engagement-driving mechanisms that can help to surface more controversial content.

Meta has denied that its platform is responsible for increased division, with Nick Clegg, Meta’s Vice-President for Global Affairs and Communications, explaining that:

“The increase in political polarization in the US pre-dates social media by several decades. If it were true that Facebook is the chief cause of polarization, we would expect to see it going up wherever Facebook is popular. It isn’t. In fact, polarization has gone down in a number of countries with high social media use at the same time that it has risen in the US.” 

Still, it’s difficult to deny that Facebook contributes to polarization, especially when you look at the top ten most-shared link posts in the app each day.

Facebook’s algorithms are driven by engagement, and the content that generates the most engagement is that which triggers an emotional response, with anger and joy being the most powerful reactions in this respect.

You can see, then, how Facebook’s own systems may be inherently built around driving such engagement, as a business benefit. That’s why any effort to combat misinformation is important: we all have different opinions, but reports that misdirect those opinions with lies and falsehoods only do harm to broader democracy.

As such, this new initiative will play an important role, and as Meta continues to push Facebook and Instagram use in more regions, it will also need to pair that expansion with increased efforts to tackle localized misinformation.
