Facebook fake news fueled by six-fold click bias over factual content
A fresh analysis of the spread of misinformation on Facebook has quantified the pernicious capacity of salacious and agenda-driven content to drown out factual reporting.
During the 2020 US election, so-called fake news generated six times as many clicks as factual content, spreading like wildfire across the platform, with right-wing publishers producing a greater proportion of unsubstantiated content than their left-leaning counterparts.
Misinformation has long been known to draw itchy index fingers toward sensationalist headlines, but a behavioral study of Facebook users conducted by researchers at New York University and the Université Grenoble Alpes in France has now quantified this effect for the first time.
The peer-reviewed findings are likely to reignite controversies surrounding a cantankerous campaign in which both Democrats and Republicans accused Facebook’s algorithms of fueling discord and rancor among voters. At the height of campaigning, peddlers of misinformation were generating six times as many clicks as authoritative sources such as CNN or the World Health Organization.
The study, first reported by The Washington Post, analyzed 2,551 pages categorized by NewsGuard and Media Bias/Fact Check according to where they sit on the political spectrum, as well as their predilection for sharing dubious stories. Interactions with notorious founts of misinformation such as Occupy Democrats and Breitbart were then compared with those of more staid, factual publishers.
Researchers duly found that misinformation of whatever political flavor generated much more interest than dry factual pages, with right-leaning sources being somewhat more prolific.
Responding to the report, Facebook did not dispute the findings but stressed that the figures represented engagement and not total page impressions, which are not divulged to researchers. Spokesman Joe Osborne explained: “This report looks mostly at how people engage with content, which should not be confused with how many people see it on Facebook. When you look at the content that gets the most reach across Facebook, it is not at all like what this study suggests.”
A damning study released in the spring into the proliferation of hate speech on Facebook found that abusive, threatening and prejudicial language accounted for six out of every 10,000 content views.