Misinformation is thriving on Facebook, researchers have found. A new study shows that publishers who post misinformation draw far greater engagement than reliable sources: six times the shares, likes, and other interactions, the Washington Post reports. The peer-reviewed study covered the election period of August 2020 to January 2021. Its findings "add to the growing body of evidence that, despite a variety of mitigation efforts, misinformation has found a comfortable home—and an engaged audience—on Facebook," said Rebekah Tromble, director of the Institute for Data, Democracy and Politics at George Washington University, who reviewed the findings.
Facebook disputes that interpretation, saying that the number of people who engage with content, which is what the researchers counted, is different from the number of people who see it. "When you look at the content that gets the most reach across Facebook, it is not at all like what this study suggests," spokesman Joe Osborne said. However, the company doesn't share with researchers data on how many times content appears on users' screens. The study, produced by New York University and France's Université Grenoble Alpes, examined the Facebook pages of more than 2,500 news publishers, per the Verge.
Misinformation on both far-left and far-right sites enjoyed the advantage over posts by mainstream news outlets or institutions such as the World Health Organization, researchers found, but there's more of it on far-right sites. Other research, including Facebook's own, had reached that conclusion before, and the new study confirms it. There's reason to think the content can be influential, per the Post: a recent survey showed that Americans who got their news from Facebook were less likely to be vaccinated against the coronavirus than any other type of news consumer. The study will be presented at the Internet Measurement Conference in November but could be made public sooner. (More Facebook stories.)