Misinformation on Facebook gets way more engagement than news, study says

A US study shows misinformation got six times more engagement than factual news

Update: 05 Sep 2021, 12:15 AM

Facebook’s algorithms fuel the spread of misinformation over trustworthy sources, according to a recent study of user behaviour on the platform.

The forthcoming peer-reviewed study by researchers at New York University and the Université Grenoble Alpes in France found that from August 2020 to January 2021, news publishers known for putting out misinformation got six times as many likes, shares, and interactions on the platform as trustworthy news sources, such as CNN or the World Health Organization, reports The Washington Post.

According to the study, publishers who traffic in misinformation have been able to build major audiences ever since “fake news” on Facebook became a public concern following the 2016 US presidential election.

Citing experts, the report said that the NYU study is one of the few comprehensive attempts to measure and isolate the effect of misinformation across a wide group of publishers on Facebook, and that it shows the platform rewards publishers that put out misleading content.

“The study helps add to the growing body of evidence that, despite a variety of mitigation efforts, misinformation has found a comfortable home — and an engaged audience — on Facebook,” the Washington Post quoted Rebekah Tromble, director of the Institute for Data, Democracy and Politics at George Washington University, who reviewed the study’s findings.

Responding to her comment, Facebook said that the report measured the number of people who engage with content, but that engagement is not a measure of the number of people who actually view it (the latter number, called impressions, is not publicly available to researchers).

“This report looks mostly at how people engage with content, which should not be confused with how many people actually see it on Facebook,” Facebook spokesman Joe Osborne told the Washington Post. “When you look at the content that gets the most reach across Facebook, it is not at all like what this study suggests.”

He added that the company has 80 fact-checking partners covering over 60 languages that work to label and reduce the distribution of false information.


The study’s authors relied on categorizations from two organizations that study misinformation, NewsGuard and Media Bias/Fact Check.

Both groups categorized thousands of Facebook publishers by their political leanings, ranging from far left to far right, and by their inclination to share trustworthy or untrustworthy news. 

The team then compared the interactions on posts from pages known for misinformation with those on posts from trustworthy publishers.

The researchers also discovered that the misinformation boost is politically neutral — misinformation-trafficking pages on both the far left and the far right generated much more engagement from users than factual pages of any political leaning.

The study also found that publishers with right-wing inclinations have a much higher propensity to share misleading information than publishers in other political categories.

According to the Washington Post, the finding echoes other researchers’ conclusions, as well as Facebook’s own internal findings ahead of the 2018 US midterm elections.

Facebook’s critics have long alleged that misleading, inflammatory content that reinforces the viewpoints of its viewers garners significantly more attention and clicks than mainstream news.

Conspiracy theories about Covid-19 and vaccines, along with misleading information about treatments and cures, have also gone viral.

A recent survey by the COVID States Project found that US Facebook users were less likely to be vaccinated than any other type of news consumer.

US President Joe Biden went so far as to say that Covid-related misinformation on platforms such as Facebook was “killing people,” a comment he later walked back.

However, there has been little hard data to back up the assertions about the harm caused by Facebook’s algorithms, in part because Facebook has limited the data that researchers can access, according to Tromble.

The Washington Post reports that Facebook is also increasingly restricting access for outside groups that attempt to mine the company’s data.

In the past several months, the White House has repeatedly asked Facebook for information about the extent of Covid misinformation on the platform, but the company did not provide it.

Facebook also clamped down on Laura Edelson, the NYU researcher who conducted the study, says the Washington Post report.

The company cut off the accounts of Edelson and her colleagues last month, arguing that her data collection potentially put Facebook in violation of a 2019 US Federal Trade Commission privacy settlement.

In a rare rebuttal, the commission shot back, saying that the settlement makes exceptions for researchers and that Facebook should not use it as an excuse to deny the public the ability to understand people’s behaviour on social networks.

In response to criticism that it is becoming less transparent, Facebook recently published a new transparency report that shows the most popular content on the platform every quarter.

The report, however, was highly curated, and Facebook had shelved an earlier version of it out of concern that it would generate bad press, the Washington Post reported, citing a person familiar with the matter.

Tromble also said that one of the reasons it is hard to tell how much exposure people have to misinformation on Facebook in particular is because so much content is shared in private groups.

Edelson said the study showed that Facebook’s algorithms were not rewarding partisanship or bias, but were amplifying misinformation because it does well with users.
