“Transparency is an important part of everything we do at Facebook,” reads the first line of a first-quarter Content Transparency Report which Facebook later decided not to share with the public.
They've now changed their mind and released that report. The Hill summarizes its findings:
Facebook said that an article about a doctor who passed away two weeks after getting a coronavirus vaccine was the [#1] top-performing link on the social media platform in the U.S. from January to March, according to a report released Saturday… [The Washington Post adds that this article “was promoted on Facebook by several anti-vaccine groups”.] According to Facebook’s report, the article was viewed over 53 million times…
In addition, a website pushing coronavirus misinformation was one of the top 20 most visited sites on the platform, according to The Washington Post.
Specifically, the Post calls that top-20 site “a right-wing anti-China publication that has promoted the violent QAnon conspiracy theories and misleading claims of voter fraud related to the 2020 election.”
Facebook had considered sharing the 100 most popular items in their newsfeed, the Post adds, but “The problem was that they feared what they might find…”
The disclosure reflects the challenge of being open with the public at a time when the social network is being attacked by the White House as well as experts for fomenting the spread of health misinformation. Previously, the company had only shared how much covid-related misinformation it had removed, and it has been careful not to acknowledge up to this point what role it has played in disseminating material that misled the public about the virus and the vaccine. For months, executives have debated releasing both this report and other information, according to a person familiar with the company’s thinking. In those debates, the conversations revolved around whether releasing certain data points was likely to help or hurt the company’s already-battered public image. In numerous instances, the company held back on investigating information that appeared negative, the person said…
Facebook’s leadership has long felt that skepticism about any subject, including vaccines, should not be censored in a society that allows robust public debate… The challenge is that certain factual stories that might cast doubt on vaccines are often promoted and skewed by people and groups opposed to them. The result is that factual information can become part of an ideological campaign. Facebook has been slow to remove or block some of the leading anti-vaccine figures who spread such ideas.
Some observations about Facebook’s report:
It covers only public content in the News Feed, so it presumably fails to account for any misinformation that's shared only with a group's members.
The report acknowledges that nearly 20% of posts in the News Feed come from a Group the user has joined, and that more than 1 in every 17 content views in the News Feed is recommended by Facebook's algorithms.