Facebook Clashes with US Authorities Over Vaccine Misinformation


It seems Facebook is on a collision course with the U.S. government once again, this time over the role it may or may not play in amplifying misinformation about COVID-19 vaccines, which has been identified as a major obstacle on the nation's path to recovery from the pandemic.

When US President Joe Biden was asked directly about vaccine misinformation on Facebook on Friday, he replied that platforms like Facebook are “killing people” by allowing vaccine conspiracy theories to spread.

Biden’s comment came a day after the White House confirmed that it is in regular contact with social media platforms to ensure they are aware of the latest narratives that pose a public health risk.

According to White House Press Secretary Jen Psaki:

“We are working to engage with them to better understand the enforcement of social media platform policies.”

In response to Biden’s remarks, Facebook immediately went on the offensive, with a Facebook spokesperson telling ABC News that it “will not be distracted by allegations that are not supported by the facts”.

Facebook followed up today with an official response, in a post titled “Moving Past the Finger Pointing”.

“At a time when COVID-19 cases are rising in America, the Biden administration has chosen to blame a handful of American social media companies. While social media plays an important role in society, it is clear that we need a whole-of-society approach to end this pandemic. And facts – not allegations – should help inform that effort. The fact is that vaccine acceptance among Facebook users in the US has increased. These and other facts tell a very different story to the one the administration has been promoting in recent days.”

The post goes on to highlight various studies suggesting that Facebook’s efforts to combat vaccine hesitancy are working, and that, contrary to Biden’s remarks, Facebook users are, if anything, less resistant to vaccination efforts.

This is largely in line with Facebook’s broader stance of late – that, based on academic research, there is currently no definitive association between Facebook sharing and increased vaccine hesitancy, nor, in a similar vein, any direct link between Facebook usage and political polarization, despite ongoing claims.

In recent months, Facebook has taken a more proactive approach to dispelling these ideas, arguing that polarizing and extremist content is actually bad for its business, despite suggestions that it benefits from the engagement such posts generate.

According to Facebook:

“All social media platforms, including but not limited to ours, reflect what is happening in society and what is on people’s minds at any given moment. This includes the good, the bad, and the ugly. For example, in the weeks leading up to the World Cup, posts about football will naturally increase – not because we have programmed our algorithms to show people content about football, but because that’s what people are thinking about. And just like politics, football strikes a deep emotional chord with people. How they react – the good, the bad, and the ugly – is reflected on social media.”

Facebook’s Vice President of Global Affairs Nick Clegg took a similar angle back in March, in a post outlining how the News Feed is shaped by the interaction between people and the platform – meaning the platform itself cannot bear the full blame:

“The goal is to make sure you see what you find most meaningful – not to keep you glued to your smartphone for hours on end. Think of it like a spam filter in your inbox: it helps filter out content you won’t find useful or relevant, and prioritizes content you will.”

Clegg further notes that Facebook actively reduces the distribution of sensational and misleading content, as well as posts found to be false by its independent fact-checking partners.

“For example, Facebook demotes clickbait (headlines that are misleading or exaggerated), highly sensational health claims (such as those promoting ‘miracle cures’), and engagement bait (posts that explicitly try to goad users into interacting with them).”

Clegg also says that Facebook has acted against its own business interests on this front, most notably by implementing a change to the News Feed algorithm back in 2018 that prioritizes updates from your friends, family, and the groups you’re part of over content from the Pages you follow.

According to Facebook, then, it doesn’t benefit from sensational content and fringe conspiracy theories – and in fact, it goes to great lengths to penalize such material.

However, despite these claims, and references to inconclusive academic papers and internal studies, the broader evidence does not support Facebook’s stance.

Earlier this week, The New York Times reported that Facebook has been working to change the way its own data analytics platform operates, in order to restrict public access to insights showing that far-right posts and misinformation outperform more balanced coverage and reporting.

The controversy stems from a Twitter account created by Times reporter Kevin Roose, which uses CrowdTangle data to publish a daily list of the ten Facebook posts with the most interactions.

The top-performing link posts from US Facebook pages in the past 24 hours were from:

1. For America
2. Taunton Daily Gazette
3. Ben Shapiro
4. Ben Shapiro
5. Sean Hannity
6. Nelly
7. Ben Shapiro
8. Newsmax
9. Dan Bongino
10. Ben Shapiro

– Facebook’s Top 10 (@FacebooksTop10) July 14, 2021

Far-right pages consistently dominate the list, which is why Facebook has previously sought to explain that the metrics used to compile it are flawed, and therefore not indicative of the actual engagement and popularity of posts.

According to the NYT report, Facebook has gone even further internally, with employees looking for ways to change the data displayed in CrowdTangle in order to avoid such comparisons.

That didn’t go to plan:

“Several executives proposed making reach data public on CrowdTangle, in the hope that reporters would cite that data instead of the engagement data they believed made Facebook look bad. But [Brandon] Silverman, CrowdTangle’s CEO, responded in an email that the CrowdTangle team had already tested such a feature and found problems with it. One problem was that false and misleading news stories also appeared at the top of those lists.”

No matter how Facebook wants to spin it, these types of posts still gain traction, which shows that even with the aforementioned updates and processes designed to limit such sharing, this remains the kind of content that sees the most engagement on the social network.

Which, it could be argued, is more of a human problem than a Facebook problem. But with 2.8 billion users, giving it more content-amplification potential than any other platform in history, Facebook needs to take responsibility for the role it plays in the process, and for the role it may play in amplifying the impact of such content during a pandemic, when vaccine fear could take an immeasurable toll on the world.

It seems fairly clear that Facebook is playing a significant role in this. And when you consider that around 70% of Americans now get at least some of their news content from Facebook, the app has become a source of truth for many, informing their actions, including their political attitudes, their civic understanding and, yes, their opinion of public health advice.

Heck, even flat earth theories have gained momentum in the modern age, underlining the power of anti-science movements. And while you can’t definitively say that Facebook is responsible when someone posts a random video of flat earthers trying to prove their theory, such content is likely to gain traction because of its divisive, sensational nature – like this clip, for example:

Videos like this draw believers and skeptics alike, and while many of the comments are critical, as far as Facebook’s algorithm is concerned, it all counts as engagement.

Hence, even your derisive remarks help make such material more prominent – and the more people comment, the more momentum such posts gain.

Eight out of ten people might dismiss such theories as total junk, but two could take the opportunity to dig deeper. Multiply that by the number of views these videos attract, and that’s a huge potential impact that Facebook enables on this front.
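To put purely illustrative numbers on that: if a single clip draws a million views, and even that two-in-ten share is nudged to dig deeper, that’s 1,000,000 × 0.2 = 200,000 people pulled a step closer to the theory from one video alone.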

And these types of posts definitely do gain traction. A 2018 MIT study found that false reports on Twitter are 70% more likely to be retweeted than true ones, while further research into the motivations behind such activity has shown that the need for belonging and community can also cement groups around lies and misinformation, as a psychological response.

There is also another key element – the changing nature of media distribution itself.

As Yale University social psychologist William J. Brady recently put it:

“When you post things [on social media], you are very aware of the feedback you get, the social feedback in terms of likes and shares. So when misinformation appeals to social impulses more than the truth does, it gets more exposure online, which means people feel rewarded and encouraged to spread it.”

This shift, which gives each person their own personal motivation to share certain content, has changed the paradigm for content reach, diluting the influence of the publications themselves in favor of algorithms – which, in turn, respond to people and their need for affirmation and approval.

Share a post that says “vaccines are safe” and probably nobody cares; share a post that says “vaccines are dangerous” and people take notice, and you get a stream of notifications for all the likes, shares, and comments, which trigger your dopamine receptors and make you feel like you’re part of something bigger, something more – that your voice matters in the wider landscape.

Hence, Facebook is right to point to human nature, rather than its own systems, as the culprit. But it and other platforms have given people the medium; they provide the means to share; they built the incentives that keep people posting.

And the more time people spend on Facebook, the better it is for Facebook’s business.

There’s no arguing that Facebook doesn’t benefit in this regard – and as such, it’s in the company’s interest to turn a blind eye and pretend there are no problems with its systems, or with the role they play in amplifying such movements.

But there is a problem, and the US government is right to examine this element more closely.
