Facebook Expands Climate Science Center to More Regions, Ramps Up Climate Misinformation Detection

Facebook is taking stronger action to advance climate science and combat related misinformation on its platforms as part of a renewed push for broader, more inclusive global efforts to address the growing climate crisis.

As explained by Facebook:

Climate change is the greatest threat we all face – and the need for action is growing every day. The science is clear and unambiguous. As leaders, advocates, environmental groups and others meet in Glasgow this week at COP26, we want bold action to be agreed, with the strongest commitments to hit net zero targets that will help keep warming to 1.5°C.

Facebook has repeatedly been identified as a leading source of climate misinformation, so it clearly has a role to play in this respect. But with this renewed stance, the company is trying to set clearer parameters around what is acceptable, and what it will take action on, in order to contribute to a wider push.

First, Facebook is expanding its Climate Science Center to more than 100 countries, while adding a new section that shows each nation’s greenhouse gas emissions measured against its commitments and goals.

Facebook first launched its Climate Science Center in September last year to provide users with more accurate climate information. The information in the Center comes directly from leading authorities in the field, including the Intergovernmental Panel on Climate Change, the UN Environment Programme and others.

The per-nation tracking data adds a further layer of accountability, which could put pressure on each country to meet its commitments through wider coverage and awareness of its progress.

Facebook is also expanding its informational labels on climate change posts, which direct users to the Climate Science Center for more information on related topics and updates.

Facebook climate misinformation labels

Facebook is also taking further measures to combat climate misinformation around the COP26 climate summit:

Prior to COP26, we activated a feature that we use during critical public events, which uses keyword detection to make related content easier for fact-checkers to find – speed is particularly important during such events. This feature is available to fact-checkers for content in English, Spanish, Portuguese, Indonesian, German, French and Dutch.

That does raise the question of why Facebook wouldn’t run this process all the time, but presumably it’s a more labor-intensive approach that can only be sustained for short periods.
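Facebook hasn’t detailed how the keyword-recognition feature works, but conceptually it amounts to matching posts against a curated list of topic terms and routing the matches to a fact-checking queue. The sketch below is purely illustrative; the keyword list, post structure and function names are assumptions, not Facebook’s actual system.

```python
import re
from dataclasses import dataclass

# Illustrative only: the keyword list and data shapes here are assumptions,
# not Facebook's actual fact-checking pipeline.
CLIMATE_KEYWORDS = [
    "climate change", "global warming", "net zero",
    "cop26", "greenhouse gas", "carbon emissions",
]

# One compiled pattern that matches any keyword on word boundaries.
KEYWORD_PATTERN = re.compile(
    r"\b(?:" + "|".join(re.escape(k) for k in CLIMATE_KEYWORDS) + r")\b",
    re.IGNORECASE,
)

# Languages the (hypothetical) fact-checking queue covers, per the quote above.
SUPPORTED_LANGUAGES = {"en", "es", "pt", "id", "de", "fr", "nl"}


@dataclass
class Post:
    post_id: str
    text: str
    language: str


def flag_for_fact_checkers(posts):
    """Return posts that mention climate keywords in a supported language."""
    return [
        post for post in posts
        if post.language in SUPPORTED_LANGUAGES
        and KEYWORD_PATTERN.search(post.text)
    ]


if __name__ == "__main__":
    queue = flag_for_fact_checkers([
        Post("1", "COP26 leaders discuss net zero targets", "en"),
        Post("2", "Photos from my weekend trip", "en"),
    ])
    print([p.post_id for p in queue])  # -> ['1']
```

In practice, any such system would involve far more sophisticated, multilingual matching and human review, but the basic flagging logic would look something like this.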

Combating such claims as they spread (Facebook notes that climate misinformation ‘increases periodically as the discussion about climate change intensifies’) should help mitigate their impact, and blunt some of the amplifying network effects that come with Facebook’s scale.

Finally, Facebook also says it is working to improve its own internal operations and processes in line with emissions targets.

Since last year, we have achieved net zero emissions for our global operations, and we are supported by 100% renewable energy. To get there, we have reduced our greenhouse gas emissions by 94% since 2017, and we invest enough in wind and solar energy to cover our entire operations. For the remaining emissions, we support projects that remove emissions from the atmosphere.

The next step on the operational side is to work with suppliers that are also targeting net zero, which, once fully implemented, would cover the climate impact of Facebook’s broader business operations.

Facebook’s record in this regard is patchy, not because of its own initiatives or efforts as such, but because of the way controversial content can be amplified by the News Feed algorithm, which inadvertently incentivizes users to share more divisive, controversial and anti-mainstream views in order to win attention and engagement in the app.

This is a major flaw in Facebook’s systems, one that Facebook itself has repeatedly, if indirectly, acknowledged. Its argument is that the reason this type of content gains increased attention on the platform isn’t Facebook itself, but human nature, and people’s tendency to share and engage with topics that resonate with them. In Facebook’s framing, this is a human problem, not a Facebook problem.

As Facebook’s Nick Clegg recently stated on a related issue, the broader political divide:

The rise in political polarization in the US predates social media by several decades. If it were true that Facebook is a main cause of polarization, we would expect to see it rising wherever Facebook is popular. It isn’t. In fact, polarization has declined in a number of countries with high social media use, while it has risen in the US.

So, in Clegg’s telling, it isn’t Facebook that’s the problem, but the fact that people now have more opportunities to discuss and engage with divisive topics, with Facebook simply facilitating a larger share of that conversation.

But that lets Facebook off the hook a little too easily. A key issue is the incentive structure Facebook has built around likes and comments, and the dopamine hit people get from them. That gives users a reason to share more controversial content, because it triggers more notifications and raises their profile, so there is an inherent process within Facebook that drives this type of behavior, whether or not Facebook itself wants to acknowledge it.

So it is important that Facebook takes action. The real question, however, is how effective such countermeasures can actually be, especially at Facebook’s scale.
