Facebook Tests New User Content Controls for News Feed, New Restrictions for Ad Placement

Amid the ongoing debate over the potential negative impacts of Facebook’s News Feed algorithm on broader user behavior, Facebook is testing a range of new control options for both users and advertisers, giving people more influence over what they see in the app, and helping brands avoid unwanted associations via their ad placement.

First off, Facebook wants to make its existing News Feed control options easier to find, while also giving users the capacity to reduce certain types of content in their feeds.

As explained by Facebook:

“As part of this, people can now increase or decrease the amount of content they see from the friends, family, Groups and Pages they’re connected to, and the topics they care about, in their News Feed preferences.”

Facebook’s News Feed settings give you more control over what you’re shown in your feed, enabling you to select favorite profiles, which are then given higher priority in your feed, unfollow Pages, people and topics, and snooze certain users and Pages, all from a single listing.

Soon, this listing will include even more control options, with the capacity to increase or decrease how much content you see from each element, though it’s not yet clear exactly how that will work.

It could be a great way to give people more control over their feed, but its value will depend, of course, on how many people actually use it, and whether they’re given a clear reason to do so.

In this sense, updates like these are generally a win-win for The Social Network: they put the onus on users by giving them more control, while Facebook itself knows that many won’t bother, which largely preserves the status quo. There isn’t much more it can do in this respect, but hopefully, with this new push, Facebook will put more effort into prompting people to use these controls, maximizing adoption and awareness of the tools.

Algorithmic amplification was one of the key concerns highlighted by Facebook whistleblower Frances Haugen in her various statements about the platform’s negative impacts. Haugen has told the US Senate that social networks should be forced, through reform of Section 230 laws, to abandon engagement-based algorithms altogether.

As explained by Haugen:

“Facebook [has] publicly admitted that engagement-based ranking is dangerous without integrity and security systems, but has not implemented those integrity and security systems in most of the world’s languages. It is tearing families apart. And in places like Ethiopia, it is literally fueling ethnic violence.”

Haugen’s contention is that these algorithms incentivize negative behavior in order to drive more engagement, and as she notes, that is causing significant harm in various regions, including the US.

It’s difficult to quantify the real impact in this respect, but it seems fairly clear that Facebook’s algorithms have shifted public discourse by amplifying divisive and argumentative content, which then leads to more angst and argument.

Providing user controls to limit this impact could be a positive step, but we’ll have to wait and see how Facebook implements it. Facebook says it will begin testing the new control options with “a small percentage of people, which will be gradually expanded over the coming weeks”.

In addition, Facebook is expanding its topic exclusion controls for News Feed ads to a limited number of advertisers running ads in English.

“Topic exclusion controls allow an advertiser to select a topic to help define how we show their ad on Facebook, including News Feed. Advertisers can choose from three topics: News and Politics, Social Issues, and Crime and Tragedy. If an advertiser selects one or more topics, their ad won’t be delivered to people who recently engaged with those topics in their News Feed.”

This essentially enables advertisers to avoid unwanted association with these topics and the related discussion, which could be a good way for Facebook to reassure brands that they won’t suffer any negative impacts by advertising in the app.

Facebook says the exclusions performed well in early testing, ensuring that ads weren’t shown alongside such discussions in the app.

Again, given the wider debate about the impact of negative interactions on the platform, it makes sense for Facebook to offer more controls that help users improve their experience based on their own expectations and interests, while also providing greater security for brands.

Of course, if research shows that such changes have positive overall effects, you’d ideally hope Facebook would look to reduce these negative elements more broadly, but that’s another aspect it will need to examine, and one it may even be forced to address if regulators take up Frances Haugen’s recommendations.
