Meta has been reducing the distribution of political content on Facebook since last year. Now the company has taken a new step, announcing yet another change to the news feed algorithm. The feature, implemented after a series of tests, will affect users in the US and Brazil.
The move is part of an effort that began in February 2021. At the time, CEO Mark Zuckerberg stressed in a press release that “people don’t want political content to prevail in their news feeds.” The company then began working to meet that demand.
“Our first step will be to temporarily reduce the distribution of political content in the news feed to a small percentage of people in Canada, Brazil and Indonesia this week, and the United States in the coming weeks,” they said at the time. The measure aimed to assess the different ways to classify political content in people’s feeds.
Since then, several measures have been implemented. Then, on May 24, 2022, the social network revealed a new experiment: reducing the weight of comments and shares in the algorithm’s ranking of political content. Users who took part in the test said they “saw less content in their feeds that they did not find valuable.”
Testing has since been expanded globally. In Brazil, the change is now a reality for everyone: this Thursday (7), the platform’s parent company said it will place “less emphasis on comments and shares to determine the distribution of political content on Facebook in the country.” The measure is already in effect.
Meta did not go into much detail. But the company told Poder360 (in Portuguese) that other signals will be weighed when distributing political content: “who posted the content and when; if you’ve interacted with that person, page or group in the past; and if the post is a photo, video or link, among many others.”
Meta reduced the distribution of political content in the feed
The action is the result of an effort to reduce the amount of political content. Note, however, that the focus is on decreasing distribution, not banning it. Meta itself said last year that it is not removing this information.
“Our goal is to preserve people’s ability to find and engage with political content on Facebook, while respecting their interest in what they want to see at the top of their News Feed,” they stated in February 2021.
All this caution has an explanation. According to the parent company’s analysis in the United States, “political content represents only 6% of what people see on Facebook.” But even though the proportion is small and the feed is personalized, the social network knows that “even a small percentage of political content can impact someone’s experience in general.”
Facebook tests have reached new heights since 2021
After the announcement, Meta updated the publication with new information about the process. On August 21, 2021, for example, the company said it had seen positive results from the first experiment announced in February, and decided to expand it to other countries: Costa Rica, Spain, Ireland and Sweden.
“We also identified that some engagement signals can better indicate which posts people find more valuable than others,” they said at the time. “Based on this feedback, we are gradually expanding some tests to put less emphasis on signals like how likely someone is to comment on or share political content.”
At the time, the social network said it placed more emphasis on “new signals about how likely people are to provide negative feedback to posts about political topics and current events when we rank these types of content in the feed.” The company added that “these changes will affect public affairs content more widely and that publishers may see an impact on their traffic.”
The experiment continued to expand to other countries on October 13. Then, on December 9, tests of the political content ranking were extended to Facebook Watch. Finally, on May 24, the social network began assessing the impact of reducing the weight of comments and shares in the distribution of political content, the change implemented in Brazil this Thursday (7).