Meta’s algorithms reveal that America’s political polarization lacks a straightforward solution

The algorithms Facebook and Instagram use to deliver content to users have long been criticized for amplifying misinformation and political polarization. However, a series of groundbreaking studies published in Science and Nature suggests that addressing these problems is not as simple as modifying the platforms’ software.

The four research papers shed light on the political echo chambers on Facebook, where conservatives and liberals rely on different sources of information, interact with opposing groups, and consume differing amounts of misinformation.

Algorithms are the automated systems that social media platforms use to suggest content based on a user’s past activity, such as the groups, friends, topics, and headlines the user has clicked on before. While these systems excel at keeping users engaged, they have been criticized for amplifying misinformation and ideological content, thereby deepening the country’s political divisions.
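
To make that mechanism concrete, here is a minimal, hypothetical sketch of engagement-based feed ranking in Python. It is not Meta’s actual system, whose details are proprietary; the Post structure, click counts, and scoring function are invented purely for illustration.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    topic: str
    timestamp: float  # Unix time the post was created

# Hypothetical engagement history: how often this user has clicked
# on posts about each topic and on posts from each friend.
topic_clicks = {"politics": 42, "sports": 3}
friend_clicks = {"alice": 17, "bob": 1}

def engagement_score(post: Post) -> float:
    """Score a post by the user's past engagement with its topic and author."""
    return topic_clicks.get(post.topic, 0) + friend_clicks.get(post.author, 0)

def ranked_feed(posts: list[Post]) -> list[Post]:
    # Surface the posts the user is most likely to engage with first,
    # which keeps users on the platform but can also narrow what they see.
    return sorted(posts, key=engagement_score, reverse=True)
```

Even this toy version shows why such systems hold attention: content resembling what a user already clicks on is pushed to the top of the feed.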

Regulating these algorithms has been widely discussed as a way to curb the spread of misinformation and the promotion of polarization on social media. However, when researchers changed the algorithms for some users during the 2020 election, they saw little change in those users’ political attitudes.

“We find that algorithms have a significant influence on people’s experiences on the platform and that there is a notable ideological segregation in political news exposure,” said Talia Jomini Stroud, director of the Center for Media Engagement at the University of Texas at Austin and one of the leaders of the studies. “We also find that popular proposals to change social media algorithms did not alter political attitudes.”

While political differences are natural in a healthy democracy, polarization occurs when these differences start to separate citizens and erode societal bonds. It can undermine trust in democratic institutions and the free press.

Severe polarization can lead to “affective polarization” in which citizens view each other as enemies rather than legitimate opposition. This situation can escalate into violence, as shown by the attack on the U.S. Capitol by supporters of then-President Donald Trump on January 6, 2021.

The researchers obtained unprecedented access to Facebook and Instagram data from the 2020 election through a collaboration with Meta, the platforms’ owner. Meta had no control over the findings.

When the algorithm was replaced with a simple chronological listing of posts from friends, an option Facebook recently made available to users, the change had no measurable effect on polarization. Disabling Facebook’s reshare option, which lets users quickly spread viral posts, meant users saw significantly less news from untrustworthy sources and less political news overall, but their political attitudes did not change significantly.
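
The chronological alternative tested in the studies amounts to replacing that engagement ranking with a reverse-time sort. A sketch, reusing the hypothetical Post structure from the earlier example:

```python
def chronological_feed(posts: list[Post]) -> list[Post]:
    # Ignore predicted engagement entirely: newest posts from friends first.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)
```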

Similarly, reducing the content that Facebook users saw from accounts with the same ideological alignment did not have a significant effect on polarization, susceptibility to misinformation, or extremist views.

These findings suggest that Facebook users actively seek out content that aligns with their views, and algorithms facilitate this process by making it easier for users to do so, according to David Lazer, a professor at Northeastern University who worked on all four papers.

Removing the ranking algorithm altogether did reduce the time users spent on Facebook and Instagram while increasing their time on rival platforms such as TikTok and YouTube, a sign of how important these systems are to Meta in the competitive social media landscape.

In response to the papers, Meta’s president for global affairs, Nick Clegg, stated that the findings showed “there is little evidence that key features of Meta’s platforms alone cause harmful ‘affective’ polarization or have any meaningful impact on key political attitudes, beliefs, or behaviors.”

Katie Harbath, Facebook’s former director of public policy, said the findings highlighted the need for further research on social media and challenged assumptions about its role in American democracy. Harbath was not involved in the research.

The research also uncovered the extent of ideological differences among Facebook users and the different ways conservatives and liberals use the platform to access news and political information.

Conservative Facebook users are more likely to consume content that fact-checkers have labeled misinformation. They also have a wider selection of like-minded sources: the analysis found that among the websites linked in political Facebook posts, far more cater to conservatives than to liberals.

The authors of the papers acknowledged some limitations to their work. The study covered only a few months around the 2020 election, so it cannot measure the long-term effects of algorithms that have been shaping users’ feeds for years. They also noted that most people get news and information from many sources beyond social media, including television, radio, other websites, and word of mouth, all of which can shape their opinions. The news media in the United States has itself been widely criticized for deepening polarization.

The researchers analyzed data from millions of Facebook and Instagram users and surveyed specific users who agreed to participate. All identifying information about individual users was removed for privacy reasons.

Lazer, the Northeastern professor, initially doubted that Meta would grant the researchers the necessary access, but he was pleasantly surprised. He stated that the conditions imposed by the company were related to reasonable legal and privacy concerns. More studies resulting from this collaboration will be released in the coming months.

“This research is one of a kind,” Lazer said. “There has been a lot of rhetoric about this, but the research has been quite limited in many ways.”
