The powerful algorithms used by Facebook and Instagram have increasingly been criticized for amplifying misinformation and political polarization. However, a series of groundbreaking studies published on Thursday suggests that addressing these challenges will require more than just changing the platforms’ software.
The research papers, published in Science and Nature, also reveal the extent of political echo chambers on Facebook: conservatives and liberals rely on different sources of information, interact with different groups, and consume different amounts of misinformation.
In collaboration with Meta, the researchers analyzed data from millions of Facebook and Instagram users during the 2020 US presidential election and surveyed users who agreed to participate.
One area of investigation focused on the algorithms used in social media feeds and how they impact voters’ attitudes and behavior. While algorithms are effective at keeping users engaged, they have been criticized for exacerbating political divisions and spreading misinformation. Regulating these systems is a popular suggestion for addressing these issues.
However, when the researchers changed the algorithms for some users during the 2020 election, they observed little difference in the outcomes.
Talia Jomini Stroud, director of the Center for Media Engagement at the University of Texas at Austin and one of the leaders of the studies, stated, “We find that algorithms are extremely influential in people’s on-platform experiences and there is significant ideological segregation in political news exposure. We also find that popular proposals to change social media algorithms did not sway political attitudes.”
Changing the algorithm to a simple chronological listing of posts or turning off Facebook’s reshare option had no measurable impact on polarization. Reducing the content from ideologically aligned accounts also did not significantly affect polarization or susceptibility to misinformation.
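To make the feed manipulation concrete, the difference between the default and experimental conditions amounts to changing the sort key of the feed. The sketch below is a minimal illustration, not Meta’s actual system: the Post fields and engagement score are assumed placeholders for the many signals a real ranking model combines.

```python
from dataclasses import dataclass
from datetime import datetime

# Hypothetical post record; the fields are illustrative assumptions,
# not Meta's actual data model.
@dataclass
class Post:
    author: str
    created_at: datetime
    engagement_score: float  # placeholder for a learned ranking signal

def ranked_feed(posts: list[Post]) -> list[Post]:
    # Default condition: order posts by predicted engagement.
    return sorted(posts, key=lambda p: p.engagement_score, reverse=True)

def chronological_feed(posts: list[Post]) -> list[Post]:
    # Experimental condition: order posts newest-first, ignoring engagement.
    return sorted(posts, key=lambda p: p.created_at, reverse=True)
```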
These findings suggest that Facebook users actively seek out content that aligns with their views, and the algorithms make it easier for them to do so. Removing the algorithm altogether resulted in users spending less time on Facebook and Instagram and more time on other social media platforms like TikTok and YouTube.
The research also documented the extent of ideological differences among Facebook users and how conservatives and liberals use the platform to consume political news. Conservative users are more likely to consume content labeled as misinformation, and more of the platform’s political news sources cater to conservative audiences.
The research has limitations: it covered only a few months around the 2020 election, and people’s opinions are shaped by many sources beyond social media.
While the findings challenge assumptions about the role of social media in American democracy, critics argue that social media companies should still take responsibility for combating misinformation.
More studies from the collaboration with Meta are forthcoming.