Despite years of criticism that Facebook promotes extreme political polarization, new research suggests the problem may not stem solely from its algorithm.
In four studies published in the journals Science and Nature, researchers from Princeton University, Dartmouth College, and the University of Texas collaborated with Meta to examine the impact of social media on democracy and the 2020 presidential election.
The authors, who obtained direct access to certain Facebook and Instagram data for their research, depict a vast social network where users tend to seek news and information that aligns with their existing beliefs. Therefore, the issue of echo chambers arises not only from the company’s recommendation algorithms but also from the content users actively search for.
In one of the studies in Science, the researchers demonstrated the impact of exposing Facebook and Instagram users to content through a chronological feed instead of an algorithm-powered feed.
Over the three-month study period, this change “did not significantly alter levels of issue polarization, affective polarization, political knowledge, or other key attitudes,” the authors stated.
In another Science article, researchers noted that “Facebook, as a social and informational setting, is substantially segregated ideologically — far more than previous research on internet news consumption based on browsing behavior has found.”
In each of the new studies, the authors stated that Meta was involved with the research, but the company did not provide payment, and the researchers had the freedom to publish their findings without interference.
One study published in Nature analyzed the concept of echo chambers on social media. It focused on a subset of over 20,000 adult Facebook users in the U.S. who opted into the research over a three-month period surrounding the 2020 presidential election.
The authors found that the average Facebook user receives approximately half of their content from sources that share their beliefs. When the researchers diversified the content these users were exposed to, the users’ views did not change significantly.
“These results are not consistent with the worst fears about echo chambers,” the authors concluded. “However, the data clearly indicate that Facebook users are much more likely to see content from like-minded sources than they are to see content from cross-cutting sources.”
The researchers agree that Facebook has a polarization problem, but the question remains whether the algorithm intensifies it.
One of the Science papers found that regarding news, “both algorithmic and social amplification play a part” in driving a wedge between conservatives and liberals, resulting in “increasing ideological segregation.”
“Sources favored by conservative audiences were more prevalent on Facebook’s news ecosystem than those favored by liberals,” stated the authors, who added that “most sources of misinformation are favored by conservative audiences.”
In an accompanying editorial, Science’s editor-in-chief Holden Thorp mentioned that the data from the studies demonstrate that “the news fed to liberals by the engagement algorithms was very different from that given to conservatives, which was more politically homogeneous.”
Thorp further commented that “Facebook may have already done such an effective job of getting users addicted to feeds that satisfy their desires that they are already segregated beyond alteration.”
After facing years of criticism for spreading misinformation during past U.S. elections, Meta attempted to present the results in a positive light.
In a blog post, Nick Clegg, Meta’s president of global affairs, stated that the studies “provide new insights on the claim that the way content is surfaced on social media — and by Meta’s algorithms specifically — keeps people divided.”
Clegg wrote, “Although questions about social media’s impact on key political attitudes, beliefs, and behaviors are not fully settled, the experimental findings add to a growing body of research showing there is little evidence that key features of Meta’s platforms alone cause harmful ‘affective’ polarization or have meaningful effects on these outcomes.”
However, some authors involved in the studies acknowledged the need for further research on the recommendation algorithms of Facebook and Instagram and their societal impact. The studies were based on data gathered during a specific, short time frame coinciding with the 2020 presidential election, and additional research may unveil more details.
Stephan Lewandowsky, a psychologist at the University of Bristol, reviewed the findings and provided commentary as part of Science’s package. Although not directly involved in the studies, he described them as “huge experiments” that demonstrate “that you can change people’s information diet, but you’re not going to immediately move the needle on these other things.”
Lewandowsky pointed out that Meta’s participation in the study could influence how people interpret the findings, stating, “What they did with these papers is not complete independence.”