Meta’s Election Research Raises Concerns and Sparks Further Inquiry

In the lead-up to the 2020 presidential election, Meta conducted a series of ambitious studies on the effects of its platforms, Facebook and Instagram, on the political beliefs of US-based users. Independent researchers from multiple universities were granted unprecedented access to Meta’s data and given the authority to modify the feeds of thousands of people to observe their behavior.

Although the researchers were not paid by Meta, the company was pleased with the results, which were published in four papers across Nature and Science. According to Nick Clegg, Meta’s president of global affairs, the experimental findings add to a growing body of research showing that Meta’s platforms alone do not cause harmful “affective” polarization and have no meaningful effect on political attitudes and behavior.

Sweeping as that conclusion sounds, the studies come with significant limitations. Although the researchers gained deeper insight into Meta’s platforms than ever before — the company had long deemed the data too sensitive to release publicly — the studies raised as many questions as they answered.

The studies focused on the three months leading up to the 2020 presidential election. While that window is longer than most researchers are granted, it cannot capture a user’s overall experience on the platforms. Andrew Guess, one of the researchers and an assistant professor at Princeton, said during a press briefing that it is unclear whether the results would hold if the experiments ran for a year or two. The studies also do not account for the fact that many users have had Facebook and Instagram accounts for over a decade.

The timing introduces another potential bias: the study period coincided with the run-up to an election marked by intense political polarization. Michael Wagner, a professor at the University of Wisconsin-Madison who oversaw Meta’s 2020 election project, notes that researchers still need to determine whether these effects hold outside of an election context.

Meta’s Clegg also suggested that the research undercuts the belief that resharing content on social media drives polarization. The researchers’ own findings are less clear-cut. One study published in Science found that resharing promotes “content from untrustworthy sources.” The same study, based on an analysis of approximately 208 million users, showed that most of the misinformation flagged by the platform’s third-party fact checkers circulated almost exclusively among conservative users, with no equivalent concentration on the opposing side of the political spectrum.
