New Research Examines Echo Chambers and Political Attitudes on Social Media
What is the role of social media in shaping our political attitudes? New research published in Nature sets out to understand whether and how the information people see on social media shapes their political views. Entitled “Like-minded Sources on Facebook Are Prevalent but Not Polarizing,” this groundbreaking research uses an on-platform experiment to examine what happens when Facebook users see dramatically less content from people who share their political leanings.
The lead researchers, Professors Brendan Nyhan of Dartmouth College, Jaime Settle of William & Mary, Emily Thorson of Syracuse University and Magdalena Wojcieszak of the University of California, Davis, ran a three-month study in 2020 that reduced the volume of content from politically like-minded sources in the Feeds of consenting participants.
The researchers found that the majority of Facebook users’ News Feeds consists of posts from politically like-minded sources, while political information and news represent only a small fraction of their feeds.
In addition to decreasing exposure to content from like-minded sources, the experimental intervention also resulted in a decrease in exposure to uncivil language and an increase in exposure to posts from sources with politically dissimilar views.
However, the researchers found that these changes to a person’s Facebook feed had no impact on a variety of beliefs and attitudes, including affective polarization, ideological extremity, and beliefs in false claims.
“These results underscore how hard it is to change political opinions,” said Emily Thorson, an assistant professor of political science in the Maxwell School at Syracuse University. “In addition, it’s important to emphasize that social media still comprises a relatively small part of most people’s information diets. As a result, even drastic changes to what they see on platforms may not have downstream effects on their attitudes.” Thorson’s research focuses on political misperceptions and political knowledge.
These findings are part of a broader research project examining the role of social media in U.S. democracy. Known as the U.S. 2020 Facebook and Instagram Study, the project is the first of its kind providing social media scientists with access to social media data that previously has been largely inaccessible.
Seventeen academics from U.S. colleges and universities, including Syracuse University, teamed up with Meta to conduct independent research on what people see on social media and how it affects them. The project built in several safeguards to protect the researchers’ independence. All the studies were preregistered, and Meta could not restrict or censor the findings. The academic lead authors had final authority on all writing and research decisions.
The research for “Like-minded Sources on Facebook Are Prevalent but Not Polarizing” was divided into two parts.
From June to September 2020, the researchers measured how often all adult Facebook users saw content from politically aligned sources. The results showed that for the median Facebook user, slightly over half the content they saw was from politically like-minded sources, and just 14.7% was from sources with different political leanings.
From September to December 2020, the researchers conducted a multi-wave experiment with 23,377 consenting adult users of Facebook in the U.S. The study reduced the volume of content from like-minded sources to gauge the effect on political attitudes. People in the treatment group saw about one-third less content from like-minded sources. As a result, their total engagement with content from like-minded sources decreased, but their rate of engagement increased: when they did see content from like-minded sources, they were more likely to click on it. This pattern illustrates how human behavior can compensate for algorithmic changes.
Additional studies that are part of this project include “Asymmetric ideological segregation in exposure to political news on Facebook,” “How do social media feed algorithms affect attitudes and behavior in an election campaign?” and “Reshares on Social Media Amplify Political News but Do Not Detectably Affect Beliefs or Opinions.”