How to stop misinformation on social media
Syracuse University Professor Jennifer Stromer-Galley has been studying social media since before it was called social media. Five years ago, she laid out a simple three-point plan to help stem the tide of misinformation on Facebook. Those three recommendations remain relevant today, after a former Facebook employee revealed internal documents indicating that the company misrepresented its progress against hate, violence and misinformation on its platform.
Stromer-Galley’s plan, outlined in the piece “Three ways Facebook could reduce fake news without resorting to censorship,” published by The Conversation, offered these three recommendations for fighting misinformation.
Option 1: Nudging
“One option Facebook could adopt involves using existing lists identifying prescreened reliable and fake-news sites. The site could then alert those who want to share a troublesome article that its source is questionable.”
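For readers who want a concrete picture of the nudging idea, here is a minimal illustrative sketch: before a post is shared, the article's domain is checked against prescreened lists of reliable and questionable sources, and a warning is shown when it matches the latter. The list entries and function names below are hypothetical examples, not Facebook's actual systems.

```python
# Sketch of the "nudge" option: warn a user before they share an
# article from a source flagged as questionable. The source lists
# here are illustrative placeholders, not real ratings.
from urllib.parse import urlparse

RELIABLE_SOURCES = {"apnews.com", "reuters.com"}   # example entries
QUESTIONABLE_SOURCES = {"example-fake-news.com"}   # example entry

def nudge_message(article_url: str):
    """Return a warning string if the source is questionable, else None."""
    domain = urlparse(article_url).netloc.lower()
    if domain.startswith("www."):
        domain = domain[4:]
    if domain in QUESTIONABLE_SOURCES:
        return f"Heads up: {domain} has been flagged as a questionable source."
    return None

print(nudge_message("https://www.example-fake-news.com/shocking-story"))
```

The key design point is that the nudge informs rather than blocks: sharing is still possible, so the approach stops short of censorship.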
Option 2: Crowdsourcing
“Facebook could also use the power of crowdsourcing to help evaluate news sources and indicate when news that is being shared has been evaluated and rated. One important challenge with fake news is that it plays to how our brains are wired. We have mental shortcuts, called cognitive biases, that help us make decisions when we don’t have quite enough information (we never do), or quite enough time (we never do). Generally, these shortcuts work well for us as we make decisions on everything from which route to drive to work to what car to buy. But, occasionally, they fail us. Falling for fake news is one of those instances.”
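The crowdsourcing option can be sketched just as simply: pool many users' reliability ratings for a source and surface a label once enough ratings accumulate. The thresholds and labels below are assumptions made for illustration only.

```python
# Sketch of crowdsourced source evaluation: aggregate many users'
# 1 (unreliable) to 5 (reliable) ratings and show a community label.
# MIN_RATINGS and the score cutoffs are illustrative assumptions.
from statistics import mean

MIN_RATINGS = 50  # withhold a label until the crowd is large enough

def crowd_label(ratings):
    """ratings: list of 1-5 scores from individual users."""
    if len(ratings) < MIN_RATINGS:
        return "not yet rated"
    avg = mean(ratings)
    if avg >= 4:
        return "rated reliable by the community"
    if avg <= 2:
        return "rated unreliable by the community"
    return "mixed community ratings"

print(crowd_label([5, 4, 5] * 20))  # 60 ratings averaging about 4.7
```

Requiring a minimum number of ratings before showing a label is one simple guard against a handful of users gaming the signal.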
Option 3: Algorithmic social distance
“The third way that Facebook could help would be to reduce the algorithmic bias that presently exists in Facebook. The site primarily shows posts from those with whom you have engaged on Facebook. In other words, the Facebook algorithm creates what some have called a filter bubble, an online news phenomenon that has concerned scholars for decades now. If you are exposed only to people with ideas that are like your own, it leads to political polarization: Liberals get even more extreme in their liberalism, and conservatives get more conservative.”
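One way to picture reducing that algorithmic bias is a feed ranker that reserves a fraction of slots for posts from accounts the user rarely engages with, rather than ranking purely by past engagement. The ratio, data shapes, and function below are a toy sketch, not Facebook's actual ranking system.

```python
# Toy sketch of filter-bubble mitigation: fill most feed slots by
# engagement rank, but reserve some slots for least-engaged authors.
# The 30% diverse_fraction is an illustrative assumption.
def diversified_feed(posts, engagement, slots=10, diverse_fraction=0.3):
    """posts: list of (post_id, author); engagement: author -> score."""
    ranked = sorted(posts, key=lambda p: engagement.get(p[1], 0), reverse=True)
    n_diverse = int(slots * diverse_fraction)
    familiar = ranked[: slots - n_diverse]
    # fill the remaining slots from the least-engaged end of the ranking
    unfamiliar = [p for p in reversed(ranked) if p not in familiar][:n_diverse]
    return familiar + unfamiliar
```

The point of the sketch is the trade-off it makes explicit: a purely engagement-ranked feed maximizes familiarity, while even a small reserved fraction guarantees some exposure outside the bubble.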
To schedule an interview with Professor Stromer-Galley, please contact Ellen James Mbuqe, director of media relations at Syracuse University, at firstname.lastname@example.org or 412-496-0551.
Stromer-Galley is the author of “Presidential Campaigning in the Internet Age” and chief investigator for Illuminating 2020, a website dedicated to helping journalists cover US political campaigns. The website provides an interactive database for easy and quick tracking of what candidates are saying on Facebook and Twitter through campaign accounts and paid ads. She is also the Senior Associate Dean for Academic and Faculty Affairs, and Director for the Center for Computational and Data Science at Syracuse University’s iSchool.