The CITRIS researcher and UC Davis professor has spent two decades investigating sociopolitical discussion on the internet, with the aim of understanding extremism and breaking ideological silos.

Red versus blue. Conservative values versus liberal principles. The elephant versus the donkey.
The American political landscape has been viewed as a two-party battleground for more than a century, and the “us versus them” mindset among Republicans and Democrats has only intensified over the past decade, since surveys first reported that a majority of each party’s members held “very unfavorable” views of the other party.
Additionally, research has demonstrated that political polarization and ideological entrenchment are still on the rise in the United States. Since the 2024 election, polls have shown that the share of people in each party who identify as moderates is decreasing, while the share who identify with more extreme viewpoints has grown on both sides — contributing to a split nation and a perceived rise in politically motivated violence.
So how do digital platforms play into this complex and highly charged landscape? Is political misinformation running rampant on social media, as has been widely reported by the press? Are social algorithms pushing people into comfortable bubbles, where users are never exposed to opposing views? Magdalena Wojcieszak aims to find out.
Wojcieszak (pronounced “voy-CHEH-shak”), a professor of communication at the University of California, Davis; principal investigator of a European Research Council (ERC) Consolidator Grant at the Center for Excellence in Social Science at the University of Warsaw, Poland; and a researcher at the Center for Information Technology Research in the Interest of Society and the Banatao Institute (CITRIS) at the University of California (UC), has spent more than 20 years studying online interactions about sociopolitical topics, from the so-called “echo chambers” of early internet message boards to the impact of Facebook and Instagram on the 2020 U.S. presidential election and beyond.
More specifically, Wojcieszak investigates how people select political information in a fragmented online media landscape, and the effects of those choices. As she collects data, she looks for ways to help mitigate issues of misinformation, polarization, and extremism.
“I believe that we just need to open a newspaper or look around to realize that the public in the United States is not very well informed and that those who are politically engaged are very polarized,” she said. “How to inform and engage the former? How can we depolarize the latter group, if at all?”
Exploring extremist echo chambers
Wojcieszak earned a master’s degree in sociology at the University of Warsaw in Poland, and — during a semester-long stay in San Francisco in mid-2001 — started working on her master’s thesis on propaganda in American television after the Sept. 11, 2001, terrorist attacks.
After a couple of years working in industry in Poland, Wojcieszak moved to the U.S. to pursue a second master’s degree and a doctorate in communication at the Annenberg School for Communication at the University of Pennsylvania. Once she began her doctoral studies, she had no time for TV.
“I was spending all my time in front of the computer,” Wojcieszak said. “I started gravitating towards this online environment and people’s conversations.”
From behind her laptop screen, Wojcieszak began studying extremist views in online chatrooms and message boards. Her first large study examined prominent white nationalist forums, among them Stormfront, as well as those of radical environmentalists. She found that these online environments expose people only to information that aligns with their beliefs and offer no alternative ideas or explanations. In other words, people enter an ideological echo chamber, where they encounter only narratives that reinforce their existing views, make those views more extreme, lead them to overestimate public support for their radical positions, and encourage them to take part in actions that further support and promote the neo-Nazi or radical environmentalist movements.
As she continued her doctoral work, Wojcieszak began to reflect on strategies to make these sealed online groups more open to facts and ideas outside their worldview.
“What would it take,” she said, “to have a racist person actually say, ‘You know, let me have a conversation with a person of color’? How can I break these silos, and how can I open them up to difference?”
After receiving her Ph.D. in 2009, Wojcieszak returned to Europe, where she joined the communications faculty of IE University in Spain and then the University of Amsterdam.
As she continued her academic career, obtaining grants and leading international collaborations on the effects of political disagreement and on innovative strategies to minimize extremism and out-group hostility among many different groups — from American partisans to Muslims in the Netherlands to anti-immigration and anti-LGBTQ+ groups in Spain and the U.S. — she soon realized the enormous challenge of changing hearts and minds, especially as the internet made it easier for people to self-select into biased information.
“I shifted my focus, I was like, ‘Okay, I can’t depolarize those already polarized and prejudiced.’”
Magdalena Wojcieszak
As social media platforms grew in influence, Wojcieszak also began to investigate the algorithms that fuel them.
CITRIS opens a new path
In 2017, a faculty position at UC Davis brought Wojcieszak back to the States. Soon after her arrival on campus, she met Brandie Nonnecke, founding director of the CITRIS Policy Lab, and the two discussed the CITRIS Seed Funding Program over coffee.
“She told me about CITRIS’s priorities, and they were very well aligned with my work and what I deeply care about,” Wojcieszak said.
Wojcieszak received a 2019 CITRIS Seed Award for a multicampus project with UC Berkeley’s Gireeja Ranade, studying automated accounts — also known as bots — and misinformation on Facebook. Together, they sought to understand the prevalence and roles of bots in comment sections, and to identify how bot activity shaped political discussions.
“We were looking at a pretty ambitious plan,” Wojcieszak said. “Soon, it became clear that the key problem of identifying bots with high accuracy could not be easily overcome despite strong expertise in our team of computer scientists.”
While the team was unable to overcome that hurdle by the end of the yearlong CITRIS grant, the work opened a new path for extending Wojcieszak’s longstanding research on political discussion and exposure. Building on questions she had been exploring since her Ph.D. — including in her widely cited work with Diana Mutz at the University of Pennsylvania — she examined how political conversations emerge in ostensibly nonpolitical spaces. While inspecting comment sections on platforms and analyzing online behavioral data from her ERC Starting Grant at the University of Amsterdam, she explored the scope of political conversations and exposure to political information in everyday digital environments. For instance, as her earlier research showed, people discuss gender equality on Facebook fan pages for Taylor Swift or debate zoning regulations in Reddit forums about local sports. Magazines such as Women’s Health address issues like abortion, while Sports Illustrated engages questions of race in its coverage. What is the extent and nature of these conversations and exposures?
To address these questions in a valid way, Wojcieszak and a group of doctoral students and junior researchers trained deep learning models to identify whether articles from news and non-news websites visited by over 7,000 participants in three countries were about politics. In addition, her teams scraped comment data from explicitly political and overtly non-political Reddit, Facebook, and YouTube channels to classify users’ expressions and interactions.
Their research established that people do indeed discuss sociopolitical topics widely in non-political settings. And, to the research team’s great interest, the discussions in those spaces tend to be healthier.
“The commenters have lower incivility, lower toxicity and greater curiosity about what other people are saying,” Wojcieszak said. In addition, her team found that citizens encounter politics more frequently outside of news than within news. Out of every 10 visits to political content, 3.4 come from news and 6.6 from non-news sites. Furthermore, exposure to political content outside news domains had the same – and in some cases stronger – associations with key democratic attitudes and behaviors as news exposure.
Wojcieszak says she’s benefited greatly from working with CITRIS, with the 2019 Seed Award allowing her to approach an ambitious question and uncover new areas of research. The experience also introduced her to a new network of potential collaborators.
“More than the funding, it was also the exposure and connection to our multidisciplinary and cross-campus scholars.”
Magdalena Wojcieszak
Investigating Meta’s influence on the 2020 elections
In 2020, Wojcieszak was selected as one of 17 external scholars to partner with social media giant Meta on a multiyear investigation of how its flagship Facebook and Instagram platforms may have affected political attitudes and behaviors during the 2020 U.S. elections.
The researchers used an array of methods, including surveys, behavioral data analysis and controlled experimental designs, to collect information on political participation, polarization, knowledge and misperceptions, and trust in U.S. democratic institutions. To date, more than 15 peer-reviewed publications have stemmed from the partnership, including papers published in the highly regarded journals Science, Nature and the Proceedings of the National Academy of Sciences (PNAS).
“This collaboration was the most direct and large-scale test of the influence of tweaks to platform algorithms on users’ exposure, engagement and political attitudes,” Wojcieszak said.
The team’s findings were complicated, and by some measures, their conclusions contradict the popular conception that social media platforms are driving people toward more extreme sociopolitical viewpoints and actions.
Instead, although Facebook and Instagram algorithms strongly influence the news and discussions that people encounter, the researchers found “no measurable effects” on users’ political beliefs or behaviors, as assessed through repeated surveys over time.
For example, one experiment reduced exposure to content from like-minded sources on Facebook during the 2020 U.S. presidential election. Although this intervention increased users’ exposure to content from cross-cutting sources and decreased exposure to uncivil language, it had no effects on various attitudinal measures such as affective polarization, ideological extremity, candidate evaluations and belief in false claims. In that work, the researchers also found that content from like-minded sources constitutes the majority of what people see on the platform, although political information and news represent only a small fraction of these exposures among the entire population of active adult Facebook users in the U.S.
Cultivating the next generation of communication scholars
A fellow of the International Communication Association and a distinguished research fellow at the Annenberg Public Policy Center at the University of Pennsylvania, Wojcieszak has authored more than 90 peer-reviewed papers, served twice as a journal editor, and won awards for her teaching and scholarship. Among those inspired by her work is Ph.D. student Wonjeong “Claire” Jo, a former journalism student from Korea.
While pursuing a master’s degree at Kyung Hee University, Jo saw her interests shift from reporting to political communication research, particularly discussions about political polarization. She wrote her thesis on cross-cutting exposure — exposure to political beliefs that conflict with one’s own — and repeatedly encountered Wojcieszak’s work, citing her more than 10 times.
Once she received her degree, Jo applied to the UC Davis communications doctoral program to work with Wojcieszak as her advisor. Three years later, she says Wojcieszak is the best mentor she’s had.
As an introvert, Jo had found it difficult to voice her ideas with confidence. She said that Wojcieszak has helped her overcome her reservations with constant encouragement.
“She is a strong women’s scholar who bravely shares her ideas and speaks up really well,” Jo said. “Those are qualities that I didn’t naturally have, but I really admire and want to continue learning from.
“She has really helped me to build my confident personality in academia.”
Wonjeong “Claire” Jo
With Wojcieszak’s guidance, Jo completed — among other works — a paper about the factors that cause people to reject cross-cutting news. The study won a top student paper award in the political communication division at the National Communication Association’s 2025 annual conference.
Identifying beacons in a sea of misinformation
Over the last few years, a new factor has muddled the online media landscape — not just for researchers like Wojcieszak, but for all users. Artificial intelligence (AI), in the form of the proprietary recommendation algorithms that Wojcieszak herself has studied, has long shaped the types of content delivered to people on social media. Now, generative AI is shaping the content itself.
To try to make people more resilient to these new (AI-generated content, fake interactions) and old (misinformation, hyperpartisan content, anger- and fear-generating political rhetoric) concerns, Wojcieszak is identifying ways to encourage people to consume verified news sources online and on social media platforms.
Outside of hyper-partisan or misinformative echo chambers, she said, “the problem is not that people consume a lot of ‘bad’ political content.
“The problem is that people don’t consume any at all.”
Wojcieszak’s recent research has shown that “nudging” YouTube algorithms to promote a healthy diet of quality, fact-checked content encourages users to consume more news on the platform; that engaging Twitter users with news through their non-political interests increases their liking and following of news content; and that incentivizing users to follow news accounts on Instagram and WhatsApp makes people better at discerning true stories from false — a sort of inoculation against misinformation — while also building their trust in news media and journalists.
With the support of nearly 2 million euros from the European Research Council, Wojcieszak will have the opportunity to further refine and test that research. With the five-year NEWSUSE project, she and her team will design computational tools to increase people’s consumption of quality news and test the democratic effects of increased news consumption across political contexts and social media platforms.
“I do believe that people who consume more factual and verified news have a basic store of knowledge and information that could make them more resilient to misinformation or hyperpartisan rhetoric and, as such, in the long term help make American and other global democracies stronger.”