Testing Different Strategies to Reveal Biased News Coverage in an Online Field Experiment
P11-4
Presented by: Kim Heinser
Balanced and accurate news coverage is critical for political discourse and public opinion formation, especially in increasingly polarized societies. Scholarship in recent years has largely focused on the impact of disinformation and on corresponding intervention and mitigation strategies. More recent studies suggest, however, that subtly biased media coverage is both more frequent and potentially far more powerful in influencing public discourse. At the same time, no systematic evaluation of counter-bias strategies exists that could help news consumers better recognize bias and arrive at more balanced views on salient political issues. In this study, we build on the long-standing literature on media bias in the social and behavioral sciences, paired with state-of-the-art methodologies from computer science for automated media bias recognition. Using a custom-built news aggregator platform that automatically annotates bias in U.S. online news articles, we systematically evaluate different visualization and annotation strategies in a large-scale, pre-registered conjoint online field experiment. Specifically, we test how different side-by-side article overview views, akin to the platform allsides.com, alongside different detailed views that reveal bias through in-text highlighting, affect the perception of bias in the specific articles our participants are exposed to. The experiment randomizes over five salient political topics, ensuring that our insights are not driven by issue-specific perceptions, and we elicit a range of standard individual-level covariates that allow us to systematically explore heterogeneous treatment effects. Results of our pre-studies suggest that (relative) article overviews better enable consumers to recognize bias than detailed in-text highlighting.
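The randomization described above could be sketched as follows. This is a minimal illustrative assumption, not the study's actual assignment procedure: the condition names, topic placeholders, and seeding scheme below are all hypothetical, standing in for the experiment's real treatment arms (overview views vs. in-text highlighting) and its five political topics.

```python
import random

# Hypothetical factor levels; the real study's arms and topics differ.
OVERVIEW_VIEWS = ["side_by_side_relative", "side_by_side_absolute"]
DETAIL_VIEWS = ["in_text_highlighting", "no_highlighting"]
TOPICS = ["topic_1", "topic_2", "topic_3", "topic_4", "topic_5"]


def assign_conditions(participant_id: int, seed: int = 42) -> dict:
    """Independently randomize overview view, detail view, and topic
    for one participant, reproducibly from (seed, participant_id)."""
    rng = random.Random(seed * 100_003 + participant_id)
    return {
        "participant": participant_id,
        "overview": rng.choice(OVERVIEW_VIEWS),
        "detail": rng.choice(DETAIL_VIEWS),
        "topic": rng.choice(TOPICS),
    }
```

Deterministic seeding per participant makes the assignment reproducible for a pre-registered analysis, while independent draws across factors allow treatment effects to be estimated separately from topic-specific perceptions.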