13:10 - 14:50
P3-S71
Room: 1A.02
Chair/s:
Anne Rasmussen
Discussant/s:
Edoardo Alberto Viganò
Opening the Black Box: Which Topics Are Moderated Under Bluesky's Participatory Content Moderation?
P3-S71-4
Presented by: Mia Nahrgang
Mia Nahrgang
University of Konstanz
Content moderation on social media has been described as a notoriously opaque black box, and little is known about how, when, and why platforms moderate. This paper makes use of the fact that the new and rising platform Bluesky provides unprecedented insight into the inner workings of a social media platform's content moderation. As a decentralized social media platform, Bluesky vows to empower users by giving them greater choice in content moderation. Unlike centralized platforms that enforce content moderation through uniform, top-down policies, Bluesky allows users to customize and define their own standards for how strict or lenient moderation should be on their feeds. Bluesky implements a system of 22 moderation categories, letting users decide how content labeled in these categories should be treated. The labels are therefore integral to the platform’s moderation framework. They are assigned through a combination of human and algorithmic decisions. Additionally, users are encouraged to label their own content and to report missing or false labels on content created by others. However, this participatory approach to content moderation carries the risk of being exploited by actors aiming to circumvent or sabotage content moderation, or attempting to marginalize perspectives that diverge from their own. Using social media data from Bluesky's API and computational methods, this paper explores which topics and themes are moderated under different content moderation labels and whether labels indeed moderate the content they promise to.
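The label-based moderation described above can be analyzed computationally. As a minimal sketch, the function below tallies how often each moderation label value appears across a set of posts; the post dictionaries mimic the general shape of label records returned by Bluesky's API (a `labels` list whose entries carry a `val` field), but the sample data and label values here are purely illustrative, not taken from the study.

```python
# Hypothetical sketch: tallying Bluesky moderation labels across posts.
# The structure of `labels`/`val` follows the general shape of Bluesky
# label records; the sample posts and label values are made up.
from collections import Counter

def tally_labels(posts):
    """Count how often each moderation label value appears across posts."""
    counts = Counter()
    for post in posts:
        for label in post.get("labels", []):
            counts[label["val"]] += 1
    return counts

# Illustrative sample of posts with (invented) label assignments
sample_posts = [
    {"uri": "at://did:plc:abc/app.bsky.feed.post/1", "labels": [{"val": "spam"}]},
    {"uri": "at://did:plc:def/app.bsky.feed.post/2", "labels": []},
    {"uri": "at://did:plc:ghi/app.bsky.feed.post/3",
     "labels": [{"val": "spam"}, {"val": "rude"}]},
]

print(tally_labels(sample_posts))  # Counter({'spam': 2, 'rude': 1})
```

Aggregations like this, combined with topic modeling of the labeled posts' text, are one way to examine which themes fall under which moderation categories.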
Keywords: Content Moderation, Decentralized Social Media Platforms, Social Media Data, Computational Methods