Content Moderation Academic Research

From the IFTAS Moderator Library, supporting Fediverse trust & safety

Updated 2024-04-30

Journals

  • Journal of Online Trust and Safety: a no-fee, open-access journal with fast peer review. Authors may submit letters of inquiry to assess whether their manuscript is a good fit. Priority areas include child exploitation and non-consensual intimate imagery, suicide and self-harm, incitement and terrorism, hate speech and harassment, spam and fraud, and misinformation and disinformation.

Reference Libraries

  • Trust & Safety Teaching Consortium Reading List: a reading list with associated slide decks, recorded lectures, and exercises.

Selected Research

  • Behavior Change in Response to Subreddit Bans and External Events: As more people flock to social media to connect with others and form virtual communities, it is important to research how members of these groups interact to understand human behavior on the Web. In response to an increase in hate speech, harassment, and other antisocial behaviors, many social media companies have implemented content and user moderation policies. On Reddit, for example, communities, i.e., subreddits, are occasionally banned for violating these policies. We study the effects of these regulatory actions, as well as of significant external events such as political elections or market crashes, on community behavior.

  • Deplatforming: Following extreme Internet celebrities to Telegram and alternative social media: Extreme, anti-establishment actors are increasingly characterized as ‘dangerous individuals’ by the social media platforms that once aided in making them into ‘Internet celebrities’. These individuals (and sometimes groups) are being ‘deplatformed’ by the leading social media companies such as Facebook, Instagram, Twitter and YouTube for such offences as ‘organised hate’. Deplatforming has prompted debate about ‘liberal big tech’ silencing free speech and taking on the role of editors, but also about whether it is effective and for whom. The research reported here follows a number of these Internet celebrities to Telegram as well as to a larger ecology of alternative social media.

  • Disproportionate Removals and Differing Content Moderation Experiences for Conservative, Transgender, and Black Social Media Users: Marginalization and Moderation Gray Areas: Social media sites use content moderation to attempt to cultivate safe spaces with accurate information for their users. However, content moderation decisions may not be applied equally for all types of users, and may lead to disproportionate censorship related to people’s genders, races, or political orientations. We conducted a mixed methods study involving qualitative and quantitative analysis of survey data to understand which types of social media users have content and accounts removed more frequently than others, what types of content and accounts are removed, and how removed content differs between groups.

  • Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures: This article considers how the social-news and community site Reddit.com has become a hub for anti-feminist activism. Examining two recent cases of what are defined as “toxic technocultures” (#Gamergate and The Fappening), this work describes how Reddit’s design, algorithm, and platform politics implicitly support these kinds of cultures. In particular, this piece focuses on the ways in which Reddit’s karma point system, aggregation of material across subreddits, ease of subreddit and user account creation, governance structure, and policies around offensive content serve to provide fertile ground for anti-feminist and misogynistic activism.

  • Modular Politics: Toward a Governance Layer for Online Communities: Governance in online communities is an increasingly high-stakes challenge, and yet many basic features of offline governance legacies (juries, political parties, term limits, and formal debates, to name a few) are not in the feature sets of the software most community platforms use. Drawing on the paradigm of Institutional Analysis and Development, this paper proposes a strategy for addressing this lapse by specifying basic features of a generalizable paradigm for online governance called Modular Politics.

  • The Digital Covenant: Non-Centralized Platform Governance on the Mastodon Social Network: The majority of scholarship on platform governance focuses on for-profit, corporate social media with highly centralized network structures. Instead, we show how non-centralized platform governance functions in the Mastodon social network. Through an analysis of survey data, GitHub and Discourse developer discussions, Mastodon Codes of Conduct, and participant observations, we argue Mastodon’s platform governance is an exemplar of the covenant, a key concept from federalist political theory.

  • Volunteer Work: Mapping the Future of Moderation Research: Research on the governance of online communities often requires exchanges and interactions between researchers and moderators. While a growing body of work has studied commercial content moderation in the context of platform governance and policy enforcement, only a small number of studies have begun to explore the work of unpaid, volunteer community moderators who manage the millions of different subcommunities that exist on platforms.
