Understanding Community-Level Blocklists in Decentralized Social Media

📅 2025-06-05
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This study investigates the practices, governance tensions, and design requirements of community-level blocking, implemented via the ActivityPub protocol, on the decentralized social platform Mastodon. Through content analysis of the blocklist ecosystem and in-depth interviews with twelve community moderators, it identifies three key tensions: (1) heterogeneous blocking goals and inclusion criteria, (2) a balance between proactive safety measures and reactive enforcement, and (3) non-negligible risks of overblocking. It further finds that current blocking mechanisms lack critical capabilities, including categorized filtering, collaborative decision-making, and feedback loops. The contributions are threefold: (1) the first qualitative analytical framework for community-level blocking practices; (2) three actionable interaction design proposals, namely comment receipts, category filters, and collaborative voting, to enhance transparency and accountability; and (3) empirically grounded insights and operational pathways for decentralized content governance.

📝 Abstract
Community-level blocklists are key to content moderation practices in decentralized social media. These blocklists enable moderators to prevent other communities, such as those acting in bad faith, from interacting with their own -- and, if shared publicly, warn others about communities worth blocking. Prior work has examined blocklists in centralized social media, noting their potential for collective moderation outcomes, but has focused on blocklists as individual-level tools. To understand how moderators perceive and utilize community-level blocklists and what additional support they may need, we examine social media communities running Mastodon, an open-source microblogging software built on the ActivityPub protocol. We conducted (1) content analysis of the community-level blocklist ecosystem, and (2) semi-structured interviews with twelve Mastodon moderators. Our content analysis revealed wide variation in blocklist goals, inclusion criteria, and transparency. Interviews showed moderators balance proactive safety, reactive practices, and caution around false positives when using blocklists for moderation. They noted challenges and limitations in current blocklist use, suggesting design improvements like comment receipts, category filters, and collaborative voting. We discuss implications for decentralized content moderation, highlighting trade-offs between openness, safety, and nuance; the complexity of moderator roles; and opportunities for future design.
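The "category filters" improvement suggested by moderators can be made concrete with a small sketch. The snippet below parses a community-level blocklist in the CSV shape Mastodon uses for domain-block exports and filters entries by severity; the column names follow Mastodon's export convention, but the domains and comments are hypothetical, and this is an illustration of the design idea, not code from the paper.

```python
import csv
import io

# Hypothetical blocklist in the CSV shape of a Mastodon domain-block export
# (example entries are invented; only the column names follow that convention).
BLOCKLIST_CSV = """#domain,#severity,#public_comment
spam.example,suspend,Bulk spam accounts
ads.example,silence,Aggressive cross-instance promotion
media.example,silence,Repeatedly untagged sensitive media
"""

def load_blocklist(text):
    """Parse a blocklist CSV into a list of per-domain dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def filter_by_severity(entries, severity):
    """Category-style filter: return domains blocked at the given severity."""
    return [e["#domain"] for e in entries if e["#severity"] == severity]

entries = load_blocklist(BLOCKLIST_CSV)
# Moderators could review limited ("silence") domains separately from
# fully suspended ones, reducing the overblocking risk the paper notes.
print(filter_by_severity(entries, "silence"))
print(filter_by_severity(entries, "suspend"))
```

Separating severities this way lets a moderator adopt a shared blocklist's suspensions wholesale while reviewing its milder limits case by case, one possible middle ground between proactive safety and caution about false positives.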
Problem

Research questions and friction points this paper is trying to address.

Understanding moderators' use of community-level blocklists in decentralized social media
Exploring challenges and limitations in current blocklist utilization
Proposing design improvements for decentralized content moderation tools
Innovation

Methods, ideas, or system contributions that make the work stand out.

Analyzing the Mastodon community-level blocklist ecosystem
Interviewing moderators on blocklist usage challenges
Proposing design improvements for decentralized moderation