"I thought it was my mistake, but it's really the design": A Critical Examination of the Accessibility of User-Enacted Moderation Tools on Facebook and X

📅 2025-09-12
📈 Citations: 0
✨ Influential: 0
🤖 AI Summary
This study identifies critical accessibility deficits in Facebook and X's content moderation tools (e.g., reporting, blocking, filtering) for blind and low-vision users, and the multifaceted burdens that result. Using semi-structured interviews and task-based walkthroughs grounded in HCI accessibility evaluation frameworks, the authors uncover three dimensions of the *administrative burden of safety work*: learning costs, compliance costs, and psychological costs. The paper applies this analytical lens to digital safety tools, constructs a cross-platform catalog of accessibility barriers in moderation interfaces, and derives actionable design recommendations. The findings advance theoretical understanding at the intersection of digital safety and accessibility, while providing empirically grounded, practical guidance for platform designers to lower barriers to safe participation among marginalized users.

Technology Category

Application Category

๐Ÿ“ Abstract
As social media platforms increasingly promote the use of user-enacted moderation tools (e.g., reporting, blocking, content filters) to address online harms, it becomes crucially important that such controls are usable for everyone. We evaluate the accessibility of these moderation tools on two mainstream platforms — Facebook and X — through interviews and task-based walkthroughs with 15 individuals with vision impairments. Adapting the lens of the *administrative burden of safety work*, we identify three interleaved costs that users with vision loss incur while interacting with moderation tools: *learning costs* (understanding what controls do and where they live), *compliance costs* (executing multi-step procedures under screen reader and low-vision conditions), and *psychological costs* (experiencing uncertainty, stress, and diminished agency). Our analysis bridges the fields of content moderation and accessibility in HCI research and contributes (1) a cross-platform catalog of accessibility and usability breakdowns affecting safety tools; and (2) design recommendations for reducing this burden.
Problem

Research questions and friction points this paper is trying to address.

Evaluating accessibility of moderation tools for vision-impaired users
Identifying learning, compliance, and psychological costs for disabled users
Proposing design improvements to reduce administrative burden of safety work
Innovation

Methods, ideas, or system contributions that make the work stand out.

Evaluated accessibility via interviews and task-based walkthroughs
Identified learning, compliance, and psychological costs for users with vision impairments
Proposed design recommendations to reduce the burden of moderation work
Sudhamshu Hosamane
Rutgers University, USA
Alyvia Walters
Villanova University, USA
Yao Lyu
University of Michigan, USA
Shagun Jhaver
Rutgers University, USA
social computing · online communities · content moderation