Marginalized users' 'folk theories' reveal platform moderation biases

Category: User-Centred Design · Effect: Moderate · Year: 2023

Marginalized social media users develop intuitive explanations, or 'folk theories,' for content moderation decisions that often highlight systemic biases and misalignments with community guidelines.

Design Takeaway

Actively listen to and analyze the intuitive explanations marginalized users create for moderation outcomes, as these 'folk theories' are critical indicators of systemic bias and areas for improvement.

Why It Matters

Understanding these user-generated theories provides direct insight into how moderation systems are perceived and experienced by those most affected. This can help design teams identify and rectify unfair or opaque moderation practices, fostering a more equitable and trustworthy platform environment.

Key Finding

Marginalized users create their own explanations for why their content gets moderated, often pointing to unfairness and bias in the platform's rules and enforcement.

Research Evidence

Aim: How do marginalized social media users develop folk theories to explain content moderation, and how do these theories relate to their perceptions of platform spirit and marginalized identities?

Method: Qualitative Interview Study

Procedure: Researchers conducted in-depth interviews with marginalized social media users to explore their experiences with content moderation, their explanations for moderation decisions, and their overall perceptions of social media platforms.

Sample Size: 24 participants

Context: Social Media Platforms

Design Principle

Design moderation systems that are transparent, equitable, and demonstrably aligned with community values as perceived by all user groups, especially marginalized ones.

How to Apply

When designing or evaluating content moderation systems, conduct qualitative research with diverse user groups to uncover their 'folk theories' about how moderation works and identify potential areas of bias or unfairness.

Limitations

The study relies on self-reported experiences and perceptions, which may be subject to individual interpretation and recall bias. The findings may not be generalizable to all social media platforms or all types of marginalized users.

Student Guide (IB Design Technology)

Simple Explanation: People who are often treated unfairly on social media come up with their own explanations for why their posts get taken down, and these explanations often suggest that the platform's rules, and how they are enforced, aren't fair.

Why This Matters: This research shows that understanding how users *actually* perceive and explain system functions, especially when they feel unfairly treated, is crucial for designing better and more equitable digital products.

Critical Thinking: How might the concept of 'platform spirit' influence the development of folk theories, and what are the ethical implications for designers when these spirits are perceived as discriminatory?

IA-Ready Paragraph: Research shows that users develop 'folk theories' to explain system behaviors, particularly in contexts where they perceive unfairness. For instance, marginalized social media users create intuitive explanations for content moderation decisions that often highlight perceived biases within platform guidelines and enforcement, demonstrating the value of a user-centred perspective on system design.

Independent Variables: Marginalized identity · Content moderation experiences

Dependent Variables: Development of folk theories · Perceptions of platform spirit

Controlled Variables: Specific social media platform · Types of content moderated

Source

Content Moderation Folk Theories and Perceptions of Platform Spirit among Marginalized Social Media Users · ACM Transactions on Social Computing · 2023 · 10.1145/3632741