Marginalized users' 'folk theories' reveal platform moderation biases
Category: User-Centred Design · Effect: Moderate effect · Year: 2023
Marginalized social media users develop intuitive explanations, or 'folk theories,' for content moderation decisions that often highlight systemic biases and misalignments with community guidelines.
Design Takeaway
Actively listen to and analyze the intuitive explanations marginalized users create for moderation outcomes, as these 'folk theories' are critical indicators of systemic bias and areas for improvement.
Why It Matters
Understanding these user-generated theories provides direct insight into how moderation systems are perceived and experienced by those most affected. This can help design teams identify and rectify unfair or opaque moderation practices, fostering a more equitable and trustworthy platform environment.
Key Findings
Marginalized users create their own explanations for why their content gets moderated, often pointing to unfairness and bias in the platform's rules and enforcement.
- Marginalized users develop folk theories to explain content moderation outcomes.
- These theories are informed by users' perceptions of the platform's 'spirit' and their marginalized identities.
- Folk theories often explain why content is removed even when it appears to follow the guidelines, and point to perceived bias against marginalized users in moderation policies.
Research Evidence
Aim: How do marginalized social media users develop folk theories to explain content moderation, and how do these theories relate to their perceptions of platform spirit and marginalized identities?
Method: Qualitative Interview Study
Procedure: Researchers conducted in-depth interviews with marginalized social media users to explore their experiences with content moderation, their explanations for moderation decisions, and their overall perceptions of social media platforms.
Sample Size: 24 participants
Context: Social Media Platforms
Design Principle
Design moderation systems that are transparent, equitable, and demonstrably aligned with community values as perceived by all user groups, especially marginalized ones.
How to Apply
When designing or evaluating content moderation systems, conduct qualitative research with diverse user groups to uncover their 'folk theories' about how moderation works and identify potential areas of bias or unfairness.
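One way to organize the output of such qualitative research is to tally the themes assigned during coding. The sketch below is a minimal, hypothetical illustration: the theme names, participant labels, and counting approach are assumptions for demonstration, not the study's actual coding scheme.

```python
from collections import Counter

# Hypothetical coded interview excerpts: each entry records the folk-theory
# themes a researcher assigned to one participant's excerpt during qualitative
# coding. Theme names and participants are illustrative, not from the study.
coded_excerpts = [
    {"participant": "P01", "themes": ["guideline_misalignment"]},
    {"participant": "P02", "themes": ["identity_bias", "opaque_enforcement"]},
    {"participant": "P03", "themes": ["identity_bias"]},
    {"participant": "P04", "themes": ["guideline_misalignment", "identity_bias"]},
]

def theme_frequencies(excerpts):
    """Count how often each folk-theory theme appears across coded excerpts."""
    counts = Counter()
    for excerpt in excerpts:
        counts.update(excerpt["themes"])
    return counts

# Print themes from most to least common to surface recurring folk theories.
for theme, n in theme_frequencies(coded_excerpts).most_common():
    print(f"{theme}: {n}")
```

A frequency table like this only summarizes coded data; the interpretive work of deriving themes from interviews still requires the qualitative analysis the study describes.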
Limitations
The study relies on self-reported experiences and perceptions, which may be subject to individual interpretation and recall bias. The findings may not be generalizable to all social media platforms or all types of marginalized users.
Student Guide (IB Design Technology)
Simple Explanation: People who are often treated unfairly on social media come up with their own explanations for why their posts get taken down, and these explanations often suggest that the platform's rules aren't applied fairly.
Why This Matters: This research shows that understanding how users *actually* perceive and explain system functions, especially when they feel unfairly treated, is crucial for designing better and more equitable digital products.
Critical Thinking: How might the concept of 'platform spirit' influence the development of folk theories, and what are the ethical implications for designers when that spirit is perceived as discriminatory?
IA-Ready Paragraph: This design project explored how users develop 'folk theories' to explain system behaviors, particularly in contexts where they perceive unfairness. For instance, marginalized social media users create intuitive explanations for content moderation decisions that often highlight perceived biases within platform guidelines and enforcement, demonstrating a critical user-centered perspective on system design.
Project Tips
- When researching user experiences, consider how different groups might develop unique ways of understanding system behaviors.
- Look for patterns in user explanations that go beyond official documentation or stated policies.
How to Use in IA
- Use the concept of 'folk theories' to frame your analysis of user perceptions of a design's functionality or limitations.
- Incorporate qualitative data from users to support claims about how a design's features are interpreted and explained by different user groups.
Examiner Tips
- Demonstrate an understanding of how users develop their own mental models and explanations for system operations, particularly in contexts of perceived unfairness.
- Connect user-generated explanations to broader design principles of fairness, transparency, and inclusivity.
Independent Variables: Marginalized identity; content moderation experiences
Dependent Variables: Development of folk theories; perceptions of platform spirit
Controlled Variables: Specific social media platform; types of content moderated
Strengths
- Focuses on underrepresented user groups.
- Utilizes qualitative methods to capture nuanced user experiences.
Critical Questions
- To what extent do these folk theories reflect actual system design flaws versus user misinterpretations?
- How can platforms proactively solicit and integrate these folk theories into their design and moderation processes?
Extended Essay Application
- Investigate how users of a specific technology (e.g., an accessibility tool, a community forum) develop 'folk theories' to explain its functionality or limitations, especially if they belong to a marginalized group.
- Analyze how these user-generated explanations reveal potential design flaws or areas for improved user support.
Source
Content Moderation Folk Theories and Perceptions of Platform Spirit among Marginalized Social Media Users · ACM Transactions on Social Computing · 2023 · 10.1145/3632741