Mental health apps can cause unintentional harm through misdiagnosis and lack of support.

Category: User-Centred Design · Effect: Strong · Year: 2024

User comments reveal that mental health applications, despite their intended benefits, can lead to negative psychological outcomes due to inaccurate assessments and insufficient guidance.

Design Takeaway

Design digital mental health tools with a strong emphasis on user safety, ethical considerations, and clear guidance towards appropriate professional support, rather than relying solely on automated assessments.

Why It Matters

Designers of digital health tools must prioritize user well-being beyond mere functionality. Understanding the potential for unintended negative consequences is crucial for developing responsible and ethical applications that support rather than harm users.

Key Finding

Analysis of user reviews for depression apps highlighted that these applications can cause harm by providing inaccurate diagnoses, offering insufficient support, oversimplifying complex conditions, and raising privacy concerns.

Research Evidence

Aim: What are the unintentional harms experienced by users of depression self-management applications?

Method: Thematic analysis of user reviews

Procedure: Researchers analyzed 6,253 user comments from 36 depression self-management applications available on Google Play and Apple App stores to identify recurring themes related to user-reported harms.

Sample Size: 6,253 user comments

Context: Digital mental health applications
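The coding step in a thematic analysis like this can be illustrated with a minimal keyword-tagging first pass, used to group comments into candidate themes before manual coding. This is a hypothetical sketch: the themes, keywords, and `tag_comment` function below are invented for illustration and are not the study's actual codebook or method.

```python
# Hypothetical first-pass tagger for grouping app-review comments into
# candidate harm themes ahead of manual thematic coding.
# Themes and keywords are illustrative only, not the study's codebook.
THEMES = {
    "inaccurate diagnosis": ["diagnos", "said i had", "wrong"],
    "insufficient support": ["no help", "what to do", "support"],
    "privacy concerns": ["privacy", "data", "share"],
}

def tag_comment(comment: str) -> list[str]:
    """Return candidate themes whose keywords appear in the comment."""
    text = comment.lower()
    return [theme for theme, keywords in THEMES.items()
            if any(kw in text for kw in keywords)]

# Example using the quote from the paper's title:
tag_comment("This app said I had severe depression, "
            "and now I don't know what to do")
# -> ['inaccurate diagnosis', 'insufficient support']
```

In practice such keyword matching only surfaces candidate comments; the recurring themes reported in the study would come from researchers reading and coding the comments themselves.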

Design Principle

Digital health interventions must be designed with a 'do no harm' principle, ensuring that automated assessments are presented with appropriate context, limitations, and clear pathways to human support.

How to Apply

When designing or evaluating mental health applications, rigorously assess the potential for negative user experiences stemming from diagnostic outputs and the availability of supportive resources.

Limitations

The study relies on self-reported data from user comments, which may be subjective and not representative of all users. The specific algorithms and design choices of the apps were not directly analyzed.

Student Guide (IB Design Technology)

Simple Explanation: Mental health apps can sometimes make people feel worse if they give a diagnosis without enough help or if they don't protect personal information properly.

Why This Matters: This research shows that even well-intentioned digital products can have unintended negative consequences for users, especially in sensitive areas like mental health. It's crucial to consider the user's emotional state and the ethical implications of your design.

Critical Thinking: To what extent should digital applications be responsible for providing mental health diagnoses versus directing users to qualified professionals?

IA-Ready Paragraph: The unintentional harms of digital mental health applications, as highlighted by Kang and Reynolds (2024), underscore the critical need for user-centered design that prioritizes ethical considerations and user safety. Their thematic analysis of user comments revealed that applications can cause distress through inaccurate diagnoses and a lack of adequate support, emphasizing that design choices in sensitive domains must extend beyond functionality to encompass emotional well-being and responsible guidance towards professional help.

Independent Variable: Use of depression self-management applications

Dependent Variable: Unintentional harms experienced by users (e.g., distress, uncertainty, privacy concerns)

Controlled Variables: App store platform (Google Play, Apple App Store); specific app features were not explicitly controlled and varied across apps.

Source

“This app said I had severe depression, and now I don’t know what to do”: the unintentional harms of mental health applications · 2024 · 10.1145/3613904.3642178