Voice Assistant Errors Amplify Racial Bias, Impacting User Psychology

Category: User-Centred Design · Effect: Strong effect · Year: 2023

Disparities in voice assistant speech recognition accuracy across racial groups can lead to negative psychological outcomes for users from marginalized racial backgrounds.

Design Takeaway

Prioritize equitable performance in voice assistant ASR systems across all racial demographics to ensure a positive and unbiased user experience.

Why It Matters

This research highlights a critical, yet often overlooked, aspect of user experience: the psychological impact of algorithmic bias. Designers must consider how system failures, particularly those with racial disparities, affect user emotions, self-perception, and trust in technology.

Key Finding

When a voice assistant makes more errors for Black users, those users report increased self-consciousness, lower self-esteem, and more negative feelings toward the technology; white users do not show the same effects.

Research Evidence

Aim: To investigate whether speech recognition errors in voice assistants elicit negative psychological effects similar to those caused by misunderstandings in cross-racial interpersonal communication.

Method: Controlled Experiment

Procedure: Participants (Black and white) were assigned to interact with a voice assistant programmed with either a high or low error rate. Psychological responses and technology ratings were then measured.

Sample Size: 108 participants

Context: Human-computer interaction, voice assistant technology, cross-racial communication.

Design Principle

Design for equitable performance: Ensure that technology functions reliably and without negative psychological impact for all user groups, regardless of race.

How to Apply

When developing or evaluating voice-enabled products, conduct rigorous testing with diverse user groups to identify and quantify performance differences in ASR. Implement bias mitigation strategies in ASR model training and deployment.
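A common way to quantify the performance differences mentioned above is to compute word error rate (WER) separately for each demographic group and compare the averages. A minimal sketch in Python, assuming you have (group, reference transcript, ASR hypothesis) triples from your own evaluation set — the group labels and transcripts below are illustrative placeholders, not data from the study:

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word error rate: word-level edit distance divided by reference length."""
    ref, hyp = reference.split(), hypothesis.split()
    # Standard dynamic-programming edit distance over word sequences.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

def group_wer(samples):
    """Average WER per group from (group, reference, hypothesis) triples."""
    totals = {}
    for group, ref, hyp in samples:
        totals.setdefault(group, []).append(wer(ref, hyp))
    return {g: sum(scores) / len(scores) for g, scores in totals.items()}

# Placeholder evaluation data for two hypothetical groups:
results = group_wer([
    ("group_a", "turn on the lights", "turn on the lights"),
    ("group_b", "turn on the lights", "turn on the light"),
])
```

A large gap between the per-group averages is a signal to investigate training-data coverage and apply bias mitigation before deployment.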

Limitations

The study focused on specific types of ASR errors and may not generalize to all forms of technological bias or all user demographics. The artificial nature of the programmed error rates might differ from real-world, naturally occurring errors.

Student Guide (IB Design Technology)

Simple Explanation: Voice assistants sometimes make more mistakes for Black users than for white users. This study found that when this happens, Black users feel more self-conscious, less confident, and dislike the technology more, while white users don't have the same negative feelings.

Why This Matters: This research shows that technology isn't always neutral. Design choices can unintentionally create unfair experiences and harm users' feelings and self-worth, especially for minority groups.

Critical Thinking: To what extent can designers be held responsible for the biases embedded within third-party AI components like ASR, and what ethical frameworks should guide their decision-making in such cases?

IA-Ready Paragraph: This research by Wenzel et al. (2023) demonstrates that disparities in voice assistant speech recognition accuracy can lead to significant negative psychological effects, such as increased self-consciousness and lower self-esteem, for Black users. This underscores the critical need for designers to ensure equitable performance across diverse user groups to avoid creating biased and harmful user experiences.

Independent Variables: Voice assistant error rate (high vs. low); participant race (Black vs. white).

Dependent Variables: Self-consciousness, self-esteem, positive affect, technology ratings.

Controlled Variables: Type of voice assistant interaction; pre-programmed error patterns.
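The variables above describe a 2 (race) × 2 (error rate) between-subjects design, where the key question is whether error rate affects the two groups differently. A minimal sketch of how that interaction can be inspected from raw scores — all numbers and labels here are invented placeholders, not data from the study:

```python
from statistics import mean

def cell_means(rows):
    """rows: (race, error_rate, score) triples -> {(race, error_rate): mean score}."""
    cells = {}
    for race, error_rate, score in rows:
        cells.setdefault((race, error_rate), []).append(score)
    return {cell: mean(scores) for cell, scores in cells.items()}

def interaction_contrast(means):
    """(high - low error) effect for one group minus the same effect for the other.
    A nonzero value suggests error rate affects the two groups differently."""
    return ((means[("black", "high")] - means[("black", "low")])
            - (means[("white", "high")] - means[("white", "low")]))

# Placeholder self-esteem scores (1-5 scale), one row per participant:
means = cell_means([
    ("black", "high", 3.0), ("black", "low", 4.0),
    ("white", "high", 4.0), ("white", "low", 4.0),
])
contrast = interaction_contrast(means)
```

In a real analysis this contrast would be tested for statistical significance (e.g., a two-way ANOVA), but the cell-mean comparison is the core logic of the design.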

Source

Can Voice Assistants Be Microaggressors? Cross-Race Psychological Responses to Failures of Automatic Speech Recognition · 2023 · DOI: 10.1145/3544548.3581357