Integrating Expert and User Feedback Uncovers Critical Usability Flaws in E-Prescription Systems
Category: User-Centred Design · Effect: Strong effect · Year: 2025
Combining heuristic evaluations by design experts with direct user feedback through usability testing reveals more comprehensive insights into system flaws than either method alone.
Design Takeaway
Always validate design decisions through both expert critique and direct user interaction to ensure a robust and user-friendly final product, especially in critical systems like e-prescriptions.
Why It Matters
E-prescription systems are critical for healthcare efficiency and patient safety. Identifying and rectifying usability issues is paramount to prevent errors, improve user satisfaction, and optimize operational workflows within healthcare settings.
Key Finding
While experts noted specific areas for improvement in error prevention and user control for both systems, users found the SSEPS to be marginally more usable than the HIEPS. The study emphasizes that using both expert and user perspectives provides a more complete picture of a system's usability.
Key Findings
- Both systems exhibited significant room for improvement in error prevention and user control.
- HIEPS showed better internal consistency according to expert review.
- SSEPS received a slightly higher average usability score (70.73) compared to HIEPS (69.21) from users, a statistically significant difference.
- A combined approach of expert and user evaluation is more effective than single-method evaluations.
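The reported difference in mean SUS scores (70.73 vs. 69.21) would typically be tested with an independent-samples test such as Welch's t-test. The sketch below shows the computation on illustrative sample data; the scores used here are hypothetical, not the study's actual dataset.

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's t statistic and degrees of freedom for two independent
    samples with possibly unequal variances."""
    na, nb = len(a), len(b)
    se2_a = variance(a) / na  # squared standard error, sample a
    se2_b = variance(b) / nb  # squared standard error, sample b
    t = (mean(a) - mean(b)) / math.sqrt(se2_a + se2_b)
    # Welch-Satterthwaite approximation for degrees of freedom
    df = (se2_a + se2_b) ** 2 / (
        se2_a ** 2 / (na - 1) + se2_b ** 2 / (nb - 1)
    )
    return t, df

# Hypothetical SUS scores for the two systems (not the study's data)
sseps = [72.5, 70.0, 75.0, 67.5, 70.0]
hieps = [67.5, 70.0, 72.5, 65.0, 70.0]
t, df = welch_t(sseps, hieps)
print(f"t = {t:.2f}, df = {df:.1f}")
```

With a real dataset of 50 respondents per system, the resulting t and df would be compared against the t-distribution to obtain a p-value.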
Research Evidence
Aim: To comprehensively evaluate the usability of electronic prescription systems by integrating expert heuristic reviews and user-based testing to identify areas for improvement.
Method: Mixed-methods approach combining heuristic evaluation and usability testing.
Procedure: Three experts conducted heuristic evaluations of two e-prescription systems using Nielsen's principles, rating issues on a severity scale. Fifty users then completed usability testing using a translated System Usability Scale (SUS). Expert and user feedback were statistically compared across both systems.
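The SUS questionnaire mentioned in the procedure is scored with a standard formula: for each of the ten 1-5 Likert items, odd-numbered items contribute (response − 1) and even-numbered items contribute (5 − response), and the sum is multiplied by 2.5 to yield a 0-100 score. A minimal sketch:

```python
def sus_score(responses):
    """Compute a System Usability Scale score from ten 1-5 Likert responses.

    Standard SUS scoring: odd-numbered items contribute (response - 1),
    even-numbered items contribute (5 - response); the sum is scaled
    by 2.5 to give a 0-100 score.
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 item responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        if not 1 <= r <= 5:
            raise ValueError("responses must be on a 1-5 scale")
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # → 75.0
```

Averaging `sus_score` across all respondents for each system yields the per-system means reported in the findings.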
Sample Size: 53 participants (3 experts, 50 users)
Context: Healthcare technology, specifically electronic prescription systems.
Design Principle
Employ a multi-faceted evaluation strategy that integrates expert heuristic analysis with empirical user testing to achieve a holistic understanding of system usability.
How to Apply
When designing or evaluating complex digital systems, conduct heuristic evaluations with experienced designers and follow up with usability testing involving representative end-users.
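One practical way to combine the expert side of such an evaluation is to average each issue's severity ratings across evaluators and rank the issues, so the worst problems are addressed first. The sketch below assumes Nielsen's 0 (not a problem) to 4 (usability catastrophe) severity scale; the issue names and ratings are hypothetical examples, not findings from the study.

```python
from statistics import mean

# Hypothetical issues, each rated by three experts on Nielsen's 0-4 scale
ratings = {
    "No confirmation before deleting a prescription": [4, 3, 4],
    "Inconsistent button labels across screens": [2, 2, 3],
    "Error messages lack recovery guidance": [3, 3, 3],
}

# Rank issues by mean severity, most severe first
ranked = sorted(ratings.items(), key=lambda kv: mean(kv[1]), reverse=True)
for issue, scores in ranked:
    print(f"{mean(scores):.2f}  {issue}")
```

The ranked list then feeds into usability-test planning: high-severity issues become the tasks users are asked to perform.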
Limitations
The study evaluated only two specific e-prescription systems, so findings may not generalize to other systems. The user sample was drawn from a single healthcare context, which further limits generalizability.
Student Guide (IB Design Technology)
Simple Explanation: To make sure a digital tool is easy to use, get feedback from both people who know a lot about design (experts) and the people who will actually use the tool (users). This helps find problems that one group might miss.
Why This Matters: Understanding how users interact with a system and where they encounter difficulties is crucial for creating effective and safe designs, particularly in fields like healthcare where errors can have serious consequences.
Critical Thinking: How might the severity ratings assigned by experts differ from the actual impact of those issues on end-users, and how can this discrepancy be addressed in future evaluations?
IA-Ready Paragraph: This research highlights the efficacy of integrating expert heuristic evaluations with user-based usability testing for a comprehensive assessment of system usability. By employing both perspectives, design practitioners can uncover a broader range of issues, leading to more robust and user-centered design solutions, as demonstrated by the comparative analysis of electronic prescription systems.
Project Tips
- When evaluating a design, consider using a combination of methods like expert reviews and user testing.
- Clearly document the criteria used by experts and the tasks given to users during testing.
How to Use in IA
- Reference this study when justifying the use of mixed methods for evaluating design solutions, particularly for complex digital interfaces.
Examiner Tips
- Ensure that the justification for chosen evaluation methods is clearly linked to the design problem and target users.
Independent Variables
- Type of evaluation method (heuristic vs. user testing)
- E-prescription system (SSEPS vs. HIEPS)
Dependent Variables
- Severity of usability issues
- Perceived usability score (SUS)
Controlled Variables
- Number of experts
- Number of users
- Heuristic principles used
- Severity rating scale
Strengths
- Employs a mixed-methods approach for a more thorough evaluation.
- Utilizes established evaluation frameworks (Nielsen's Heuristics, SUS).
Critical Questions
- To what extent do expert heuristics predict actual user difficulties?
- How can the findings from this study be applied to the design of other complex digital interfaces beyond healthcare?
Extended Essay Application
- This research can inform an Extended Essay investigating the usability of a specific digital product by employing a similar mixed-methods approach to gather and analyze data.
Source
Comprehensive usability evaluation of electronic prescription systems: integrating expert and user perspectives · BMC Medical Informatics and Decision Making · 2025 · 10.1186/s12911-025-03308-w