An independent review has highlighted significant concerns about biases in medical devices affecting ethnic minorities and women in the UK, underscoring the need for urgent reforms. The review found that medical devices such as pulse oximeters, widely used during the Covid-19 pandemic, were less accurate for individuals with darker skin tones. This inaccuracy in measuring blood oxygen levels could lead to misdiagnosis or delayed treatment for black patients and other ethnic minorities. The review also pointed out that artificial intelligence (AI)-based medical devices can exhibit biases, leading to the under-detection of conditions such as skin cancer in people with darker skin tones and of certain diseases in women.

Chaired by Prof Dame Margaret Whitehead, the review made 18 recommendations to address these issues, all of which have been accepted by the government. These recommendations call for system-wide action, including removing biases from the datasets used to develop and test medical devices and improving training for healthcare professionals, to ensure equitable healthcare access for all, regardless of ethnic background or gender.

The Department of Health and Social Care has committed to taking steps to mitigate these disparities, emphasizing the urgency of implementing the recommendations. The goal is to prevent bias in the development and use of medical technologies, ensuring patient safety, inclusivity, and equitable healthcare for every individual.