LONDON — Regulators, clinicians, and health care algorithm developers need to take additional steps to ensure that medical devices work equally well for all patients, avoiding blind spots that can lead to worse care for patients from underrepresented racial and ethnic groups, according to a U.K.-commissioned review released on Monday.

The report, which also warned that the products’ biases could harm women and people from lower socioeconomic groups, called on the government to improve its understanding of the devices commonly used in the country’s health service, and recommended an expert panel to assess possible unintended consequences as AI tools expand. Providers also need to learn the limitations of these devices to ensure they don’t result in poor patient care.

The report cited several examples, including evidence that pulse oximeters, which track blood oxygen levels, can overestimate those levels in patients with darker skin, potentially providing false reassurance when treatment is actually needed.
