Race Equality Foundation says UK review into medical device bias is not enough

Government review will examine the impact of devices such as oximeters on patients from different ethnic groups.
By Tammy Lovell

The UK government has launched a “far-reaching review” into the impact of potential bias in the design and use of medical devices.

Race Equality Foundation CEO Jabeer Butt welcomed the review, but said it was not enough to explain the disproportionate deaths of Black, Asian and minority ethnic (BAME) people from COVID-19.

The review will examine medical devices currently on the market to identify where systematic bias and risk exist, and will make recommendations on how these issues should be tackled at every stage of a device's life, from design to use.

It will also consider the heightened risk of bias in the emerging range of algorithm-based data and artificial intelligence (AI) tools.

Devices such as oximeters will be examined to identify discrepancies in how they perform for different ethnic groups, and to establish whether current regulations allow systemic bias to be built into medical devices.

Some research has concluded that darker-skinned patients who may need hospitalisation are at greater risk of inaccurate oximeter readings, because the devices tend to overstate the level of oxygen in their blood.

It will also examine MRI scanners, which are not recommended for pregnant or breastfeeding women.

Initial findings are expected by the end of January 2022.

WHY IT MATTERS

COVID-19 has exposed health disparities across the country, with death rates higher among people from ethnic minorities.

UK regulations do not currently include provisions to ensure that medical devices are equally effective regardless of demographic factors such as ethnicity. The review is intended to accelerate improvements in the quality and availability of devices for diverse communities.

THE LARGER CONTEXT 

Concerns have previously been raised in the digital health space about race-blind data. The data protection impact assessment (DPIA) for the NHS COVID-19 data store, run by Palantir, showed that the data would not be broken down by ethnicity, despite BAME people being disproportionately affected by the virus.

In January, NHS England committed to publishing ethnicity data on who received the COVID-19 vaccine, following backlash and accusations of potential bias. 

ON THE RECORD

Health and social care secretary Sajid Javid wrote in The Times: “Although we have very high standards for these technologies in this country — and people should keep coming forward for the treatment they need — we urgently need to know more about the bias in these devices, and what impact it is having on the front line.”

Race Equality Foundation CEO Jabeer Butt OBE said: "This review is welcome but is unlikely to explain the disproportionate deaths of Black, Asian and minority ethnic people, including health and care workers, during COVID. It certainly does not replace the need for an urgent public inquiry, to properly explore why the pandemic had such a devastating impact on some groups in Britain."
