The Buzz this Week
Racial and ethnic minorities have long suffered from health inequities in the United States, and structural racism has been and remains the fundamental driver. The COVID-19 pandemic, along with broader social injustice issues that afflict this country, pushed health disparities into the spotlight, as discussed in a previous edition of Top Reads on bridging public health and healthcare to accelerate health equity.
This week marks HIMSS’ Global Health Equity Week, an effort to raise awareness and reform the global health ecosystem by harnessing the power of information and technology to eliminate health disparities. Racial bias pervades healthcare technologies, such as electronic health records (EHRs). Although the EHR is an integral tool for communicating about patients, a recent Health Affairs study found racial bias in the language medical providers use to describe patients in medical records. Black patients were more than 2.5 times as likely as white patients to have one or more negative descriptors in their EHR. Researchers wrote, “such bias has the potential to stigmatize Black patients and possibly compromise their care, raising concerns about systemic racism in healthcare.”
Artificial intelligence (AI) and algorithmic decision-making systems are used increasingly in healthcare, but the potential for racial bias arises because these systems can unfairly penalize certain segments of the population—particularly when representative data from these segments are not fully incorporated into AI and algorithm development. This can result in different care management recommendations for different people, even when those differences are not warranted. In an effort to combat this bias, California’s Attorney General recently launched an investigation into healthcare algorithmic software used to determine patient care, to detect whether it has discriminatory impacts based on race and ethnicity.
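The mechanism described above can be made concrete with a small, entirely synthetic sketch. The cohorts, costs, and threshold below are hypothetical, not drawn from any real clinical model; the sketch only illustrates the kind of group-wise audit an investigator might run when an algorithm uses a proxy (such as past healthcare spending) that correlates with access rather than need.

```python
# Illustrative audit of an algorithm's outputs across demographic groups.
# All data are synthetic and the threshold is hypothetical; this is a sketch
# of a disparate-impact check, not a real clinical model.

def false_negative_rate(needs_care, flagged):
    """Share of patients who truly need care but were NOT flagged."""
    misses = sum(1 for need, flag in zip(needs_care, flagged) if need and not flag)
    total_need = sum(needs_care)
    return misses / total_need if total_need else 0.0

def flag_for_care(prior_cost, threshold=5000):
    """Toy algorithm: flags patients whose past spending exceeds a threshold.
    Using cost as a proxy for need penalizes groups with historically
    lower access to (and therefore lower spending on) care."""
    return [cost > threshold for cost in prior_cost]

# Synthetic cohorts with identical true need, but group B's historical
# spending is lower because of access barriers, not lower need.
group_a_cost = [8000, 6000, 9000, 2000, 7000]
group_a_need = [True, True, True, False, True]
group_b_cost = [4000, 3000, 9000, 2000, 4500]
group_b_need = [True, True, True, False, True]

fnr_a = false_negative_rate(group_a_need, flag_for_care(group_a_cost))
fnr_b = false_negative_rate(group_b_need, flag_for_care(group_b_cost))
print(f"Group A missed-need rate: {fnr_a:.0%}")  # 0%
print(f"Group B missed-need rate: {fnr_b:.0%}")  # 75%
```

Even though both synthetic groups have the same underlying need, the cost-based proxy misses most of group B's patients who need care, which is the pattern audits of real-world care-management algorithms look for.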
In addition, research has found racial bias in medical devices that clinicians commonly use to evaluate patients, including pulse oximeters, infrared thermometers, and X-rays. This is largely because the devices were not tested on diverse populations prior to market launch. A recent study by Emory University found that forehead thermometers were significantly less accurate (26% lower) than oral thermometers in detecting fevers in Black patients. This can delay diagnosis or drug administration, potentially with fatal consequences. In April 2022, the U.S. Food and Drug Administration (FDA) released updated draft guidance to improve diverse representation in clinical trials, including a requirement that device applications report clinical trial demographic data, among other enhancements.
Why it Matters
While the Affordable Care Act (ACA) and the expansion of Medicaid by select states have improved access to healthcare for some, inequities remain. Racial and ethnic minorities are less likely to have health insurance, and issues related to cost and access further contribute to inequities. In addition, research shows racial and ethnic minorities receive lower quality care, are less likely to receive routine healthcare, and therefore have poorer health outcomes than white patients. Black patients are 2 to 3 times as likely as white patients to die of preventable heart disease and stroke. Black patients also have higher rates of cancer, asthma, pneumonia, and diabetes.
Patients of color are significantly less likely than white patients to have a provider of the same race—only 22% of Black adults reported racial concordance with their usual provider, compared with 74% of white adults. In addition, the representation of racial and ethnic minorities enrolled in medical school remains far below their respective shares of the U.S. population.
To address racial bias in healthcare, medical institutions should be deliberate in their selection and use of medical devices and software so that care disparities are not further exacerbated by imprecise tools. Institutions should also offer implicit bias training, cultural competency training, and education on race and racism. Medical institutions must also improve the representation of racial and ethnic minorities among providers, other health professionals, and faculty, as this shared identity has the potential to improve health outcomes and strengthen patient-provider relationships. Technology solution vendors, health systems, education institutions, and policymakers must address these challenges related to racial bias, work toward resolving health disparities, and promote awareness to further advance health equity.
Algorithms Are Making Decisions About Health Care, Which May Only Worsen Medical Racism
Negative Patient Descriptors: Documenting Racial Bias in the Electronic Health Record
Study Finds Racial Bias in How Clinicians Describe Patients in Medical Records
Racially Biased Medical Device Results Raise Patient Safety Concerns
Racial, Ethnic, and Language Concordance Between Patients and Their Usual Health Care Providers
Editorial advisor: Roger Ray, MD, Chief Physician Executive.