Apple Watch wasn't built for dark skin like mine. We deserve tech that works for everyone.

On Christmas morning, my little sister handed me a gift box adorned with a bright red bow. As the crumpled wrapping paper fell to the ground, I discovered a sleek, brand-new Apple Watch. This wasn't just a stylish accessory but a thoughtful gesture from my baby sister, who has always rooted for me.

She knew I was eager to revive my passion for running after a tough year of losing our father amid the COVID-19 pandemic. I was ready to reconnect with the simple joy of my feet hitting the pavement. Little did I know that wearable fitness and health technology devices like the Apple Watch were not built for people with darker skin tones such as mine.

Millions of people are now navigating the chaotic journey of shopping for the holidays, spending billions of dollars in just a few weeks. Among many enticing choices, the Apple Watch and other wearable tech devices seem like an excellent present, offering a blend of style and functionality. Wouldn't it be the perfect gift to kick off the new year? Well, hold that thought.

The dilemma is that technology fails people of color every day. Research shows many high-tech gadgets deliver inaccurate readings, particularly for individuals with darker skin tones. Seemingly neutral devices such as soap dispensers, automatic hand sanitizer stations, camera recognition software, heart rate monitors and self-driving cars fail to accurately recognize darker skin tones. These technologies literally “do not see color.”

Technology fails people of color

Pulse oximeters are placed on the fingertip and use a light beam to measure how much oxygen is traveling in blood, an important metric for many medical conditions. However, pulse oximeters often use biased data and algorithms (computerized instructions), and several studies show pulse oximeters are less accurate for Black patients than white patients. False estimates generated by clinical tools and equations increase the chances of medical errors or mistreatment.

But it doesn’t stop here.


The thermometer, a household essential, is less accurate as well. Forehead thermometers? Yes, you read that correctly. The very tool parents, guardians and day care centers across the United States rely upon every day could deliver less accurate readings.

So, what do we do if the medical technology used to provide care is not just faulty but racially biased?


When an individual learns that a medical device is inaccurate, there are few legal options. Companies like 23andMe have been sued several times for inaccurate estimates and information breaches. However, since learning about inaccurate pulse oximeter estimates in 2021, the Food and Drug Administration has failed to meaningfully address race-related inaccuracies.

Lawsuits have surfaced, accusing certain devices of "racial bias," citing harm and misdiagnosis. While some pulse oximeters have been improved to include additional wavelengths in the light beam, they are not widely used or standardized across hospital systems.

After years of waiting, how many more people must suffer?


Racial bias, stereotyping and discrimination have resulted in differential treatment in many fields. Black children are less likely to receive pain medication. Black men are less likely to receive standard cardiac procedures. Black women are more likely to die during childbirth.

This conversation is deeper; it's about colorism, medical mistrust and how skin pigmentation has profound consequences in American society. Individuals with darker skin tones face harsher punishments and receive longer prison sentences.

Racial bias is deeper than just bad tech

Consumers play a crucial role in advocating for equitable devices. They invest money and trust in the products they choose, often paying a premium for hot-ticket items. The wearable technology market is a multibillion dollar industry.

Apple states, “Blood Oxygen app measurements are not intended for medical use ... and are only designed for general fitness and wellness purposes.” However, this language burdens the consumer, who is left to decide what this means and how to use their watch.


We can drive positive change within the tech industry by voicing concerns and demanding inclusivity. This is essential for curating innovation that considers diverse needs and ensures technology is accurate, unbiased and fair for everyone, regardless of skin tone.

Many people are working to correct the racial biases built into everyday technology. Ellis Monk, a Harvard sociologist, is exploring ways to expand the color palette in technologies such as cameras and color filters. His scale, the Monk Skin Tone scale, includes more shades that more accurately reflect society. Joy Buolamwini and activist groups such as the Algorithmic Justice League demand algorithm improvements in facial recognition software and cameras.


The harms of biased technology are clear, and health care institutions are responsible for mitigating the consequences. The FDA must speed up its timeline, invest in interdisciplinary research to improve pulse oximeters, update policies and provide a clear plan of action for providers.

The FDA has requested clearer labels and more testing, but we need technology that is calibrated to view skin in all shades now.

We must educate engineers and designers on inclusive strategies and designs. Addressing the accuracy of pulse oximeters is not only a clinical issue, it's also a national emergency and moral obligation.


Whether you're a tech-savvy early adopter or a budding fitness enthusiast, one thing is certain: Racial bias harms everyone. Whether you are shopping for the perfect holiday gift, rummaging through your medicine cabinet to find the thermometer or getting a pulse ox reading for your patient, we need reliable technology.

Imagine the peace of mind that comes with knowing that your watch or thermometer is precise. We owe justice to patients who receive care every day.

As I set up my Apple Watch to pick my next run, I think to myself, “We deserve quality products. After all, we do pay for them.”

Marie Plaisime is a medical sociologist at Harvard University School of Public Health. Her expertise is in racial bias training in medical education and clinical practice, race-based medicine, algorithmic bias and health policy.


This article originally appeared on USA TODAY: Apple Watch doesn't work on BIPOC. Bias runs deeper than bad tech