AI Skin Cancer Detection: Is It Fair for Everyone?

Imagine a Mole Checker in Your Pocket

Think about snapping a photo of a mole with your phone and finding out right away if it might be skin cancer. That’s the promise of new artificial‑intelligence tools. They can analyze thousands of images faster than any doctor and spot patterns humans might miss. In a Stanford Medicine study, healthcare workers who used AI alongside their own judgment diagnosed skin cancer more accurately than those working without it. Even experienced dermatologists improved a bit, while nurses and family doctors saw the biggest jump. Early detection saves lives, so a tool that helps catch cancer sooner seems like a win for everyone. A win for truly everyone, though? Let’s dive in!

But here’s the catch: will these apps and algorithms work just as well for people with darker skin? Many of the images these systems learned from show light‑skinned patients. That imbalance raises big questions about fairness and effectiveness. Let’s explore what’s going on.

Data Gaps: When AI Doesn’t See Dark Skin

AI learns by example. If it sees lots of examples of one type of skin, it gets better at recognizing that type. Unfortunately, most publicly available skin‑lesion databases are loaded with images of light skin. A 2021 analysis of 21 databases found that only 2,436 out of 106,950 images had skin‑tone information. Of those, ten images were from people recorded as having brown skin and just one image showed a person with dark brown or black skin. Researchers also noted that none of the images with ethnicity data came from individuals of African, African‑Caribbean or South Asian backgrounds.
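To put those numbers in perspective, here is a quick back‑of‑the‑envelope calculation. Only the counts come from the 2021 analysis described above; the variable names are ours:

```python
# Counts reported in the 2021 analysis of 21 skin-lesion databases.
total_images = 106_950        # images across all 21 databases
with_tone_info = 2_436        # images that recorded skin tone at all
brown = 10                    # images labeled as brown skin
dark_brown_or_black = 1       # images labeled as dark brown or black skin

# What share of images record skin tone, and of those,
# what share shows brown or darker skin?
pct_with_tone = 100 * with_tone_info / total_images
pct_dark = 100 * (brown + dark_brown_or_black) / with_tone_info

print(f"{pct_with_tone:.1f}% of images record skin tone at all")
print(f"{pct_dark:.2f}% of tone-labeled images show brown or darker skin")
```

In other words, barely 2% of the images carry any skin-tone label, and within that sliver, under half a percent depict brown or darker skin.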

It’s not just the training databases. Medical textbooks and lecture slides aren’t much better. A Stanford study used a machine‑learning tool to scan thousands of images in dermatology training materials and found that only about one in ten images showed brown or black skin. If future doctors rarely see how diseases look on darker skin, the AI they help train won’t see it either.

Hidden Bias in the Building Process

Lack of diverse data isn’t the only problem. Bias can creep in at every stage of developing an AI model. First, the images themselves: when researchers choose photos to train a model, they might select “clear” or “textbook” examples that tend to come from light‑skinned patients in clinic settings. Second, the people labeling these images may be less familiar with how conditions appear on darker skin. Third, the final model may never be tested on a diverse group before it’s published.

These hidden biases show up in performance. In a recent review of AI tools for dermatology, researchers looked at programs developed over a decade and found that they almost always performed worse on skin of color. In fact, only about 30% of the programs they examined included any data specifically from people with darker skin tones. When a system rarely sees examples of melanoma on brown or black skin, it’s no surprise it misses most of those cases. That’s not just an academic problem; it’s a potentially life‑threatening one.
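One way researchers surface this kind of gap is to break a model’s accuracy down by skin-tone group instead of reporting a single overall number. A minimal sketch of that bookkeeping, using entirely invented predictions:

```python
from collections import defaultdict

# Hypothetical results: (skin_tone_group, true_label, predicted_label).
# The rows are invented purely to show the per-group calculation.
results = [
    ("light", "melanoma", "melanoma"),
    ("light", "benign",   "benign"),
    ("light", "melanoma", "melanoma"),
    ("dark",  "melanoma", "benign"),   # a missed melanoma on darker skin
    ("dark",  "benign",   "benign"),
]

correct = defaultdict(int)
total = defaultdict(int)
for group, truth, prediction in results:
    total[group] += 1
    correct[group] += (truth == prediction)

for group in total:
    print(f"{group}: accuracy {correct[group] / total[group]:.2f}")
```

A model can look excellent on its overall score while quietly failing one group; only a breakdown like this makes the disparity visible.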

Photo Quality Matters Too

Another piece of the puzzle is the way images are captured. A mole photographed under bright clinic lights with a special camera looks very different from a photo taken at home on a phone. Lighting, focus and exposure can blur important details, especially on darker skin. In the past, cameras were literally calibrated for lighter complexions, and while modern devices are more inclusive, automatic settings can still struggle with very light or very dark subjects. If the AI was trained mostly on clear, high‑quality clinic images, it may not work well on grainy phone photos, which often come from patients of color.

Researchers have also observed selection bias: “good” images (sharp, close‑up pictures showing textbook examples of a lesion) are more likely to be chosen for model training, while “messier” photos, which might include many from darker‑skinned patients, are left out. This means the model isn’t exposed to the variety of real‑world images it will encounter.

What Happens If We Do Nothing?

Ignoring these gaps has real consequences. Skin cancer is common, and catching it early makes a big difference. Regular skin checks can spot potential problems before they become dangerous, and early treatment is more successful. Even for melanoma, the most serious form, early detection can mean a 98% five‑year survival rate. People of color already tend to be diagnosed later and have worse outcomes than people with light skin. If AI tools reinforce those disparities by missing cancers on dark skin or giving false reassurance, they could delay diagnosis even further.

There’s a trust issue too. If communities see that these tools don’t work well for them, they may mistrust not only AI but healthcare more broadly. On the flip side, if doctors rely too heavily on AI and don’t recognize its blind spots, they might overlook concerning lesions on brown or black patients. Instead of closing the gap in care, we risk widening it.

Looking Forward – A Path Toward Fair and Effective AI

The good news is that these problems are fixable. Here are a few ways forward:

  • Diversify the data: The most straightforward solution is to gather more images of skin conditions on darker skin tones. Initiatives like Stanford’s Skin Tone Analysis project and Google’s Skin Condition Image Network are working to create datasets that better reflect the full spectrum of skin colors.
  • Be transparent: Developers should clearly report what kinds of images are used to train and test their models. This makes it easier to spot gaps and biases before a tool is widely released.
  • Include diverse teams: Having people from different backgrounds involved in building and testing AI can help catch blind spots. If your team includes dermatologists and patients of color, you’re more likely to notice when the model fails on those populations.
  • Improve image collection: Encourage patients to submit photos with good lighting and provide simple instructions on how to capture clear images at home. Training AI on a wide range of photo quality, from professional to smartphone photos, will make it more robust.
  • Educate clinicians and patients: Doctors need to be aware of how skin cancer looks on all skin tones, and patients should know that melanoma can affect anyone. Debunking myths like “skin cancer only happens to fair‑skinned people” helps everyone take skin health seriously.
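The transparency point in particular is easy to act on: before releasing a tool, a developer can publish a simple breakdown of the training set by skin tone. A minimal sketch, using hypothetical labels grouped by the Fitzpatrick I–VI convention:

```python
from collections import Counter

def dataset_report(skin_tones):
    """Summarize a training set's skin-tone composition as percentages,
    the kind of breakdown a transparency report should include."""
    counts = Counter(skin_tones)
    n = len(skin_tones)
    return {tone: round(100 * count / n, 1) for tone, count in counts.items()}

# Invented label distribution for illustration only.
labels = ["I-II"] * 80 + ["III-IV"] * 15 + ["V-VI"] * 5
print(dataset_report(labels))  # → {'I-II': 80.0, 'III-IV': 15.0, 'V-VI': 5.0}
```

Publishing a table like this alongside a model makes gaps obvious at a glance, so reviewers and clinicians can judge for themselves whom the tool was built to serve.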

Final Thoughts

AI has incredible potential to make skin cancer screening faster and more accessible, especially in places where dermatologists are scarce. Early studies show that doctors, nurse practitioners and medical students all improve their diagnostic accuracy when AI assists. That’s worth celebrating. But these tools are only as good as the data and design behind them. If we let them mirror old biases, we risk leaving some patients behind.

Fairness isn’t a nice‑to‑have; it’s essential. Building AI that works for everyone means taking a hard look at whose skin the algorithms “see,” how images are gathered and labeled, and who gets a seat at the table during development. With thoughtful action now, we can ensure that the mole checker in your pocket works equally well for all skin tones and that AI’s promise truly benefits everyone.

References

Nicola Davis, “AI skin cancer diagnoses risk being less accurate for dark skin – study,” The Guardian (November 9, 2021).

Andrew Myers, “AI Shows Dermatology Educational Materials Often Lack Darker Skin Tones,” Stanford Human‑Centered AI (September 5, 2023).

News‑Medical.net, “Current AI programs do worse at identifying skin lesions in people of color, research shows” (March 7, 2024).

Stanford Medicine News Center, “AI improves accuracy of skin cancer diagnoses in Stanford Medicine-led study” (April 11, 2024).

Center for Dermatology, “Why Annual Skin Cancer Screenings Are Essential for Early Detection”.


Discover more from Health Viewpoints

Subscribe to get the latest posts sent to your email.
