AI Can Predict People's Race From X-Ray Images, And Scientists Are Concerned

Discussion in 'Hospital' started by The Good Doctor, May 24, 2022.


    Deep learning models based on artificial intelligence can identify someone's race just from their X-rays, new research has revealed – something that would be impossible for a human doctor looking at the same images.

    The findings raise some troubling questions about the role of AI in medical diagnosis, assessment, and treatment: could racial bias be unintentionally applied by computer software when studying images like these?

    Having trained their AI using hundreds of thousands of existing X-ray images labeled with details of the patient's race, an international team of health researchers from the US, Canada, and Taiwan tested their system on X-ray images that the computer software hadn't seen before (and had no additional information about).

    The AI could predict the reported racial identity of the patient on these images with surprising accuracy, even when the scans were taken from people of the same age and the same sex. The system hit levels of 90 percent with some groups of images.


    "We aimed to conduct a comprehensive evaluation of the ability of AI to recognize a patient's racial identity from medical images," write the researchers in their published paper.

    "We show that standard AI deep learning models can be trained to predict race from medical images with high performance across multiple imaging modalities, which was sustained under external validation conditions."

    The research echoes the results of a previous study that found artificial intelligence scans of X-ray images were more likely to miss signs of illness in Black people. To stop that from happening, scientists need to understand why it's occurring in the first place.

    By its very nature, AI mimics human thinking to quickly spot patterns in data. Yet this also means it can unwittingly succumb to the same kinds of biases. Worse still, the complexity of these systems makes it hard to untangle the prejudices we've woven into them.

    Right now, the scientists aren't sure why the AI system is so good at identifying race from images that don't contain such information, at least not on the surface. Even when limited information was provided – by removing clues about bone density, for instance, or focusing on a small part of the body – the models still performed surprisingly well at guessing the race reported in the file.
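    Degrading the inputs and re-measuring accuracy is a standard way to probe what a model relies on. Below is a hypothetical miniature of that kind of ablation – synthetic data and a nearest-centroid classifier as a stand-in for a deep model – in which a class signal spread across the whole image survives both added noise and cropping to a single quadrant, loosely mirroring the robustness the researchers describe.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 8x8 "images" with a faint class-dependent texture.
n = 300
labels = rng.integers(0, 2, size=n)
pattern = rng.normal(size=(8, 8))
images = rng.normal(size=(n, 8, 8)) + 0.6 * labels[:, None, None] * pattern

split = 200
train, test = images[:split], images[split:]
y_train, y_test = labels[:split], labels[split:]

# Nearest-centroid classifier as a stand-in for a deep model.
c0 = train[y_train == 0].mean(axis=0)
c1 = train[y_train == 1].mean(axis=0)

def predict(imgs, mask=1.0):
    """Classify by (optionally masked) distance to each class centroid."""
    d0 = (((imgs - c0) * mask) ** 2).sum(axis=(1, 2))
    d1 = (((imgs - c1) * mask) ** 2).sum(axis=(1, 2))
    return (d1 < d0).astype(int)

# Noise ablation: accuracy under increasingly corrupted inputs.
results = {}
for scale in (0.0, 0.5, 1.0, 2.0):
    noised = test + rng.normal(scale=scale, size=test.shape)
    results[scale] = float(np.mean(predict(noised) == y_test))
    print(f"noise std {scale}: accuracy {results[scale]:.2f}")

# Cropping ablation: keep only the top-left quadrant of every image.
quadrant = np.zeros((8, 8))
quadrant[:4, :4] = 1.0
cropped_acc = float(np.mean(predict(test, mask=quadrant) == y_test))
print(f"cropped to quadrant: accuracy {cropped_acc:.2f}")
```

    Because the toy signal is distributed over every pixel, no single corruption removes it entirely – one candidate explanation for why the real models kept working on degraded scans.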

    It's possible that the system is finding signs of melanin, the pigment that gives skin its color, that are as yet unknown to science.

    "Our finding that AI can accurately predict self-reported race, even from corrupted, cropped, and noised medical images, often when clinical experts cannot, creates an enormous risk for all model deployments in medical imaging," write the researchers.

    The research adds to a growing pile of evidence that AI systems can often reflect the biases and prejudices of human beings, whether that's racism, sexism, or something else. Skewed training data can lead to skewed results, making them much less useful.

    That needs to be balanced against the powerful potential of artificial intelligence to get through much more data much more quickly than humans can, everywhere from disease detection techniques to climate change models.

    There remain a lot of unanswered questions from the study, but for now it's important to be aware of the potential for racial bias to show up in artificial intelligence systems – especially if we're going to hand more responsibility over to them in the future.

    "We need to take a pause," research scientist and physician Leo Anthony Celi from the Massachusetts Institute of Technology told the Boston Globe.

    "We cannot rush bringing the algorithms to hospitals and clinics until we're sure they're not making racist decisions or sexist decisions."
