The Basque Country is implementing Quantus Skin in its health clinics after an investment of 1.6 million euros. Specialists criticise the artificial intelligence developed by the Asisa subsidiary due to its “poor” and “dangerous” results. The algorithm has been trained only with data from white patients.

  • BassTurd@lemmy.world · 22 hours ago

    My only real counter to this is: who created the dataset, and did the people building the app have any power to affect that? To me, calling something racist implies intent. This situation could be that, but it could also just be a dataset that isn’t racially diverse, which doesn’t necessarily imply racism.

    There’s a plethora of reasons the dataset may be mostly fair-skinned. To rattle off a couple that come to mind (all of this may be known, idk, these are ignorant possibilities on my side): perhaps more fair-skinned people are susceptible to melanoma, so there’s more data; like you mentioned, dark-skinned individuals may have fewer options to get medical help; or maybe the dataset came from a region without many dark-skinned patients. Again, all ignorant speculation on my part, but I would say that none of those possibilities inherently makes the model racist, just not a good model. Maybe racist actions led to a bad dataset, but if that’s out of the devs’ control, then I wouldn’t personally put that negative on the model.

    Also, my interpretation of what racist means may differ, so there’s that too. Or it could have all been done intentionally, in which case, yeah, racist 100%.

    Edit: I actually read the article. It sounds like they used public datasets that did have mostly Caucasian people. They also acknowledged that fair-skinned people are significantly more likely to get melanoma, which does give some credence to the unbalanced dataset. It’s still not ideal, but I would also say that maybe nobody should put all of their eggs in the basket of an AI screening tool, especially for something like cancer.

    • xorollo@leminal.space · 19 hours ago

      There is a more specific term for it: institutional racism.

      Institutional racism, also known as systemic racism, is a form of institutional discrimination based on race or ethnic group and can include policies and practices that exist throughout a whole society or organization that result in and support a continued unfair advantage to some people and unfair or harmful treatment of others. It manifests as discrimination in areas such as criminal justice, employment, housing, healthcare, education and political representation.[1]

    • WanderingThoughts@europe.pub · 21 hours ago

      My only real counter to this is: who created the dataset, and did the people building the app have any power to affect that?

      A lot of early AI research was done largely by Caucasian students, so the datasets they used skewed that way, and later projects very often started from those initial datasets. The historical reason there are more students of that skin tone is that they generally have the most money to finance schooling, and that’s because past racism held African-American families back from accumulating wealth and accessing education, which still affects their finances and chances today, even assuming no racism is going on in scholarships and admissions these days.

      Not saying this is specifically happening for this project, just that it does in a lot of AI projects in general. It causes issues with facial recognition in lots of apps, for example.

      • BassTurd@lemmy.world · 21 hours ago

        They did touch on the facial recognition aspect as well. My main thing is: does it make the model racist if the source data isn’t diverse? I’d argue it doesn’t, although racist decisions may have led to a poor dataset.