• kristina [she/her]@hexbear.net
    6 days ago

machine learning is different from “ai/llms/generative ai”, doesn’t help that the company who makes it is calling it ai either

  • Le_Wokisme [they/them, undecided]@hexbear.net
    6 days ago

mentions false positives but gives no idea of the rate

doesn’t address confounding variables, like that time it was just a correlation with older MRI machines and poorer areas, so maybe there are regional differences in how the technicians hold the wands or something.

    doesn’t say anything about what kind of machine learning was used and “AI” by itself is a meaningless term thanks to the grifting slop mongers.

    • fanbois [he/him]@hexbear.net
      6 days ago

Valid concerns, but image recognition is absolutely something neural networks are good at. Finding a pattern of pixels in a set of greyscale images that points to a “cancerous growth” vector and then highlighting it in a diagnostic program seems like a very appropriate task.

It’s a simple but very large data set, it’s a very specific task, and it doesn’t require 100% deterministic truth, because doctors really don’t do anything else but “that bit here looks a bit darker, we need to check that”. It’s a good application of a powerful technology.

As long as you don’t make it write the report, give it the power to administer or deny a biopsy, integrate a chat bot, need a data center the size of Belgium, or fire your oncology department, that’s it.
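The “that bit here looks a bit darker” check above can be sketched in a few lines, leaving the actual neural network aside. This is a purely illustrative toy, not anything a real diagnostic program uses: the function name, window size, and threshold are all made-up assumptions, and it just flags pixels noticeably darker than their neighbourhood in a greyscale grid.

```python
# Hypothetical sketch: flag pixels in a greyscale "scan" that are
# noticeably darker than their local neighbourhood -- the kind of
# "that bit looks a bit darker, check that" highlighting described
# above. All names and thresholds here are illustrative assumptions.

def flag_dark_regions(image, window=1, threshold=30):
    """Return (row, col) coordinates whose pixel value is at least
    `threshold` below the mean of the surrounding window."""
    h, w = len(image), len(image[0])
    flagged = []
    for r in range(h):
        for c in range(w):
            # Gather the neighbouring pixel values, clipped at the edges.
            neighbours = [
                image[rr][cc]
                for rr in range(max(0, r - window), min(h, r + window + 1))
                for cc in range(max(0, c - window), min(w, c + window + 1))
                if (rr, cc) != (r, c)
            ]
            mean = sum(neighbours) / len(neighbours)
            if mean - image[r][c] >= threshold:
                flagged.append((r, c))
    return flagged

# Toy 5x5 "scan": uniformly bright (200) with one dark pixel (100).
scan = [[200] * 5 for _ in range(5)]
scan[2][2] = 100
print(flag_dark_regions(scan))  # only the dark pixel is flagged
```

A real system would use a trained convolutional network rather than a fixed threshold, but the output shape is the same: regions for a human to look at, not a verdict.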