• Phen · 29 days ago

    My guess is the ones that have prostate cancer.

    • ikt@aussie.zone · 29 days ago

      It says in the article:

      The test uses AI to study images of tumours and pick out features invisible to the human eye. The team, funded by Prostate Cancer UK, the Medical Research Council and Artera, trialled the test on biopsy images from more than 1,000 men with high-risk prostate cancer that had not spread.

      The AI test identified the 25% of men in the group most likely to benefit from the abiraterone – for these men, the drug halves the risk of death.

      In the trial, patients received a score – biomarker-positive or -negative – which was compared with their outcomes. For those with biomarker-positive tumours, one in four of the men, abiraterone cut their risk of death after five years from 17% to 9%.

      For those with biomarker-negative tumours, abiraterone cut the risk of death from 7% to 4% – a difference that was not statistically or clinically significant, the team said. These men would benefit from standard therapy alone and be spared unnecessary treatment.
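      A quick back-of-the-envelope check on those numbers (a minimal sketch; it just plugs in the percentages quoted above):

        # Back-of-the-envelope check of the five-year death risks quoted above.
        def risk_reduction(control_risk, treated_risk):
            absolute = control_risk - treated_risk   # percentage-point drop
            relative = absolute / control_risk       # proportional drop
            return absolute, relative

        for group, control, treated in [
            ("biomarker-positive", 0.17, 0.09),
            ("biomarker-negative", 0.07, 0.04),
        ]:
            arr, rrr = risk_reduction(control, treated)
            print(f"{group}: {arr:.0%} absolute, {rrr:.0%} relative reduction")

        # biomarker-positive: 8% absolute, 47% relative reduction  (roughly "halves the risk")
        # biomarker-negative: 3% absolute, 43% relative reduction  (not significant per the article)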

  • PennyRoyal@sh.itjust.works · 29 days ago

    We need more enlightening terms for this stuff. The hallucinatory, waffling waste of energy isn’t AI; it’s LLMs. The models that work well in medicine generally aren’t LLMs, and aren’t even generative AI; they’re very advanced data aggregation systems. If we keep calling everything AI, it all gets erroneously lumped together and tarred with the same brush, held in a badly drawn hand with too many fingers. The data science used in medical imaging is a world away from LLM nonsense, and has been making decent progress for at least a decade - here are some of the more amazing uses.
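    To make the distinction concrete, here is a minimal, purely illustrative sketch of that kind of model - a supervised classifier scoring image-derived features. The data, features and model choice below are made up, not taken from any real medical system:

      # Purely illustrative: a supervised classifier over image-derived features,
      # the kind of discriminative model used in imaging work, as opposed to a
      # text-generating LLM. Data, features and model choice are all made up.
      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      features = rng.normal(size=(1000, 64))   # pretend feature vectors from biopsy images
      labels = rng.integers(0, 2, size=1000)   # pretend outcome labels

      X_train, X_test, y_train, y_test = train_test_split(features, labels, random_state=0)

      # It scores inputs against labels; it doesn't generate anything.
      model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
      print("held-out accuracy:", model.score(X_test, y_test))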

    • I’ve been wanting the same for years, because even the rudimentary movement and behaviour patterns of an NPC in a video game are called “AI”, and that’s not really AI either, in the big sense of an intelligence that was artificially created. True, it’s artificial and mimics the behaviour of something intelligent, but it’s not actually thinking at all, and once you know the rules an NPC follows, you can exploit them. Hell, in a lot of games that’s actually the intended thing to do, and they wouldn’t work at all if the NPCs had a real AI governing them.
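
      A scripted NPC is often little more than a hand-written rule table. Here’s a toy sketch, with the states and thresholds invented purely for illustration:

        # Toy NPC "AI": fixed, hand-written rules, not learned behaviour.
        # States and thresholds are invented for illustration only.
        def npc_action(distance_to_player: float, npc_health: float) -> str:
            if npc_health < 0.2:
                return "flee"
            if distance_to_player < 5.0:
                return "attack"
            if distance_to_player < 15.0:
                return "chase"
            return "patrol"

        # Once you know the rules, you can exploit them, e.g. by standing
        # just outside the 15.0 aggro radius.
        print(npc_action(distance_to_player=15.1, npc_health=1.0))  # patrol
        print(npc_action(distance_to_player=4.0, npc_health=0.1))   # flee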

    • BrikoX@lemmy.zip (OP, mod) · 29 days ago

      <…> they’re very advanced data aggregation systems. If we keep calling everything AI, it all gets erroneously lumped together and tarred with the same brush, held in a badly drawn hand with too many fingers.

      Also known as machine learning. But that term was killed off by the industry on purpose to exploit the “AI” hype bubble.

    • Fleur_@aussie.zone · 29 days ago

      I think for non-tech people (those who can’t write a line of code in any language), the processes are similar enough to justify a general term. I think “AI” is fine for this purpose, even ignoring the fact that it’s probably too late to go back.

    • Opinionhaver@feddit.uk · 29 days ago

      The term artificial intelligence is broader than many people realize. It doesn’t mean human-level consciousness or sci-fi-style general intelligence - that’s a specific subset called AGI (Artificial General Intelligence). In reality, AI refers to any system designed to perform tasks that would typically require human intelligence. That includes everything from playing chess to recognizing patterns, translating languages, or generating text.

      Large language models fall well within this definition. They’re narrow AIs - highly specialized, not general - but still part of the broader AI category. When people say “this isn’t real AI,” they’re often working from a fictional or futuristic idea of what AI should be, rather than how the term has actually been used in computer science for decades.

  • notaviking@lemmy.world · 29 days ago

    Me: ChatGPT, will this prostate drug work for me?

    ChatGPT: According to multiple articles on the web, your chosen drug might be beneficial. But it is best to consult with a health professional. Here are several in your area.

    Me: My god, it will work…