• H_Interlinked@lemmy.world
    10 months ago

    I work in a clinical setting where some doctors are trying an AI program that generates their clinical notes from the casual conversation between them and the patient. It’s way off the mark for the quality we demand. It requires significant editing from the healthcare provider, and when the note needs to be thorough it quickly becomes more of a chore than modern voice transcription. Our review is not great so far.

    • kromem@lemmy.world
      10 months ago

      That’s a terrible way to be using an LLM for generating clinical notes.

      Sounds more like trying to use a screwdriver to hammer in screws than an issue with the screwdriver itself.

      • H_Interlinked@lemmy.world
        10 months ago

        Right. It seemed like a reach when I first heard of it, but that’s how it’s advertised, and the hospital was sold on at least trying it out.

      • Eggyhead@kbin.social
        10 months ago

        > Sounds more like trying to use a screwdriver to hammer in screws

        This is how I think about AI being forced into so many things these days. It feels more like an attempt to justify subscription plans than anything actually productive.