I know it’s not even close to there yet. It can tell you to kill yourself or to kill a president. But what about when I finish school in like 7 years? Who would pay for a therapist or a psychologist when you can ask a floating head on your computer for help?

You might think this is a stupid and irrational question. “There is no way AI will ever do psychology well.” But I think in this day and age it’s a pretty fair thing to ask when you’re deciding about your future.

  • Macaroni_ninja@lemmy.world · 1 year ago

    I don’t think the AI everyone is so buzzed about today is really a true AI. As someone summed it up: it’s more like a great autocomplete feature, but it’s not great at understanding things.

    It will be great at replacing Siri and the Google Assistant, but not at giving people professional advice, not by a long shot.

    • Zeth0s@lemmy.world · 1 year ago

      Not saying an LLM should substitute for a professional psychological consultant, but that summary is clearly wrong and doesn’t reflect how current AI works. Just FYI.

      • Macaroni_ninja@lemmy.world · 1 year ago

        Care to elaborate?

        It’s an oversimplified statement from someone (sorry, I don’t have the source), and I’m not exactly an AI expert, but my understanding is that current commercial AI products are nowhere near the “think and judge like a human” definition. They can scrape the internet for information, use it to react to prompts, and do a fantastic job of imitating humans, but the technology is simply not there.

        • Zeth0s@lemmy.world · 1 year ago

          The technology for human intelligence? Any technology will always be very different from human intelligence. What you’re probably referring to is AGI, artificial general intelligence: an “intelligent” agent that doesn’t excel at any one thing, but can handle a huge variety of scenarios and tasks, as humans do.

          LLMs are specialized models for generating fluent text, but they are very different from autocompletes because they can work with concepts, semantics, and (pretty surprisingly) rather complex logic.

          As an oversimplification, even humans are fancy autocompletes. They are just different, as LLMs are different.
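
          To make the “autocomplete” analogy concrete, here’s a toy sketch of what a literal autocomplete looks like: a bigram model that only counts which word follows which. Everything here (the corpus, the function names) is illustrative, and real LLMs are nothing like this internally; the point is that this kind of lookup has no concepts or semantics at all, which is where the comparison breaks down.

          ```python
          from collections import defaultdict, Counter

          # A literal "autocomplete": count which word follows which in a tiny corpus.
          # (Purely illustrative -- LLMs learn contextual representations, not word-pair counts.)
          corpus = "the cat sat on the mat the cat ate the fish".split()

          following = defaultdict(Counter)
          for prev, nxt in zip(corpus, corpus[1:]):
              following[prev][nxt] += 1

          def complete(word):
              """Return the most frequent next word after `word`, or None if unseen."""
              counts = following.get(word)
              return counts.most_common(1)[0][0] if counts else None

          print(complete("the"))  # "cat" -- it follows "the" twice, more than "mat" or "fish"
          ```

          A model like this can only parrot word pairs it has literally seen; it can’t generalize, follow instructions, or handle logic, which is why “fancy autocomplete” undersells what LLMs do.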