• ugo@feddit.it · 1 month ago

      But the article author wasn’t interfacing with ChatGPT; she was interfacing with a human paid to help with the things she did not know. The wedding planner was a supposed expert in this interaction, but instead simply sent back regurgitated ChatGPT slop.

      Is this the fault of the wedding planner? Yes. Is it the fault of ChatGPT? Also yes.

      • conciselyverbose@sh.itjust.works · 1 month ago

        Scams are LLMs’ best use case.

        They’re not capable of actual intelligence, and they can’t produce anything that would remotely mislead a subject matter expert. You’re not going to convince a skilled software developer that your LLM slop is competent code.

        But they’re damn good at looking the part to convince people who don’t know the subject that they’re real.

    • Pandemanium@lemm.ee · 1 month ago

      I think we should require professionals to disclose whether or not they use AI.

      Imagine you’re an author and you pay an editor $3000, and all they do is run your manuscript through ChatGPT. One, they didn’t provide any value, because you could have done the same thing for free; and two, if they didn’t disclose the use of AI, you wouldn’t even know your novel had been fed into one and might be used for training.

      • bitofhope@awful.systems · 1 month ago

        I think we should require professionals not to use the thing currently termed AI.

        Or, if you think it’s unreasonable to ask them not to contribute to a frivolous and destructive fad, or you don’t think the environmental and social impacts are bad enough to warrant a ban like this, then at least we should require professionals not to use LLMs for technical information.