• SpikesOtherDog@ani.social
    1 year ago

    it makes things up, it makes no attempt to adhere to reason when it’s making an argument.

It hardly understands logic. I’m using it to generate content and it will continuously assert information in ways that don’t make sense, relate things that aren’t connected, and forget facts that don’t flow into the response.

    • mayonaise_met@feddit.nl
      1 year ago

      As I understand it as a layman who uses GPT4 quite a lot to generate code and formulas, it doesn’t understand logic at all. Afaik, there is currently no rational process that considers whether what it’s about to say makes sense and is correct.

      It just sort of bullshits its way to an answer based on which words seem likely according to its model.
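
      The “words seem likely” idea can be sketched as plain next-token prediction. Here’s a toy bigram model in Python — the words and probabilities are entirely made up for illustration (real LLMs use neural networks over enormous vocabularies), but the key point holds: each word is picked purely by probability, and no step ever checks whether the output is true or logical.

      ```python
      import random

      # Toy bigram "model": probability of the next word given the current word.
      # All words and probabilities here are invented for illustration.
      model = {
          "the": {"cat": 0.5, "dog": 0.3, "answer": 0.2},
          "cat": {"sat": 0.6, "ran": 0.4},
          "dog": {"ran": 0.7, "sat": 0.3},
          "answer": {"is": 1.0},
          "sat": {"down": 1.0},
          "ran": {"away": 1.0},
      }

      def generate(start, steps, seed=0):
          """Sample each next word from the model's probabilities.
          Nothing here reasons about whether the sentence is correct."""
          rng = random.Random(seed)
          words = [start]
          for _ in range(steps):
              options = model.get(words[-1])
              if not options:
                  break  # no known continuation; a real model never runs dry
              tokens, probs = zip(*options.items())
              words.append(rng.choices(tokens, weights=probs)[0])
          return " ".join(words)

      print(generate("the", 3))
      ```

      Pointing it “in the right direction”, in this picture, just means changing the preceding words so that different continuations become likely — which is why a misleading prompt steers it wrong with exactly the same confidence.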

      That’s why you can point it in the right direction and it will sometimes appear to apply reasoning and correct itself. But you can just as easily point it in the wrong direction and it will do that just as confidently too.