• Pons_Aelius@kbin.social · 1 year ago

    Quote:

    Still prone to hallucinations

    Since Folax is inherently drawing responses from ChatGPT, it can often hallucinate and present incorrect answers — often very confidently. Once again, the only way to remedy this is to upgrade to newer models, such as GPT-4 (or equivalent), which have fewer hallucinations and more accurate responses.

    Counterpoint: Moving to GPT-4 makes it harder to realise when the reply is complete bullshit.

    • ink@r.nf · 1 year ago

      This is why ChatGPT needs to provide sources and references. But since it scraped its training data indiscriminately, doing so would land them in legal trouble. There are services like perplexity[.]ai that apparently use an internet search plugin for ChatGPT and list their sources; that's much better if you want to check the validity of the things it spits out.
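
      The underlying pattern is simple enough to sketch. Below is a rough
      guess at the general "search, then cite" approach, not Perplexity's
      actual implementation: web_search() is a hypothetical placeholder you
      would back with a real search API, and the rest uses the openai>=1.0
      Python client.

      from openai import OpenAI

      client = OpenAI()  # reads OPENAI_API_KEY from the environment

      def web_search(query: str) -> list[dict]:
          """Hypothetical placeholder: return [{'url': ..., 'snippet': ...}, ...]."""
          raise NotImplementedError("plug in a real search backend here")

      def answer_with_sources(question: str) -> str:
          results = web_search(question)
          # Number each snippet so the model can cite it as [1], [2], ...
          context = "\n".join(
              f"[{i + 1}] {r['url']}\n{r['snippet']}" for i, r in enumerate(results)
          )
          response = client.chat.completions.create(
              model="gpt-4",
              messages=[
                  {"role": "system",
                   "content": "Answer using ONLY the numbered sources below "
                              "and cite them as [n]. If the sources don't "
                              "cover the question, say so instead of "
                              "guessing.\n\n" + context},
                  {"role": "user", "content": question},
              ],
          )
          return response.choices[0].message.content

      Grounding the answer in numbered snippets doesn't eliminate
      hallucinations, but it gives you the links to check, which is the
      whole point.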