• @fidodo@lemmy.world · 18 points · 5 months ago

    More like large guessing models. They have no thought process; they just produce words.

  • @TotallynotJessica@lemmy.world · 12 points · 5 months ago

      They don’t even guess. Guessing would imply that they understand what you’re talking about. They only process the language, not the concepts. It’s the practical embodiment of the Chinese room thought experiment: they generate a response based on the symbols, but not on the ideas the symbols represent.
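      A toy sketch of that idea (purely illustrative; real LLMs are neural networks, not bigram tables, and all the names here are made up): a generator that strings words together purely from symbol co-occurrence counts, with no notion of what any of the words mean.

      ```python
      import random
      from collections import defaultdict

      # Tiny "corpus" of symbols. The model below never sees meanings,
      # only which word tends to follow which.
      corpus = (
          "the model predicts the next word "
          "the model has no idea what the words mean "
          "the next word is just the most familiar symbol"
      ).split()

      # Count which word follows each word.
      transitions = defaultdict(list)
      for prev, nxt in zip(corpus, corpus[1:]):
          transitions[prev].append(nxt)

      def generate(start, length=10):
          """Produce a 'response' by repeatedly sampling a likely next symbol."""
          word, out = start, [start]
          for _ in range(length):
              options = transitions.get(word)
              if not options:
                  break
              word = random.choice(options)
              out.append(word)
          return " ".join(out)

      print(generate("the"))
      ```

      The output can look superficially sentence-like, but nothing in the program represents the ideas behind the words, which is the point being made about symbol manipulation.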