• lepinkainen@lemmy.world
    3 days ago

    They don’t use “ChatGPT” — you can use local models, and they cost next to nothing at Meta’s or Google’s scale. Both run their own servers for it.

    • stebo@sopuli.xyz
      3 days ago

      of course they don’t use chatgpt and whatever they use isn’t comparable to chatgpt cuz that would be unsustainable

      • lepinkainen@lemmy.world
        3 days ago

        “Isn’t comparable”? For generic tasks that’s true.

        Figuring out shittily censored words from pictures and subtitles? The custom models are even better at that.