• Tartas1995@discuss.tchncs.de
    2 days ago

“AI bad” is obviously stupid.

“Current LLMs bad” is very true, though. The methods used to create them are immoral and arguably illegal. In fact, some of the AI companies are pushing to make what they did clearly illegal. How convenient…

And I hope you understand that pointing out that running an LLM locally consumes about as much power as gaming is completely missing the point, right? The training, and the required ongoing retraining, is what makes it so wasteful. That is like saying that eating bananas in winter in Sweden doesn’t generate much CO2 because the supermarket isn’t far away.

    • daniskarma@lemmy.dbzer0.com
      2 days ago

      I don’t believe in intellectual property. I’m actually very much against it.

      But if you do believe in it for some reason, there are models trained exclusively on open data. The Spanish government recently released a model called ALIA; it was done 100% with open data, none of which was proprietary.

      Training energy consumption is not a problem because it happens so sparsely. It’s like complaining about animated movies because rendering takes months and uses a lot of power. It’s an irrational argument. I don’t buy it.

      • Tartas1995@discuss.tchncs.de
        2 days ago

        I am not necessarily for intellectual property, but as long as they want to hold IP on their own stuff, they should respect everyone else’s. That is what is immoral.

        How does it happen sparsely? The training time for e.g. GPT-4 was about 4 months. ChatGPT (GPT-3.5) was released in November 2022; GPT-4 was released in March 2023. How many months is that? Oh, look at that… They train their AIs 24/7. Training GPT-4 consumed about 7,200 MWh. The average American household consumes a little less than 11,000 kWh per year. So in a third of a year, they consumed about 654 times the annual electricity of an average American household. At that pace, they consume around 2,000 times the electricity of an average American household per year. And that is just training, and just electricity; we haven’t even talked about the water. We are also ignoring that they are scaling up, so that assumes they would use the same resources to train their next models, which they didn’t.

        Edit: side note, in 2024, ChatGPT was projected to use 226.8 GWh.
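        The arithmetic above can be checked in a few lines. This is a rough sketch that simply takes the comment’s own figures (7,200 MWh for training, ~4 months of training time, ~11,000 kWh per household per year) as given:

```python
# Rough check of the energy comparison, using the figures from the comment.
training_kwh = 7_200 * 1_000          # ~7,200 MWh reported for GPT-4 training
household_kwh_per_year = 11_000       # avg. US household electricity per year

# One training run took ~4 months, i.e. a third of a year:
per_run = training_kwh / household_kwh_per_year
# Training around the clock means roughly 3 such runs per year:
per_year = per_run * 3

print(int(per_run))   # ~654 household-years of electricity per run
print(int(per_year))  # ~1963, i.e. "around 2000" per year
```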

        • daniskarma@lemmy.dbzer0.com
          2 days ago

          2,000 times the usage of a household, taking your approximations as correct, for something that’s used by millions, potentially billions, of people is not bad at all.

          Probably comparable to 3D movies or many other industrial computing uses, like search indexers.

          • Tartas1995@discuss.tchncs.de
            2 days ago

            Yeah, but then they start “gaming”…

            I just edited my comment, so no wonder you missed it.

            In 2024, ChatGPT was projected to use 226.8 GWh. You see, if people are “gaming” 24/7, it is quite wasteful.

            Edit: just in case it isn’t obvious: the hardware needs to be produced, the data collected, and they are scaling up. So my point was that even if you occasionally run a small LLM locally, more energy is consumed than just the energy used for that one prompt.
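            For scale, the 226.8 GWh projection mentioned above works out to roughly twenty thousand household-years of electricity. A quick sketch, again taking the comment’s figures (226.8 GWh projected, ~11,000 kWh per household per year) as given:

```python
# How many average US households the projected 2024 usage corresponds to.
projected_kwh = 226.8 * 1_000_000     # 226.8 GWh expressed in kWh
household_kwh_per_year = 11_000       # avg. US household electricity per year

print(int(projected_kwh / household_kwh_per_year))  # ~20618 households
```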