The biggest challenge to getting an agreement over the European Union’s proposed AI Act has come from France, Germany and Italy, which favour letting makers of generative AI models self-regulate instead of facing hard rules.

Well, we saw what happened (allegedly) when OpenAI tried “self-regulating”.

  • frog 🐸
    19 points · 3 months ago

    Can anyone name any sector that, when left to self-regulate, has actually behaved in a responsible and constructive manner? Any company that did the right thing in the absence of regulation telling them to?

      • frog 🐸
        7 points · edited · 3 months ago

        Maybe it’s different in the US, but in my country, age ratings on media aren’t the sector regulating itself. Film and TV are rated by an NGO (who issue ratings based on a list of criteria determined by government legislation), and games are rated by an organisation that is accountable to the government. So I’d consider both to be externally regulated, not self-regulated.

          • frog 🐸
            6 points · 3 months ago

            I suspect the games industry has managed to self-regulate its own ratings in large part because TV and film ratings are so often codified in law. That provides a baseline for what is and isn’t acceptable for certain audiences, and makes it obvious that regulation will happen one way or another. The existing TV and film ratings systems also create an environment where consumers expect something similar for games. Regulation of visual entertainment has basically been normalised for the entire lifetime of most people alive today.

            I think it also explains why the games industry has been bad at self-regulating on gambling mechanics like lootboxes. When a game has graphic violence or sex, it’s easy to draw a comparison with film and TV and pre-emptively self-regulate. The games industry can manage that because everybody involved is familiar with film and TV - and may even have worked in that industry before, since there are many skill overlaps. But the organisations doing the ratings seem far less familiar with the gambling industry, and therefore haven’t given enough thought to how they ought to self-regulate on that. The lack of self-regulation on lootboxes is severe enough that external regulation appears to be necessary.

            And I think this ultimately highlights why AI will need external regulation. The only sector that has successfully self-regulated is one that already had a base of comparison with a separate-but-similar sector that has an existing history of regulation. AI doesn’t have anything comparable to use as a baseline. While game devs could look at the history of film and TV regulation and make a good guess as to what would happen if they didn’t regulate themselves, the AI devs think they’re untouchable.

    • @Even_Adder@lemmy.dbzer0.com
      1 point · 3 months ago

      I don’t think we can let the current big AI players regulate themselves, but the ESRB hasn’t been too bad at doing its job.

  • ultratiem
    10 points · 3 months ago

    We let Google self-regulate. We let Facebook self-regulate. We let Microsoft self-regulate. How did that turn out again??

    • Bebo (OP)
      2 points · 3 months ago

      When profits are of concern, “self-regulation” gets defenestrated!

    • 4dpuzzle
      1 point · 3 months ago

      At this point, I think regulations are useless. Not because these companies aren’t harmful, but because they will either convince the government that they’ll self-regulate, or they’ll use their insane profits to lobby politicians into gutting the regulatory agencies. I’m convinced that the only way to prevent these greedy scum from harming humanity is to never let them grow that big in the first place. When these companies are big enough to control the government, they should be cut down to size with a healthy margin of safety.