• BetaDoggo_@lemmy.world (+159/-18) · 10 months ago

    Nobody cares until someone rich is impacted. Revenge porn has been circulating on platforms uninhibited for many years, but the second it happens to a major celebrity suddenly there’s a rush to do something about it.

    • givesomefucks@lemmy.world (+89/-6) · 10 months ago

      What?

      This isn’t revenge porn, it’s fakes of celebrities.

      Something that has been done for decades, and was one of the biggest parts of early reddit. So it’s not “the second” either.

      The only thing that’s changed is people are generating it with AI.

      The ones made without AI (that have been made for decades) are a lot more realistic and a lot more explicit. It just takes skill and time, which is why people were only doing it for celebrities.

      The danger of AI is that any random person could take some pictures off social media and make explicit images. The technology isn’t there yet, but it won’t take much longer.

    • Ð Greıt Þu̇mpkin@lemm.ee (+22/-1) · 10 months ago

      I think it’s more about the abject danger that unregulated AI replication of noteworthy figures poses to basically everything.

      Also, revenge porn is illegal in, I think, every state but South Carolina, and even there it might have been banned since I saw that stat.

    • Mango@lemmy.world (+8/-2) · 10 months ago

      You think it wasn’t celebrities first? The issue here is specifically Taylor Swift.

    • Fades@lemmy.world (+10/-11) · 10 months ago

      What a braindead take. Both the US and many other countries have enacted AI safety and regulation rules; this is an extension of that effort. The idea is to set a precedent for this kind of behavior. They are also looking into how AI is being used for election interference, like having an AI Biden tell people not to vote.

      Everybody cares; just because it’s not all in place on day 0 doesn’t mean nobody does.

  • Aniki 🌱🌿@lemm.ee (+67/-17) · 10 months ago

    This wasn’t a problem until the rich white girl got it. Now we must do… something. Let’s try panic!

    -The White House, probably.

    • frickineh@lemmy.world (+25/-10) · 10 months ago

      Honestly, I kind of don’t even care. If that’s what it takes to get people to realize that it’s a serious problem, cool. I mean, it’s aggravating, but at least now something might actually happen that helps protect people who aren’t megastars.

        • awwwyissss@lemm.ee (+8/-18) · 10 months ago

          Blah blah blah so tiring to hear this thoughtless perspective constantly pushed in the fediverse.

          • PapaStevesy@midwest.social (+10) · 10 months ago

            I’m just saying, nothing about this should lead anyone to the conclusion that anyone in power is suddenly going to start caring about poor people. They’re literally only talking about this because a billionaire got its feelings hurt.

            • awwwyissss@lemm.ee (+1) · 10 months ago

              Fair enough, and at the end of the day I probably hate billionaires as much as you do.

          • eskimofry@lemmy.world (+5/-4) · 10 months ago

            If you don’t like discourse that is different from your beliefs, then plug your ears and shout lalala as you have been doing for decades.

            • intensely_human@lemm.ee (+1) · 10 months ago

              Someone complaining about the same thoughtless perspective is not complaining about discourse.

              Just once I’d love to have actual discourse about capitalism. I’ve never met a person who expressed hatred of capitalism who seemed capable of discourse, unfortunately.

      • intensely_human@lemm.ee (+3/-3) · 10 months ago

        The only thing that could possibly happen to protect people from this is to make AI illegal. That would be (a) impossible to enforce without a draconian decrease in individual freedom, like keeping people stuffed in crates of packing foam instead of free to move around, and (b) absolutely horrible if it were successfully enforced.

        AI is cheaper and easier to proliferate than any drug. We have not succeeded in controlling drugs, despite their physical requirements of mass and volume making them visible in reality, a feature AI does not share.

        The attempt to control AI can and will destroy all our freedoms if we let it. Again, the only way to control something so ephemeral as computation is to massively restrict all freedom.

    • Fades@lemmy.world (+8/-6) · 10 months ago

      It absolutely was a problem before, and it’s not because Taylor is white. Revenge porn laws aren’t new, and AI legislation was already in the works before this popped off.

      You also gonna say nobody cared about election interference until an AI recording of Biden told people not to vote?

      Just because you weren’t aware doesn’t mean it wasn’t happening.

  • Zozano@lemy.lol (+34/-2) · 10 months ago

    Do you want more AI gens of nude Taylor Swift? Because that’s how you get more AI gens of nude Taylor Swift.

    • remotelove@lemmy.ca (+24/-8) · 10 months ago

      Well, it’s not really just about Swift. There are probably many other people that are going through this. Not every person who generates nudes of someone else is going to make it to the news, after all.

      I could see this being a problem in high schools, used for really mean pranks. That is not good. There are a million other ways I could see fake nudes being used against someone.

      If someone spread pictures of me naked, I would 1. be flattered and 2. really ask why someone wants to see me naked in the first place.

      If anything, just an extension of any slander(?) laws would work. It’s going to be extremely hard to enforce any law though, so there is that.

      However, how long have revenge porn laws been a thing? Were they ever really a thing?

  • Bonesy91@lemmy.world (+33/-15) · 10 months ago

    This is what the White House is concerned about… Fuck them. Like there is so much worse going on in America, but oh no, one person has AI fake porn images, heaven forbid!

    • MirthfulAlembic@lemmy.world (+24/-7) · 10 months ago

      The White House is capable of having a position on more than one issue at a time. There also doesn’t seem to be a particular bill they are touting, so this seems to be more of a “This is messed up. Congress should do something about it” situation than “We’re dropping everything to deal with this” one.

      • go_go_gadget@lemmy.world (+9/-6) · 10 months ago

        The White House is capable of having a position on more than one issue at a time.

        Doubt.

    • XeroxCool@lemmy.world (+12/-7) · 10 months ago

      Nice job reading the article, any one of these articles, to actually get context and not just react to headlines.

      People are asking about Swift. The government isn’t buddying up to her specifically. Swift is only the most famous face of an issue that has been growing very quickly.

  • cosmicrookie@lemmy.world (+25/-7) · 10 months ago

    Wait… They want to stop only Taylor Swift AI fakes? Not every AI fake representing a real person???

    • AngryCommieKender@lemmy.world (+16/-4) · 10 months ago

      Only AI fakes of billionaires. They’re just admitting that there’s a two-tiered legal system, and if you’re below a certain “value,” you will not be protected.

    • ehrik@lemmy.world (+16/-4) · 10 months ago

      Y’all need to read the article and stop rage baiting. It’s literally a click away.

      “Legislation needs to be passed to protect people from fake sexual images generated by AI, the White House said this afternoon.”

  • CALIGVLA@lemmy.dbzer0.com (+33/-16) · 10 months ago

    U.S. government be like:

    Thousands of deep fakes of poor people: I sleep.

    Some deep fakes of some privileged Hollywood elite: R E A L S H I T.

  • thantik@lemmy.world (+23/-6) · 10 months ago

    I’d much rather we do nothing and let it proliferate to the point where nobody trusts nudes at all anymore.

      • thantik@lemmy.world (+1) · 10 months ago

        You really need healthier relationships in your life, I think; my wife would have no reason to do such a thing.

      • thantik@lemmy.world (+27/-6) · 10 months ago

        That’s perfect. It should be legal. Making pornography of someone illegal is just a different scale of grey from, say, making drawing Muhammad illegal, etc.

        I can already hire an artist to make me some porn of… I dunno… Obama or something. Why should that be illegal just because someone does it with AI instead?

          • thantik@lemmy.world (+19/-3) · 10 months ago

            Hate to break it to you, this is already legal. “Non-consensual porn” only applies to photographs. Nobody should have to consent to everything like that.

            If I draw you standing under the Eiffel Tower, fully clothed, the legality shouldn’t change just because you don’t LIKE what’s being drawn.

            • Эшли Карамель@discuss.tchncs.de (+5/-8) · 10 months ago

              I’m aware it’s already legal, hence why action should be taken. Plus, videos are just a bunch of photos stitched together, so I don’t see your point about it only applying to photos.

              • thantik@lemmy.world (+11/-4) · 10 months ago

                Because being nude/etc. is the only thing that makes it different from people just simply drawing others in art.

                Just because you don’t like pornography shouldn’t change the legality of it. It’s prudishness and puritanism at its finest.

                • Эшли Карамель@discuss.tchncs.de (+4/-9) · 10 months ago

                  It’s not porn in general that should be illegal, ONLY pornography where the person has not explicitly said they would like to be in it, such as deepfake porn, or drawn porn where the person has also not said they would like to be in it.

            • Ð Greıt Þu̇mpkin@lemm.ee (+4/-13) · 10 months ago

              Nobody should have to consent to everything like that

              I’m sorry but holy fuck that is just morally bankrupt.

              Someone should have the ABSOLUTE right to control any distribution of their image of a sexual nature that they didn’t actively consent to being out there.

              Anything less is the facilitation of the culture of sexual abuse that lets the Fappening or age-of-consent countdown clocks happen.

              Drawing a picture of someone under the Eiffel Tower is a wildly different act than drawing them in the nude without them knowing and agreeing, with full knowledge of what you plan to do with that nude piece.

              • Fal@yiffit.net (+11/-4) · 10 months ago

                Calling this sexual abuse is absolutely insulting and disgusting.

                • Ð Greıt Þu̇mpkin@lemm.ee (+3/-10) · 10 months ago

                  Trying to pretend it’s not is feeding the culture of not listening to victims.

                  It’s like saying that catcalling is harmless; forcing people to be reminded they are seen as a sex object is well known and documented as a tool for keeping the victim “in their place.”

                  It’s harassment, and when done at the scale famous folks experience for the crime of being well known and also attractive, it basically amounts to a campaign of terror via sexual objectification.

                  Never mind how tolerating it makes space for even more focused acts of terror like doxxing and making threats of sexual assault.

          • intensely_human@lemm.ee (+2/-1) · 10 months ago

            Having an image of them exist somewhere isn’t the sort of thing a person should have to consent to.

            Consent is for things that affect that person.

  • iheartneopets@lemm.ee (+24/-8) · 10 months ago

    Taylor is just trying to distract us from her jet emissions again, just like her new PR relationship with that Kelce guy was almost certainly to distract us from her dating that Matty Healy dude who openly said he enjoys porn that brutalizes black women (and also from her jet emissions).

    She’s not stupid. She’s a billionaire very aware of how news cycles work.

  • EmperorHenry@discuss.tchncs.de (+6/-3) · 10 months ago

    Notice how there wasn’t any of this kind of effort to stop AI when pedos were making CP with it? Or when indie artists were being plagiarized by it?

  • THEDAEMON@lemmy.ml (+20/-18) · 10 months ago

    The number of people in this thread who don’t see this as a problem is disturbing.

    • AdmiralShat@programming.dev (+13) · 10 months ago

      I think the main issue many people have is that these fakes have been around for a bit, but NOW there’s a call for legislation when it’s a billionaire who’s the victim.

      Of course it’s a problem and I’ve said before that this needs to be discussed on a legislative level, but even I’m rolling my eyes that it took a literal billionaire being exposed to it to have any impact.

        • wahming@monyet.cc (+10) · 10 months ago

          Thing is, people have been making this kind of content for decades, and they are unlikely to stop anytime soon, regardless of what legislation says. It would be far more beneficial if we could change society’s attitude towards these issues instead.

            • wahming@monyet.cc (+5/-1) · 10 months ago

              We’ve successfully changed our attitudes to things time and time again. Take LGBT acceptance as the obvious related example.

              • THEDAEMON@lemmy.ml (+1/-4) · 10 months ago

                But homophobes still exist, so if some people like that exist, one of them can make 100 of those pictures in 1 minute.

                • wahming@monyet.cc (+1) · 10 months ago

                  The point is that if we as a society start normalising sexuality and stop shaming people, there would be no incentive for these pictures to exist. Like with pot and alcohol, prohibition does not solve the issue but arguably makes it worse.

    • intensely_human@lemm.ee (+6/-1) · 10 months ago

      And the amount of people who don’t see massive new legislation as a problem is also disturbing.

    • Fedizen@lemmy.world (+6/-2) · 10 months ago

      I think it was Deloitte that expects AI scams (such as fake emergency-call scams) to cost the economy trillions of dollars. Taylor Swift fake nudes are just the tip of an iceberg of problems created by new tech. It’s just a matter of time before the public internet is completely unusable.

      • intensely_human@lemm.ee (+1) · 10 months ago

        It’s happening already. I think AI is a massive, massive problem. I knew it would be a massive problem twenty years ago, thinking about it as a stoner talking about the future of AI.

        I just don’t think legislation is going to help.

    • Pratai@lemmy.ca (+4/-8) · 10 months ago

      I was actually downvoted to oblivion because I called out some kids for suggesting she’s not as much of a victim as others because she’s wealthy.

      Fucking gross. Who’s raising these kids?

  • Nomecks@lemmy.ca (+11/-12) · 10 months ago

    Dear world, please stop making fakes kthx.

    We need to change society so people don’t want to make celebrity deep fakes.

    • PapaStevesy@midwest.social (+10) · 10 months ago

      What do you propose? Keep in mind, we can’t even change society so people don’t constantly try to kill each other over nothing.

      • intensely_human@lemm.ee (+1) · 10 months ago

        That’s not just society. If you look at nature you’ll notice every organism has weapons. Evolution doesn’t brook wasted energy, and yet every organism spends a big portion of its energy budget building weapons. Think about that.

        • Emotional_Sandwich@lemmy.world (+1) · 10 months ago

          That’s not true. Most organisms spend a big portion of their energy on moving, eating, or just existing. Humans are one of the few organisms capable of creating weapons.

        • PapaStevesy@midwest.social (+1) · 10 months ago

          There’s a huge difference between protecting yourself from harm and creating weapons powerful enough to destroy all life on the planet. We’re in the second group, no other species even comes close. And all that being said, the drive to procreate is even stronger than the drive to kill your neighbor, so good luck conditioning that out of society.

    • Semperverus@lemmy.world (+8/-1) · 10 months ago

      I’m open to proposals on how to make people willingly not want to do these kinds of things of their own volition.

      • Match!!@pawb.social (+2/-14) · 10 months ago

        Make AIs publicly owned and have AI prompts be a matter of public record.

        • Эшли Карамель@discuss.tchncs.de (+10) · 10 months ago

          Impossible to enforce, and that’s if all AI was open. Secondly, corporations and companies will never be willing to give out their million-dollar AIs for free to the public.

        • intensely_human@lemm.ee (+2) · 10 months ago

          What you really mean there is eliminate all privately-owned AI. Sort of like eliminating all the privately owned drugs, except instead of physical objects you’re targeting software.

          So, what do you suppose the budget would be of the office whose task it is to eliminate privately-held AI?

    • intensely_human@lemm.ee (+2) · 10 months ago

      Yes that’s the ticket. If our laws are unenforceable, simply mold humanity until they aren’t.

    • Dkarma@lemmy.world (+3/-13) · 10 months ago

      Nah, we just need to punish the offenders. Send these kids to juvie for 12 months for a first offense and give it a sexual assault tag.