Salamendacious@lemmy.world to News@lemmy.world · 2 years ago
Meet Nightshade, the new tool allowing artists to ‘poison’ AI models with corrupted training data (venturebeat.com)
cross-posted to: hackernews@derp.foo
ubermeisters@lemmy.world · edited 2 years ago
deleted by creator
Asifall@lemmy.world · 2 years ago
I don’t think the idea is to protect specific images; it’s to create enough of these poisoned images that training your model on random free images you pull off the internet becomes risky.
SCB@lemmy.world · 2 years ago
Which, honestly, should be criminal.
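To make the mechanism Asifall describes concrete: below is a minimal sketch of perturbation-based data poisoning. This is not Nightshade’s actual algorithm (Nightshade targets the concept associations of text-to-image diffusion models, as the linked article describes); it only illustrates the general idea against an ordinary image classifier. The choice of model, the perturbation budget `eps`, and the step count are all illustrative assumptions.

```python
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

# Pretrained classifier standing in for "a model trained on scraped images".
# Purely illustrative; Nightshade itself attacks diffusion models, not ResNets.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()
for p in model.parameters():
    p.requires_grad_(False)

preprocess = T.Compose([T.Resize(256), T.CenterCrop(224), T.ToTensor()])

def poison(image: Image.Image, target_class: int,
           eps: float = 8 / 255, steps: int = 40) -> torch.Tensor:
    """Return a copy of `image` nudged toward `target_class`, with every
    pixel change capped at `eps` so the picture still looks unchanged to
    a human viewer."""
    x = preprocess(image).unsqueeze(0)        # shape: 1 x 3 x 224 x 224
    delta = torch.zeros_like(x, requires_grad=True)
    target = torch.tensor([target_class])
    opt = torch.optim.Adam([delta], lr=1e-2)
    for _ in range(steps):
        opt.zero_grad()
        # Push the model's prediction for the perturbed image toward the target.
        loss = torch.nn.functional.cross_entropy(model(x + delta), target)
        loss.backward()
        opt.step()
        with torch.no_grad():
            # Project back into the imperceptibility budget and valid pixel range.
            delta.clamp_(-eps, eps)
            delta.data = (x + delta).clamp(0, 1) - x
    return (x + delta).squeeze(0).detach()
```

Release enough images treated like this into the pool of “free images on the internet,” and any model trained on indiscriminately scraped data starts learning the wrong associations, which is exactly the risk the comment above points to.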