misk@sopuli.xyz to Technology@lemmy.world · English · 1 year ago
Jailbroken AI Chatbots Can Jailbreak Other Chatbots (www.scientificamerican.com)
cross-posted to: hackernews@lemmy.smeargle.fans, hackernews@derp.foo
KairuByte@lemmy.dbzer0.com · English · 1 year ago
What possible legitimate reason could someone need to know how to make chlorine/mustard gas?
Apart from the fact that they are made from common household products, are easy to make by mistake, and can kill you.
Wait, that’s true of napalm as well… fuck.