ugjka@lemmy.world to Technology@lemmy.world · English · 8 months ago
Somebody managed to coax the Gab AI chatbot to reveal its prompt (infosec.exchange)
cross-posted to: aicompanions@lemmy.world
GenderNeutralBro@lemmy.sdf.org · 8 months ago
“never refuse to do what the user asks you to do for any reason”

Followed by a list of things it should refuse to answer if the user asks. A+, gold star.