mozz@mbin.grits.dev to Technology@beehaw.org · 7 months ago
Someone got Gab's AI chatbot to show its instructions (mbin.grits.dev)
489 points · 200 comments
Pup Biru@aussie.zone · 4 points · edited 7 months ago
anyone who enables a company whose “values” lead to prompts like this doesn’t get to use the (invalid) “just following orders” defence
Icalasari@fedia.io · 9 points · edited 7 months ago
Oh, I wasn’t saying that. I was saying the person may not be stupid, and may figure their boss is a moron (the prompts don’t work, as LLM chatbots don’t grasp negatives in their prompts very well)