return2ozma@lemmy.world to Technology@lemmy.world · English · 17 hours ago
ChatGPT safety systems can be bypassed to get weapons instructions (www.nbcnews.com)
26 comments
FreedomAdvocate@lemmy.net.au · English · 13 hours ago
You don’t even need an LLM, just an internet-connected browser.
Echo Dot@feddit.uk · English · 3 hours ago
Or literally just buy some fertiliser. We’ve all seen what happens when ammonium nitrate catches fire; if you have enough of it in one place, it’s practically a nuclear-bomb-level detonation.