basiclemmon98@lemmy.dbzer0.com to Not The Onion@lemmy.world · English · 3 months ago
After using ChatGPT, man swaps his salt for sodium bromide—and suffers psychosis (arstechnica.com)
cross-posted to: technology@beehaw.org
prole@lemmy.blahaj.zone · English · 3 months ago
Yeah, people like to mock us about it, but I think it’s a reasonable regulation.
Tuukka R@sopuli.xyz · English · 3 months ago
> Yeah, people like to mock us about it, but I think it’s a reasonable regulation.
It should apparently be amended, though. There is a known case that it accidentally forbids but should not forbid.
Tuukka R@sopuli.xyz · English · 3 months ago
[deleted by creator]