Pro@programming.devM to Technology@programming.devEnglish · edit-211 hours agoComet AI browser can get prompt injected from any site, drain your bank accounti.imgur.comimagemessage-square6fedilinkarrow-up142arrow-down10file-textcross-posted to: technology@beehaw.org
arrow-up142arrow-down1imageComet AI browser can get prompt injected from any site, drain your bank accounti.imgur.comPro@programming.devM to Technology@programming.devEnglish · edit-211 hours agomessage-square6fedilinkfile-textcross-posted to: technology@beehaw.org
minus-squareThekingoflorda@lemmy.worldlinkfedilinkEnglisharrow-up11·1 day agoHow did they not think about this? This is a very basic prompt injection, and it still falls for it.
minus-squareTonyTonyChopper@mander.xyzlinkfedilinkEnglisharrow-up15·24 hours agoThey probably asked AI to write the browser. AI loves writing code with security vulnerabilities
minus-squareNatanael@infosec.publinkfedilinkEnglisharrow-up7·23 hours agoThe whole attack model has been known for years already and it isn’t even the first time that specifically an LLM browser plugin has been exploited by page contents https://bsky.app/profile/natanael.bsky.social/post/3kr2ud66y2x24
minus-squarecriss_cross@lemmy.worldlinkfedilinkEnglisharrow-up2·21 hours agoWhy think when there’s VC money to be had?
How did they not think about this? This is a very basic prompt injection, and it still falls for it.
They probably asked AI to write the browser. AI loves writing code with security vulnerabilities
The whole attack model has been known for years already and it isn’t even the first time that specifically an LLM browser plugin has been exploited by page contents
https://bsky.app/profile/natanael.bsky.social/post/3kr2ud66y2x24
Why think when there’s VC money to be had?