nifty@lemmy.world to Technology@lemmy.world · English · 5 months ago
Google AI making up recalls that didn’t happen (lemmy.world)
gamermanh@lemmy.dbzer0.com · English · 5 months ago
Because lies require intent to deceive, which the AI cannot have. They merely predict the most likely thing that should next be said, so “hallucinations” is a fairly accurate description.
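The "predict the most likely next thing" point can be sketched with a toy bigram model. This is a hypothetical illustration, not how any real LLM or Google product is implemented (real models use neural networks over huge token vocabularies), but the core loop is the same: emit whatever continuation is statistically probable given the context, with no step that checks the claim against reality.

```python
# Hypothetical toy "language model": a bigram table mapping a context
# word to a probability distribution over possible next words.
# Note there is no fact-checking anywhere in this loop — the model
# only ranks continuations by probability, which is why a
# plausible-sounding but false statement can come out.

bigram_probs = {
    "product": {"recalled": 0.6, "launched": 0.3, "sold": 0.1},
    "recalled": {"in": 0.5, "after": 0.4, "today": 0.1},
}

def next_token(context: str) -> str:
    """Greedy decoding: return the single most probable next word."""
    dist = bigram_probs[context]
    return max(dist, key=dist.get)

def generate(start: str, steps: int) -> list:
    """Repeatedly append the most likely next word until we run out."""
    out = [start]
    for _ in range(steps):
        word = out[-1]
        if word not in bigram_probs:
            break  # no continuation known for this word
        out.append(next_token(word))
    return out

print(generate("product", 3))  # → ['product', 'recalled', 'in']
```

Whether "recalled" follows "product" here depends only on the probabilities the model learned, not on whether a recall ever happened — hence "hallucination" rather than "lie".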