nifty@lemmy.world to Technology@lemmy.world · English · 6 months ago
Google AI making up recalls that didn't happen (lemmy.world) · 53 comments
gamermanh@lemmy.dbzer0.com · English · 4 points · 6 months ago
Because lies require intent to deceive, which the AI cannot have. It merely predicts the most likely next thing to say, so "hallucination" is a fairly accurate description.
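The "predict the most likely next thing to say" point can be sketched in miniature. This is a hypothetical toy, not a real model: the candidate tokens and probabilities are invented for illustration, and the point is that greedy decoding just returns whichever continuation scores highest, with no check on whether the resulting claim is true.

```python
# Toy sketch of next-token prediction (hypothetical distribution, not a real model).
# The "model" scores candidate next words after "The product was ..." and emits
# the highest-probability one; truth never enters the calculation.
probs = {
    "recalled": 0.41,   # fluent and plausible-sounding, even if no recall happened
    "praised": 0.22,
    "discontinued": 0.19,
    "audited": 0.18,
}

def predict_next(distribution):
    """Greedy decoding: return the candidate with the highest probability."""
    return max(distribution, key=distribution.get)

print("The product was", predict_next(probs))
```

Because the most statistically likely continuation is not necessarily the factually correct one, confident-sounding false statements fall out of the mechanism naturally.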