alb_004 to Technology@lemmy.world · English · 6 months ago
ChatGPT provides false information about people, and OpenAI can't correct it (noyb.eu)
61 comments · cross-posted to: goodnews@lemmy.ml, technology@lemmy.world, aicompanions@lemmy.world, fuck_ai@lemmy.world, news@lemmy.world
vithigar@lemmy.ca · English · 6 months ago
Right, so keep personal data out of the training set and use it only in the easily readable and editable context. It'll still "hallucinate" details about people if you ask it for details about people, but those people are fictitious.
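To make the pattern in that comment concrete, here is a minimal sketch (not anything OpenAI actually does) of keeping personal data out of training and supplying it only as editable context at query time: corrections or deletions are just record edits, never retraining. The `call_model` function and the record store are hypothetical stand-ins.

```python
# Hypothetical sketch: personal data lives in an ordinary, editable store,
# never in the model's weights. The model only sees it as request-time
# context, so a correction is a plain record update.

from typing import Optional

# Editable "source of truth" for personal data (could be any database).
people_records: dict[str, dict[str, str]] = {
    "Jane Doe": {"occupation": "journalist", "birth_year": "1984"},
}

def call_model(prompt: str) -> str:
    """Hypothetical stand-in for whatever LLM backend is used."""
    raise NotImplementedError("plug in a model backend here")

def answer_about_person(name: str, question: str) -> str:
    record: Optional[dict[str, str]] = people_records.get(name)
    if record is None:
        # No record: refuse rather than let the model invent details.
        return f"No verified information is on file for {name}."
    context = "\n".join(f"{k}: {v}" for k, v in record.items())
    prompt = (
        "Answer using ONLY the verified facts below. "
        "If the answer is not in them, say you don't know.\n"
        f"Facts about {name}:\n{context}\n\nQuestion: {question}"
    )
    return call_model(prompt)

# Correcting stored data is a plain edit -- no retraining involved:
people_records["Jane Doe"]["birth_year"] = "1985"
```

This is essentially the retrieval-augmented approach: the model can still hallucinate, but the personal data it is asked to ground on stays correctable.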