Sixth0795@sh.itjust.works to Science Memes@mander.xyz · 6 months ago — "Huh" (image post)
Limeey@lemmy.world · 6 months ago: It all comes down to the fact that LLMs are not AGI — they have no clue what they're saying, or why, or to whom. They have no concept of "context," and as a result have no ability to "know" whether they're giving correct information or just hallucinating.