I see Google’s deal with Reddit is going just great…

  • PersonalDevKit@aussie.zone
    6 months ago

    Couldn’t that describe 95% of what LLMs do?

    It is a really good autocomplete at the end of the day; it’s just that sometimes the autocomplete gets it wrong.

    • milicent_bystandr@lemm.ee
      6 months ago

      Yes, nicely put! I suppose ‘hallucinating’ describes when the model appears, to the reader, to state a fact, but that fact doesn’t represent anything from the training data.