• Lydia_K@startrek.website · 9 months ago

    LLMs don’t make decisions or understand things at all; they just regurgitate text in a human-like manner.

    I say this as someone who sees a lot of potential in the technology, though, just not used like this, or in the ways most people are claiming we can use them.