Basically: you ask for poems forever, and LLMs start regurgitating training data:

  • inspxtr@lemmy.world
    link
    fedilink
    English
    arrow-up
    3
    ·
    1 year ago

    Something like this, unless they know the root cause (I didn’t read the paper so not sure if they do), or something close to it, may still be exploitable.