• Pasta Dental@sh.itjust.works
    4 months ago

    I’ll believe it when I see it: an LLM is basically a black box with stochastic output, so you can’t patch it with 100% reliability. The only sure way to stop it from generating bomb recipes is to remove that data from the training set.
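    To illustrate the point about stochastic output, here is a toy sketch (not any real model’s code; the vocabulary and probabilities are invented): an LLM samples its next token from a probability distribution, so even a heavily down-weighted unwanted output can still slip through occasionally.

    ```python
    import random

    # Hypothetical next-token distribution after a safety "patch":
    # the unwanted tokens are suppressed but not eliminated.
    vocab = {"refuse": 0.90, "comply": 0.08, "ramble": 0.02}

    def sample_token(dist, rng):
        """Sample one token from a probability distribution."""
        tokens, weights = zip(*dist.items())
        return rng.choices(tokens, weights=weights, k=1)[0]

    rng = random.Random(0)  # seeded only to make this demo repeatable
    outputs = {sample_token(vocab, rng) for _ in range(10_000)}

    # Even with a 90% chance of refusing, the suppressed tokens still
    # show up across many samples -- a guardrail that holds most of
    # the time is not a 100% patch.
    print(outputs)
    ```

    The only way to make `"comply"` truly impossible in this toy model is to remove it from the distribution entirely, which is the analogue of removing the data from training rather than patching around it.
    
    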