I’m rather curious to see how the EU’s privacy laws are going to handle this.

(Original article is from Fortune, but Yahoo Finance doesn’t have a paywall)

  • stealthnerd@lemmy.world · 1 year ago

    This is an article about unlearning data, not about not consuming it in the first place.

    LLMs are not storing learned data in its raw, original form. They are ingesting it and building an understanding of language based on it.

    Attempting to peel that knowledge back out would be incredibly difficult, if not impossible, because there’s really no way to identify it.
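
    To make that concrete, here’s a toy sketch (hypothetical code written purely for illustration, not anything from the article or a real training pipeline): a tiny bigram next-word model trained on two “documents”. Every training example nudges the same shared weight matrix, so afterwards there’s no identifiable piece of the model that belongs to either document.

    ```python
    # Toy sketch (hypothetical, not any real LLM or library): a tiny
    # bigram next-word model trained on two "documents" to show why
    # per-document unlearning has no obvious target.
    import numpy as np

    docs = ["alice likes cats", "bob likes dogs"]

    # Shared vocabulary across both documents.
    vocab = sorted({w for d in docs for w in d.split()})
    idx = {w: i for i, w in enumerate(vocab)}
    V = len(vocab)

    rng = np.random.default_rng(0)
    W = rng.normal(scale=0.1, size=(V, V))  # one shared weight matrix

    def softmax(z):
        e = np.exp(z - z.max())
        return e / e.sum()

    for _ in range(500):
        for d in docs:
            words = d.split()
            for w, nxt in zip(words, words[1:]):
                p = softmax(W[idx[w]])   # predicted next-word distribution
                grad = p.copy()
                grad[idx[nxt]] -= 1.0    # cross-entropy gradient
                W[idx[w]] -= 0.1 * grad  # both docs update the same rows

    # The row for "likes" has been pushed toward "cats" AND "dogs":
    # the two documents' contributions are blended into the same floats,
    # so there's no slice of W you can delete to forget just one of them.
    print(softmax(W[idx["likes"]]).round(2))
    ```

    Run it and the predicted distribution for “likes” comes out roughly 50/50 between “cats” and “dogs”: both documents’ contributions live in the very same numbers.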

    • Eccitaze@yiffit.net · 1 year ago

      And we’re saying that if peeling out knowledge that someone has a right to have forgotten is difficult or impossible, that knowledge should not have been used to begin with. If enforcement means big tech companies have to throw out models because they used personal information without knowledge or consent, boo fucking hoo, let me find a Lilliputian to build a violin for me to play.

      • stealthnerd@lemmy.world · 1 year ago

        Okay, I get it, but that’s a different argument. Starting fresh only gets you so far. Once an LLM exists and is exposed to the public, users can submit any data they like and the LLM has no idea of its source.

        You could argue then that these models shouldn’t be able to use user-submitted data, but that would be a devastating restriction on the technology, and that starts to become a question of whether we want this tech to exist at all.

      • LittleLordLimerick@lemm.ee · 1 year ago

        “If enforcement means big tech companies have to throw out models because they used personal information without knowledge or consent, boo fucking hoo”

        A) this article isn’t about a big tech company, it’s about an academic researcher. B) he had consent to use the data when he trained the model. The participants later revoked their consent to have their data used.