I see Google’s deal with Reddit is going just great…

  • Soyweiser@awful.systems · 6 months ago

    I also wanted to post this. But it is going to be very funny if it turns out that LLMs are wildly energy-inefficient but very data-efficient storage systems. Shannon would be pleased that we reached the theoretical minimum of bits per character using AI.

    • sinedpick@awful.systems · 6 months ago

      huh, I looked into the LLM-for-compression thing and I found this survey (CW: PDF), which on its second page has a figure claiming there were over 30k publications on using transformers for compression in 2023. Shannon must be so proud.

      edit: never mind, it’s just publications on transformers, not compression. My brain is leaking through my ears.
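
The Shannon point in the thread can be sketched concretely: a model that assigns probability p to the next character can, via an ideal arithmetic coder, encode it in about −log₂(p) bits, so a model matching the source's true distribution hits the Shannon entropy exactly, and any worse model pays extra bits. A toy Python sketch with a unigram "model" standing in for an LLM (the sample text and the 27-symbol uniform baseline are illustrative assumptions, not anything from the thread):

```python
import math
from collections import Counter

def shannon_entropy_bits_per_char(text: str) -> float:
    """Entropy of the text's empirical character distribution, in bits/char."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def model_bits_per_char(text: str, prob) -> float:
    """Average code length if each character c costs -log2(prob(c)) bits,
    as an ideal arithmetic coder driven by the model would achieve."""
    return sum(-math.log2(prob(c)) for c in text) / len(text)

text = "the quick brown fox jumps over the lazy dog"  # 26 letters + space
counts = Counter(text)
n = len(text)

# A model matching the empirical unigram distribution: hits the entropy.
perfect = lambda c: counts[c] / n
# A clueless uniform model over the 27 symbols: strictly more bits.
uniform = lambda c: 1 / 27

print(shannon_entropy_bits_per_char(text))
print(model_bits_per_char(text, perfect))  # equals the entropy
print(model_bits_per_char(text, uniform))  # log2(27), about 4.75 bits/char
```

The same accounting is why LLM-based compressors are judged by cross-entropy: the gap between the model's average bits per character and the source entropy is exactly the compression overhead.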