This is just a draft, best refrain from linking. (I hope we’ll get this up tomorrow or Monday. edit: probably this week? edit 2: it’s up!!) The [bracketed] stuff is links to cites.

Please critique!


A vision came to us in a dream — and certainly not from any nameable person — on the current state of the venture capital fueled AI and machine learning industry. We asked around and several in the field concurred.

AIs are famous for “hallucinating” made-up answers with wrong facts. The hallucinations are not decreasing. In fact, the hallucinations are getting worse.

If you know how large language models work, you will understand that all output from an LLM is a “hallucination” — it’s generated from the latent space and the training data. But if your input contains mostly facts, then the output has a better chance of not being nonsense.
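
For readers who don’t already know how these things work: the model picks each next word by sampling from a probability distribution it learned from its training data, and that is the whole trick. Here’s a toy sketch in Python (our illustration, with made-up numbers, not anyone’s real model) of why a right answer and a plausible-sounding wrong one come out of the exact same machinery:

```python
# Toy sketch, not a real model: an LLM produces each next token by sampling
# from a probability distribution learned from its training data. A factual
# continuation and a hallucinated one are generated by the same mechanism;
# the model has no separate notion of "true".
import random

# Hypothetical learned probabilities for the token that follows
# "The capital of Australia is". The numbers are invented for illustration.
next_token_probs = {
    "Canberra": 0.60,     # the right answer, but just another candidate token
    "Sydney": 0.30,       # a plausible-sounding hallucination
    "Melbourne": 0.08,
    "Auckland": 0.02,
}

def sample_next_token(probs: dict[str, float]) -> str:
    """Pick one token in proportion to the model's learned probabilities."""
    tokens = list(probs)
    weights = list(probs.values())
    return random.choices(tokens, weights=weights, k=1)[0]

print(sample_next_token(next_token_probs))  # sometimes right, sometimes not
```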

Unfortunately, the VC-funded AI industry runs on the promise of replacing humans with a very large shell script. If the output is just generated nonsense, that’s a problem. There is a slight panic among AI company leadership about this.

Even more unfortunately, the AI industry has run out of untainted training data. So they’re seriously considering doing the stupidest thing possible: training AIs on the output of other AIs. This is already known to make the models collapse into gibberish. [WSJ, archive]
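
For the morbidly curious, here’s a toy sketch of why that happens. It’s our illustration, not the WSJ’s, and it uses a boring Gaussian instead of an actual LLM: fit a simple model to some data, generate fresh “data” from that model, fit the next model to that, and repeat. The variety in the original data quietly evaporates:

```python
# Toy illustration of "model collapse", far simpler than an LLM but showing
# the same failure mode. Each generation's "model" is a Gaussian fitted to
# its training set, and each new generation trains only on the previous
# generation's output rather than on real data.
import random
import statistics

random.seed(0)

N_SAMPLES = 20      # training set size per generation (small, so collapse is fast)
GENERATIONS = 200

# Generation 0 trains on "real" data: a standard normal distribution.
data = [random.gauss(0.0, 1.0) for _ in range(N_SAMPLES)]

for gen in range(1, GENERATIONS + 1):
    # "Train" a model: fit the mean and spread of the current training set.
    mu = statistics.fmean(data)
    sigma = statistics.pstdev(data)
    # "Generate" the next generation's training set from that model alone.
    data = [random.gauss(mu, sigma) for _ in range(N_SAMPLES)]
    if gen % 50 == 0:
        print(f"generation {gen:3d}: fitted std dev = {sigma:.4f}")

# The fitted spread trends toward zero: later generations have forgotten
# nearly all of the variety that was in the original data.
```

A real LLM is not a Gaussian, but the mechanism is the same: each generation can only learn what the previous generation bothered to reproduce, so the rare stuff disappears first and the output degrades from there.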

There is enough money floating around in tech VC to fuel this nonsense for another couple of years — there are hundreds of billions of dollars (family offices, sovereign wealth funds) desperate to find an investment. If ever there was an argument for swingeing taxation followed by massive government spending programs, this would be it.

Ed Zitron gives it three more quarters (nine months). The gossip concurs with Ed that this is likely to last about another three quarters. There should be at least one more wave of massive overhiring. [Ed Zitron]

The current workaround is to hire fresh Ph.D.s to fix the hallucinations and try to underpay them on the promise of future wealth. If you have a degree with machine learning in it, gouge them for every penny you can while the gouging is good.

AI is holding up the S&P 500. This means that when the AI VC bubble pops, tech will drop. Whenever the NASDAQ catches a cold, bitcoin catches COVID — so expect crypto to go through the floor in turn.

  • froztbyte@awful.systems
    8 months ago

    The one major comment from me is that I think you have a framing weakness centered on “if you know how LLMs work”: it’s a very load-bearing point for much of what follows, but it gives readers who don’t know how LLMs work no anchor point to follow along from (unless they go on faith)

    I realize you’re not writing this as an explainer blog, but a short sentence + link to elsewhere with an explanation (“for those who don’t know, <link> has a decent explanation without being overly technical”) might be a good patch for that?

    Is it worth also casting a light on the various vendor deals with e.g. Reddit (et al), in their search for structured and scoped training data?

    • David Gerard@awful.systems (OP)
      8 months ago

      yeah, it’s the balance between “as you know” (when a given post will always be someone’s first) and explaining the universe from first principles

      • froztbyte@awful.systems
        8 months ago

        Yep, I know that. To be clear, my suggestion was more to have the extra outside ref link, not to suggest more work. Apologies, flubrain the last few days has been kicking my ass