• froztbyte@awful.systems · 7 months ago

    that looks like someone used win9x mspaint to make a flag, fucked it up, and then fucked it up even more on the saving throw

  • HeckGazer@programming.dev · 7 months ago

    Oh ez, that’s only 17 orders of magnitude!

    If we managed an optimistic pace of doubling every year, that’d only take… about 56 years. The last few survivors on desert world can ask it if it was worth it.
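    The doubling arithmetic here can be sanity-checked (this is my own back-of-the-envelope sketch, not part of the original comment): covering 17 orders of magnitude at one doubling per year takes log2(10^17) doublings.

```python
import math

# 17 orders of magnitude is a factor of 10**17. At one doubling per
# year, the number of years needed is the number of doublings:
# log2(10**17) = 17 * log2(10).
doublings = 17 * math.log2(10)

print(round(doublings, 1))  # → 56.5
```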

    • Eiim@lemmy.blahaj.zone · 7 months ago

      Rather amusing prediction that, despite the obscene amount of resources already being spent on AI compute, it’s apparently reasonable to expect to spend 1,000,000x that in the “near future”.

    • Soyweiser@awful.systems · 7 months ago

      You don’t understand, after we invent god AGI all our problems are solved. Now step into the computroniuminator, we need your atoms for more compute.

    • someacnt_@lemmy.world · 7 months ago

      Yeah, I don’t see why people are so blind to this. Computation is energy-intensive, and we have yet to optimize it for energy. Yet, all the hopes…

      • DivineDev@kbin.run · 7 months ago

        We do optimize; it’s just that when you halve the energy per computation, you do twice as many computations to iterate faster instead of using half the energy.
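        The trade-off being described (a Jevons-paradox-style rebound) can be sketched in two lines, with made-up units, not measured figures:

```python
# Halving the energy per computation, then doubling the number of
# computations, leaves the total energy bill exactly where it started.
energy_per_op = 1.0   # arbitrary units
ops = 1_000_000

before = energy_per_op * ops
after = (energy_per_op / 2) * (ops * 2)

print(before == after)  # → True
```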

  • DumbAceDragon@sh.itjust.works · 7 months ago

    What these people don’t realize is you’re never gonna get AGI by just feeding a machine an infinite amount of raw data.

      • David Gerard@awful.systems · 7 months ago

        There might actually be nothing bad about the Torment Nexus, and the classic sci-fi novel “Don’t Create The Torment Nexus” was nonsense. We shouldn’t be making policy decisions based off of that.

        wild

      • Soyweiser@awful.systems · 7 months ago

        Yes, we know (there are papers on it) that for LLMs, each increase in capabilities requires exponentially more training data. But don’t worry, we’ve only consumed half the world’s data training LLMs, still plenty of places to go ;).
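        As a toy illustration of that exponential-data claim (the multiplier `k` is a made-up number for the sketch, not taken from any paper): assume each fixed capability step needs `k` times more training data than the last.

```python
# Hypothetical: each capability step multiplies the data requirement
# by k, so after n steps the requirement has grown by k**n.
k = 10             # made-up data multiplier per capability step
base_tokens = 1.0  # normalized data requirement at step 0

requirements = [base_tokens * k**step for step in range(4)]
print(requirements)  # → [1.0, 10.0, 100.0, 1000.0]
```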

    • Naz@sh.itjust.works · 7 months ago

      Interesting. I recall a phenomenon by which inorganic matter was given a series of criteria and adapted to changes in that environment, eventually forming data which it then learned from over a period of millions of years.

      It then used that information to build the world wide web in the lifetime of a single organism and cast doubt on others trying to emulate it.

      But I see your point.

        • voracitude@lemmy.world · 7 months ago

          Sorry, I don’t necessarily agree with the other person, and the formation of organic compounds doesn’t apply here anyway, but what would you call the sensory inputs that our brains filter and interpret?

          • zbyte64@awful.systems · 7 months ago

            My dog does linear algebra every time he pees on a fire hydrant, so that he only pees for the exact amount of time needed. Similarly, when I drain my bathtub, it acts as a linear algebra machine that calculates how long it takes for the water to drain through a small hole.

            Is this a fun way to look at the world that allows us to more readily build computational devices from our environment? Definitely. Is it useful for determining what is intelligence? Not at all.

            • voracitude@lemmy.world · 7 months ago

              No, I’m not arguing anything other than that our brains receive raw data as inputs because they do. Now since we’re jumping to insults immediately, you can kindly fuck off. Toodle-doo!

              • froztbyte@awful.systems · 7 months ago

                Where the fuck was the insult? Wild

                You’re the one making incoherent illogical driveby comments, clown

                • voracitude@lemmy.world · 7 months ago

                  Attacking me as stupid straight out the gate was the insult, when all i said was “our brains process raw data as inputs”. Falsify that if you want to argue. Now I’m very sorry you’re not capable of understanding the point, but it isn’t my problem. You can fuck off too, because I’m not here to instruct you in the English Comprehension equivalent of doing up your Velcro shoes, you genetic throwback.

              • Soyweiser@awful.systems · 7 months ago

                Yes, and that was a stupid argument, unrelated to the point being made: that evolution used this raw data to do things, therefore raw data in LLMs will lead to AGI. You just wanted debate points for “see, somewhere there is data in the process of things being alive”. Which is dumb gotcha logic that drags all of us down and makes it harder to have normal conversations about things. My reply was an attempt to make you see this, in the hope you would do better.

                I didn’t call you stupid, I called the argument stupid, but if the shoe fits.

                • voracitude@lemmy.world · 7 months ago

                  I didn’t want “debate points”, I wanted to know what you would call sensory inputs if not “raw data”. Completely independent of anything else, which I tried to make clear in my post, a clarification you completely ignored in order to accuse me of making a stupid argument. I made a very specific effort to distance myself from the argument being made by the other poster, because I wanted to ask that one question and that question alone, so to be lumped in with it anyway is more than galling.

                  Example: you lot just want to lash out at internet strangers for asking an honest question because it’s in the wrong context as far as you’re concerned. Is that a fair characterisation of your intent? No? Same. So you can take your accusations of intellectual dishonesty and this block, and fuck off.

                • froztbyte@awful.systems · 7 months ago

                  No no see, since everything is information this argument totally holds up. That one would need to categorize and order it for it to be data is such a silly notion, utterly ridiculous and unnecessary! Just throw some information in the pool and stir, it’ll evolve soon enough!

        • swlabr@awful.systems · 7 months ago

          Not me dawg, I am highly non-linear (pls donate to my gofundme for spinal correction)

  • Wirlocke@lemmy.blahaj.zone · 7 months ago

    The current machine-learning gold rush is amazing from a technical perspective, but I feel like a lot of technophiles miss the real potential.

    What we have are the first rickety engines of this technology. We’re not building futurist masterpieces with the equivalent of a steam engine.

    We have great new tools that we can use to further understand and optimize what we built, instead of just throwing more and more compute on top of our first design.

    The human brain runs on roughly the power of a light bulb, so we know our current approach is massively inefficient. And recent research like Mamba shows there are still improvements to be made.

    And Anthropic’s mechanistic interpretability work shows there are ways to better understand neural networks and improve performance without treating them purely as a “black box”.

    The tech has great potential, but the massive server farms being dedicated to it now are just crypto-style overhype and fear of missing out.

    • sinedpick@awful.systems · 7 months ago

      did you even experience a single conscious thought while writing that? what fucking potential are you referring to? generating reams of scam messages and Internet spam? automating the only jobs that people actually enjoy doing? seriously, where is the thought?