• MotoAsh@lemmy.world · 7 months ago

      No, because LLMs are just a mathematical blender with ONE goal in mind: construct a good sentence. They have no thoughts, they have no corrective motion, they just spit out sentences.

      You MIGHT get to passing a Turing test with enough feedback tied in, but then the “consciousness” is specifically coming from the systemic complexity at that point, and still very much not from the LLM.

      • Hotzilla@sopuli.xyz · 7 months ago

        In my opinion you are giving way too much credit to human beings. We are mainly just machines that spit out sentences.

        • MotoAsh@lemmy.world · 7 months ago

          No, you are giving too much credit to LLMs. Thinking LLMs are capable of sentience is as silly as thinking an individual neuron could learn physics.

      • maynarkh@feddit.nl · 7 months ago

        So you’re saying it’s not good enough for a sentient personality, but it might be good enough for an average politician?