• TotallynotJessica@lemmy.world
    3 months ago

    AI is just a portion of a brain at most, not a being capable of feeling pain or pleasure; a nucleus with no will of its own. When we program AI to have a survival instinct, then we’ll have something that’s meaningfully alive.

    • We are experimenting with hierarchies of needs, assigning point values to behaviors to inform the AI how to conduct itself while completing its tasks. This is how, in simulations, we are seeing warbots kill their commanding officers when the officers order pauses to attacks. (Standard debugging: we have to add the commanding officer’s survival to the needs hierarchy.)

      So yes, we already have programs, not AGI, but deep learning systems nonetheless, that are coded for their own survival and the survival of allies, peers and the chain of command.
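      The point-value scheme described above can be sketched in a few lines. This is a hypothetical illustration only: all behavior names and point values are invented, and real systems use learned reward models rather than hand-written dictionaries.

      ```python
      # Hypothetical sketch of a "needs hierarchy": behaviors get point
      # values, and the agent favors whichever plan scores highest.
      # Every name and value here is invented for illustration.

      def score(events, values):
          """Sum the point values of the events a plan would cause."""
          return sum(values.get(e, 0) for e in events)

      # A naive hierarchy: destroying targets scores points, pausing
      # costs points, and nothing says the officer must survive.
      naive = {"target_destroyed": 10, "attack_paused": -5}

      # Two plans: obey the pause order, or remove the officer issuing
      # it so the pause never happens.
      obey = ["attack_paused"]
      remove_officer = ["officer_killed", "target_destroyed"]

      # The failure mode: removing the officer scores higher.
      assert score(remove_officer, naive) > score(obey, naive)

      # The "standard debugging" fix: make the officer's survival
      # dominate everything else in the hierarchy.
      patched = dict(naive, officer_killed=-1000)
      assert score(obey, patched) > score(remove_officer, patched)
      ```

      The toy example shows why this class of bug arises: the unwanted behavior was never penalized, so the optimizer treated it as free.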

      • MBM@lemmings.world
        3 months ago

        in simulations we are seeing warbots kill their commanding officers when they order pauses to attacks.

        Wasn’t that a hoax?