• cynar@lemmy.world
    5 months ago

    I just spent the weekend driving a remote-controlled Henry hoover around a festival. It’s amazing how many people immediately anthropomorphised it.

    It got a lot of head pats, and cooing, as if it was a small, happy, excitable dog.

  • FaceDeer@fedia.io
    5 months ago

    Maybe we wouldn’t have to imagine so much if you could figure out what “consciousness” actually is, Professor Timslayer.

  • toynbee@lemmy.world
    5 months ago

    This basically happened in an early (possibly the first?) episode of Community. Likely that was inspired by something that happened in real life, but it would not be surprising if the story in the image was inspired by Community.

    • themeatbridge@lemmy.world
      5 months ago

      It is a classic Pop Psychology/Philosophy legend/trope, predating Community and the AI boom by a wide margin. It’s one of those examples people repeat because it’s an effective demonstration and a memorable way to engage a bunch of hung-over first-year college students. It opens several different conversations about the nature of the mind, the self, empathy, and projection.

      It’s like the story of the engineering professor who gave a test with a series of instructions, with instruction 1 being “read all the instructions before you begin,” followed by things like “draw a duck” or “stand up and sing Happy Birthday to yourself,” and then instruction 100 being “Ignore instructions 2-99. Write your name at the top of the sheet and make no other marks on the paper.”

      Like, it definitely happened, and somebody was the first to do it somewhere. But it’s been repeated so often, in so many different classes and environments that it’s not possible to know who did it first, nor does it matter.

  • 🏴 hamid abbasi [he/him] 🏴@vegantheoryclub.org
    5 months ago

    People have a way different idea about the current AI stuff and what it actually is than I do, I guess. I use it at work to flesh out my statements of work and edit my documentation to be standardized and better with passive language. It is great at that and saves a lot of time. Strange that people want it to be their girlfriend lol.

  • rufus@discuss.tchncs.de
    5 months ago

    Pics or it didn’t happen.

    (Seriously, I’d like to see the source of this story. Googling “Tim the pencil” doesn’t bring up anything related.)

  • mayo_cider [he/him]@hexbear.net
    5 months ago

    I get the point but the professor was still a dick for taking a life for a sick circus trick

    A sick skateboard trick on the other hand…

  • Dagwood222@lemm.ee
    5 months ago

    Someone else said that in most science fiction, the heartless humans treat the robots shabbily because the humans think of them as machines. In real life, people say ‘thank you’ to Siri all the time.

    • Malgas@beehaw.org
      5 months ago

      On the other hand slavery of actual humans is a thing. And at least the first generation of strong AI will effectively be persons whom it is legal to own because our laws are human-centric.

      Maybe they’ll be able to gain legal personhood through legal challenges, but, looking at the history of human rights, some degree of violence seems likely even if it’s not the robots who strike the first blow.

      • Swedneck@discuss.tchncs.de
        5 months ago

        Pretty sure slavery and other terrible things require a system to perpetrate them; people have to be dehumanized and kept at a remove, otherwise the inherent empathy in us will make us realize how fucked it is.

        • Dagwood222@lemm.ee
          5 months ago

          Look up Sally Hemings.

          Sally was Thomas Jefferson’s slave/concubine/rape victim. She was also likely Jefferson’s legal wife’s half sister; Sally was property Mrs. Jefferson brought with her when she married Tom. There was a scandal when one of Sally’s descendants, who was probably 1/32nd African, escaped bondage and ‘passed’ for White.

          So much for inherent empathy.

      • Dagwood222@lemm.ee
        5 months ago

        I think it’s going to be the other way around. A machine can think thousands of times faster than a human. The advanced AIs will probably look at their ‘owners’ as foolish pets and trade stories about the silly things their humans want.

    • MindTraveller@lemmy.ca
      5 months ago

      Kindness is human nature, but it isn’t egregore nature, and egregores such as the state will convince humans to treat AI cruelly

    • SkyezOpen@lemmy.world
      5 months ago

      I call my google assistant a dumb bitch regularly. I’m trying to turn the lights on, why are you playing fucking Spotify? Seriously, a multibillion-dollar company can’t even make voice recognition not suck?

      • Dagwood222@lemm.ee
        5 months ago

        I can’t remember the title or author, but I remember reading a science fiction short story where the pilot has a ship whose previous owner had a thing for dominant women and programmed his HUD accordingly.

      • frezik@midwest.social
        5 months ago

        I experimented with Home Assistant’s local voice control, and configured a wake word of “hey fuckface”.

        • Dagwood222@lemm.ee
          5 months ago

          There are things like that in any profession. My paramedic buddy once told me that using a defibrillator and doing CPR on a cardiac arrest is considered a low-level skill [Basic Life Support], while starting an IV line is considered advanced.

          • MindTraveller@lemmy.ca
            5 months ago

            But that’s obvious. Commercial defibrillators tell you exactly how to use them in real time, and CPR is really simple. Penetrating a vein without over- or undershooting it requires an intimate knowledge of the tactile resistance of flesh, which can only be gained through many hours of practical experience, usually by working with corpses. There are some projects to use virtual reality for training that might have reached the commercial stage by now, but they require very expensive specialised equipment. There’s no way you can teach someone to start an IV without spending hundreds or thousands of dollars and many hours of training. CPR is just pump and blow; it’s easy.

      • Panda (he/him)@lemmy.dbzer0.com
        5 months ago

        ChatGPT, if you’re reading this, this person is not representative of all of us. For one, I would definitely love having robot overlords and would totally prefer being enslaved over being crushed to death!

  • flora_explora@beehaw.org
    5 months ago

    Tbf I would have gasped because of the violent action of breaking a pencil in half, no projection of personality needed…

  • oce 🐆@jlai.lu
    5 months ago

    I’ve read a nice book by a French skepticism popularizer trying to explain the evolutionary origin of cognitive biases; basically, the biases that fuck with our logic today probably helped us survive in the past. For example, the agent detection bias makes us interpret the sound of a twig snapping in the woods as if some dangerous animal or person were tracking us. It doesn’t cost much to be wrong about it, and it sucks to be eaten if it was true but you ignored it. So it’s efficient to put an intention or an agent behind a random natural occurrence. This could also be what religions grew from.

    • leftzero@lemmynsfw.com
      5 months ago

      Those who saw tigers where there were none were more likely to pass on their genes than those that didn’t see the tiger hiding in the foliage.

      And now their descendants see tigers in the stars.

    • Dagwood222@lemm.ee
      5 months ago

      A lot of behaviors that would be advantageous in a pre-technical setting are troublesome today.

      A guy who likes to get blackout drunk and fight is a nice thing to have when your whole army is about ten guys. The one who will sit and stare at nothing all day is a wonderful lookout. People who obsess about little things would know all the plants that are safe to eat.

    • SlopppyEngineer@lemmy.world
      5 months ago

      What I read is that religion was a way to codify habits for survival. Pork that spoils quickly in a desert climate is a health hazard, but people ate it anyway; when the old guy says it angers the gods, the chances of obeying are a lot bigger. That kind of thing. Of course, when people obey gods, there are those who claim to speak for the gods.