Tesla knew Autopilot caused death, but didn’t fix it: Software’s alleged inability to handle cross traffic central to court battle after two road deaths

  • gamer@lemm.ee · 11 points · 1 year ago

    I remember reading about the ethical question posed by the hypothetical self-driving car that loses control and can choose to either turn left and kill a child, turn right and kill a crowd of old people, or do nothing and hit a wall, killing the driver. It’s a question that doesn’t have a right answer, but it must be answered by anybody implementing a self-driving car.

    I non-sarcastically feel like Tesla would implement this system by trying to see which option kills the fewest paying Xitter subscribers.

    • Liz@midwest.social · 8 points · 1 year ago

      At the very least, they would prioritize the driver, because a driver who survives is likely to buy another Tesla in the future.

    • CmdrShepard@lemmy.one · 4 points · 1 year ago

      I think the whole premise is flawed, because the car would have had to suffer numerous failures before ever reaching a point where it would need to make this decision. The dilemma applies to humans, since we have free will; a computer does not.

    • Ocelot@lemmies.world · 6 up, 3 down · 1 year ago

      Meanwhile, hundreds of people are killed in auto accidents every single day in the US. Even if a self-driving car is 1000x safer than a human driver, there will still be accidents as long as other humans are sharing the same road.