• grue@lemmy.world · 1 year ago

        Even if those dipshits “opted in,” the rest of us sharing the road sure as Hell didn’t!

      • ours@lemmy.film · 1 year ago

        This isn’t just some email web app that may have a few bugs, it’s putting lives at risk on the road. They shouldn’t be able to just label it a beta, overpromise its capabilities, and neglect any responsibility.

      • abcxyz@lemm.ee · 1 year ago

        I just can’t understand how regulators all over the world allow these things on the road. How the fuck do you allow the release of potentially deadly (for everyone involved, not just for the user) software en masse for the public to beta test for you… This is not Diablo IV…

  • Flying Squid@lemmy.world · 1 year ago

    I’m not especially sympathetic to the Tesla drivers this might kill.

    I’m worried about everyone else.

  • asudox@lemmy.world · edited · 1 year ago

    It shouldn’t even have been released for normal people to use in daily life, on real roads full of other cars. This poses a big risk to life if you ask me. I hope countries start banning this feature soon, otherwise many more deaths will happen, and Elon will somehow get away with them. What’s so hard about driving a real car manually? Did you all become fatass lazy people who don’t even have the willpower to drive a car? Ridiculous. ML is experimental, and for a machine it’s amazing, but it isn’t as good as a human YET, and thus causes life-threatening accidents. FSD is literally still in beta, and people are driving at full speed on public roads with this beta software.

    • dufr@lemmy.world · 1 year ago

      It can’t be used in the EU; it would need to pass a review. Elon has claimed they are close to getting it through, but Elon says a lot of things.

      • Echo Dot@feddit.uk · 1 year ago

        Self-driving cars are actually only legal in a few countries. And those countries have tests.

        It’s only the United States that just lets anyone do whatever on earth it is that they want, even if it’s insanely dangerous.

        Everywhere else, any car company espousing self-driving tech actually has to prove that it is safe. Only a few companies have managed to do this, and even then the cars are limited to predefined areas where they are sure they won’t come across difficult situations.

      • tony@lemmy.hoyle.me.uk · 1 year ago

        In its current state it has basically no chance, IMO.

        If they’d concentrated on making AP/highway driving smarter first, they might have got that through… there are already rules for that… but cities? I’d love to see the autonomous car that could drive through London or Manchester.

    • Ocelot@lemmies.world · 1 year ago

      Humans did not evolve to drive cars. ML did. It drives consistently with no distractions. It is never tired or drunk, and it never experiences road rage. It has superhuman reaction time and can see a full 360 degrees. It is not about being a lazy fatass, it is about safety. Hundreds of people in the US were killed in car accidents just today, and none of them were from self-driving cars.

      • Zummy@lemmy.world · 1 year ago

        The article listed two life-threatening near-accidents that were only prevented because the person behind the wheel took over and disengaged FSD. Read the article and then comment.

        • CmdrShepard@lemmy.one · 1 year ago

          Hilarious telling them to read the article first when you couldn’t even be bothered to read their question before replying.

          • Zummy@lemmy.world · edited · 1 year ago

            I read it just fine. He asked for an example of a life-threatening accident caused by Full Self-Driving. I noted that two examples were listed in the article. The ONLY difference was that the driver prevented the accidents by being aware. FSD was going to cause accidents without intervention. I guess in your world people are supposed to do nothing to avoid a major accident. Hilarious that you love FSD so much that you’re willing to defend a billionaire who wouldn’t piss on you if you were on fire. Billionaires are not your friends. FSD is a BETA feature that doesn’t work properly. Take your love somewhere else and away from my comment, because you read it, didn’t understand it, and fired off a reply claiming I didn’t do something I did, because you can’t understand me. The next time you want to have a discussion, come prepared, or don’t come at all!

            • CmdrShepard@lemmy.one · 1 year ago

              Ah, the “only difference” in your two examples of life-threatening accidents is that no accident occurred in either example? That’s quite the difference if you ask me… This isn’t a level 4 or 5 system, so driver intervention is required. These systems can’t improve without real-world testing, and meanwhile a hundred people die on the road every single day. I guess you’d prefer more people die on the road from drunk or distracted drivers than have manufacturers roll out solutions that aren’t absolutely 100% perfect, even if they’re better than human drivers most of the time.

              Your obsession with Musk is clouding your judgment. I made no mention of him, nor do I like or defend him. This tech wasn’t built by Musk, so who gives a shit about him in this discussion?

              • Zummy@lemmy.world · 1 year ago

                I am not obsessed with Musk in any form, but the fact of the matter is that when you have FSD systems failing to do the thing they are supposed to do, maybe it’s not the best idea to roll them out to the entire world. Maybe it’s better to continue with more limited testing. You act as if all drunk and distracted driving will stop when FSD is used, and that simply isn’t the case. Many people still use gasoline-powered cars and drink and drive even though it’s dangerous to do so. Furthermore, FSD will lead to more distracted driving, because people will assume “self-driving” means the car will take care of everything and there is no need to be vigilant.

                The plain truth is that while FSD can be the future, rolling it out while knowing it isn’t ready is not a solution; it’s irresponsible and will cause harm. The near-accidents you aren’t concerned with would most likely have killed the driver and probably other people too. Our difference of opinion here is that you believe it’s okay if people die as long as the testing shows there is a chance they won’t die in the future, while I think that if anyone dies, it’s too much. The feature clearly isn’t ready for prime time and needs more limited real-world testing, but the fact of the matter is that testing doesn’t bring in money.

                Your inability to even consider that a worldwide rollout might not be the best idea right now, since testing shows the car isn’t ready, shows that you really aren’t arguing in good faith. You have chosen the position that FSD is good and ready, even when confronted with articles like the one above showing it isn’t. I would wager that a lot of people want the era of FSD; they just want it when it works. Keep the rollout more limited and do further testing. When mistakes happen, take the time to figure out why, and how they can be prevented in the future. You argue that testing is needed, but you favor a full rollout now, even though we need a lot more limited real-world testing. Both can’t be true. Time to think about what you really want, because I don’t think you know… and accusing anyone who doesn’t want a complete rollout of FSD today of having a bias against Musk shows that.

        • Ocelot@lemmies.world · edited · 1 year ago

          Teslas have 360-degree dashcams that are recording all the time. Why didn’t they upload the video? I promise you they have it.

          Such a video would go viral pretty easily. It would light a fire under Tesla engineering to fix such a dangerous and life-threatening situation. Where is it? Why is there never any footage attached to these articles? Why can’t I find a video ANYWHERE of such a thing? Why can nobody in this thread, bashing the tech over and over, produce any justification for their fear?

          If I were Tesla and I wanted to cover up the dangers of FSD trying to kill people, I wouldn’t give everyone a constantly running dashcam. It would really make me look bad.

          Could it possibly be, just maybe, that the video disagrees with the “journalist’s” opinion that the car was performing dangerously? Could it be that an article saying “Tesla FSD performs admirably, swerves to avoid obstacle that would have caused a blowout” might not get nearly as many clicks and ad revenue? Maybe?

          FSD is aware of where barriers and medians are. If it needs to swerve to avoid an obstacle, it will go in whatever direction is safest; sometimes that means toward a barrier. Sometimes the driver panicking, disengaging, and taking over interrupts the maneuver and causes danger that wasn’t otherwise present. We will never know what actually happened, because there is no evidence. Evidence that I promise you exists, but for whatever reason was omitted.

          If a cop said something outrageous and dangerous happened to them and they say they are completely clear of fault and wrongdoing, would it not be reasonable to want to see the bodycam footage? If for whatever reason the police department says “we don’t have it” “its corrupted” or whatever other excuse would that not raise eyebrows? The same situation applies here.

          There are plenty of YouTube channels out there, like DirtyTesla, Whole Mars Catalog, AI Driver, Chuck Cook, and many others, that show and even livestream FSD. None of them have been in an accident, even on very early releases of the beta software. These people are comfortable with the beta and often don’t take over control of the vehicle under any circumstances, even in their torture-test scenarios.

          Is it at all possible, just maybe, that FSD isn’t as dangerous as you might think? Fear is often a result of ignorance.

          I am extremely open to changing my mind here just show me some convincing evidence. Every tesla is recording all the time so it should be really easy to find some, no?

          I’m sure I’m just a Tesla shill or fanboy, whatever. The truth is I’m just looking for facts. I would like to know why people feel this way and are so afraid of new technology despite overwhelming evidence that it is saving lives.

          • wizardbeard@lemmy.dbzer0.com · 1 year ago

            Wow, that’s sure a lot of text for someone who didn’t read the article.

            The author states that despite having storage plugged in, he was not given the option to save a recording.

          • Zummy@lemmy.world · 1 year ago

            So you are saying that since the author of this article didn’t upload a video of the events he detailed, FSD has absolutely no issues and is completely safe for every person on the road to use all the time? Seems like quite a leap to me, but what do I know? It seems to me that people here want FSD when it’s ready. You want it now, ready or not. I guess that’s where we disagree. And I don’t really think you are open to anyone changing your mind. I think you picked your position, and come hell or high water you’re sticking to it.

            • Ocelot@lemmies.world · edited · 1 year ago

              FSD is not without issues, but yes, lots of people in this thread are implying that FSD is unsafe and causes tons of accidents, and there is absolutely no evidence to back that up. It’s just a “feeling” they have. They believe that it is irresponsible of anyone to use it, and that doing so puts others at unnecessary risk. People genuinely believe that I am putting my family and neighbors at serious risk of harm because I’m irresponsible enough to own and use FSD. All I have asked for this entire time is some kind of evidence that it is dangerous. Anything. Help me understand your view. Please.

              The reason I defend it so much at this point is that it has already been demonstrated to be far safer than the average human driver, and it gets better with every release. With the new V12 and full neural net, it is expected to get far smoother and drive even more like a human, with less code and lower power consumption. We have seen massive improvements in the tech just in the past year, and the rate at which it improves continues to accelerate. It is impossible to count how many lives it has already saved through accident avoidance. We don’t need misinformed people bashing this technology and trying to cast doubt on it and hold it back just because they “feel” a certain way about it. You should absolutely criticize valid concerns, but FFS, please bring some facts and evidence to the table.

              The reason I am confident FSD is safe despite “feelings” is how it’s programmed. For several years prior to even the earliest public beta, the camera and AI system learned how to correctly identify everything on the road: other cars, pedestrians, dogs, cats, babies, telephone poles, traffic cones, whatever. It is in a state now where it is about as accurate as it can be, and any issues it has are about mislabeling one thing as something else (like a car as a truck, etc.). That doesn’t actually matter for self-driving, because effectively the first rule of FSD is: “This is a car, this is a truck, this is a pedestrian, this is a dog; this is where it is, where it is going, and how far away it is… OK? Don’t hit those.” And it doesn’t. Everything else comes secondary.

              It drives like a robot. It obeys traffic laws to a T, and that pisses off other drivers, or freaks out whoever is behind the wheel because the car didn’t do exactly what they would have done in that situation and is therefore “wrong,” so they take over. Often the act of the driver taking over actually puts them in more danger than if they had just let the car continue the maneuver. This shouldn’t be surprising, because humans as a whole SUCK at driving and at making decisions like that. It is sometimes unnecessarily cautious around pedestrians (but honestly, how would you want it to behave?). It might suddenly detect a hazard and swerve to avoid it, possibly moving the car into another unoccupied space. It is fully aware of the space it is occupying and the space it is about to occupy. And it doesn’t hit anything. There are lots of YouTube channels that demonstrate this; they upload regularly, stress-test FSD, and try to get it into trickier and trickier situations, and it never hits anything.

              It acts indecisively sometimes, and waits for larger gaps than necessary in an abundance of caution, but these are the issues that are getting better over time. At no point does it do anything “unsafe,” especially since the wide release of the public beta. Imagine, if you would, a world where all cars are like this. The most dangerous part of driving right now, FSD or not, is other drivers. The more people we have using it who understand it and are comfortable with it, the better it gets and the safer our roads become. I really don’t care how you feel about Elon; he deserves every bit of hate that is sent his way, but FFS please look at FSD for what it is and what it is becoming. If it helps you feel any better, he was not personally responsible for writing a single line of code or designing any component of the system.
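
The “detect first, avoid regardless of label” priority described above can be sketched as a toy example (hypothetical class and function names, not Tesla’s actual code):

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    label: str            # classifier output ("car", "truck", ...) -- may be wrong
    distance_m: float     # current distance to the object, in metres
    closing_speed: float  # m/s; positive means the gap is shrinking

def must_avoid(obj: DetectedObject, safety_margin_s: float = 2.0) -> bool:
    """The avoidance decision uses only position and motion, never the
    label, so mislabeling a truck as a car changes nothing."""
    if obj.closing_speed <= 0:
        return False  # gap is steady or growing; no collision course
    time_to_impact = obj.distance_m / obj.closing_speed
    return time_to_impact < safety_margin_s

# Same geometry, different (wrong) label: same decision.
print(must_avoid(DetectedObject("car", 10.0, 8.0)))    # True  (1.25 s to impact)
print(must_avoid(DetectedObject("truck", 10.0, 8.0)))  # True
print(must_avoid(DetectedObject("car", 100.0, 8.0)))   # False (12.5 s to impact)
```

The point of the sketch is only the argument made in the comment: if geometry, not classification, drives the avoidance decision, then a car/truck mix-up is harmless while a missed detection is not.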

              All I’ve gotten to “back up” the claims that it is dangerous here is three different articles referencing the exact same incident (the Bay Bridge pile-up). The video clearly shows the car coasting (regen) to a stop and just sitting there. Had emergency braking been engaged, the hazards would have turned on and the car would have stopped a lot quicker. FSD has never had any history or incident of completely stopping in a lane. Complaints about “phantom braking” usually describe the car slowing down due to a detected hazard that may or may not be present. There is no evidence of this happening anywhere else: 500k of these cars are on the road, and there are no other similar reports. Is that a fault of the software, or is it more likely some kind of user error?

              From my standpoint, having actually used FSD for several years, I can tell you with complete certainty that the car would never behave like that, and there are far too many red flags in that video to reasonably blame the software. Of course, we will see what plays out once the court case is completed, but in my professional opinion the driver disengaged FSD, allowed the car to come to a complete stop on its own, and did nothing to move it out of the way; it had absolutely nothing to do with the software. I’m 100% open to disagreement on that, and I’m curious what a civilized discussion of it would sound like and what someone else thinks is happening here, but so far it just turns into a flame war and I get called a deluded fanboy, even a liar, and other names. No evidence, no discussion, only anger.

              Again, here is my point. If FSD were as dangerous as others are implying, we should see tons of accidents. Given that every single one of these cars has a constantly running 360-degree dashcam, we should see some evidence, right? Maybe not from this specific case; maybe there’s a valid reason they couldn’t upload it. But surely, with half a million cars on the road, many millions of miles traveled collectively, and more Teslas hitting the road every day, we should at least see something by now, right? There are tons and tons of videos of Teslas avoiding accidents, but nobody wants to mention or talk about those.

              People are focusing all of their energy on one highly suspect negative with nothing to back it up, holding back technology and safety and refusing to have any sort of civilized discussion around it. Their entire perception of how this technology works is restricted to a few clickbait headlines, where they didn’t even bother to read the article. They come here and confidently proclaim that they know for a fact it doesn’t work and will never work, and how dare you even try to bring any facts to the table. It’s as if the discussion is being led by children. It’s not productive, it’s not based on any sort of facts, it doesn’t go anywhere, and we’re all confused and less informed as a result. For example, someone posts an accident that occurred in 2018 as evidence that full self-driving doesn’t work, when FSD didn’t even exist until 2021. If you point that out, there’s no concession, there’s no rebuttal, there’s only anger. Pointing out simple, easily verifiable facts makes you a Tesla fanboy, and therefore any opinion or input you may have on the matter is invalid. “Your mind is already made up and you will never see our point of view!” No, I don’t currently see your point of view, because you don’t have even the most basic facts straight.

              • Zummy@lemmy.world · 1 year ago

                It’s clear from what you wrote that you want FSD to be as good as it can be, and I think we can get there, but we aren’t there yet. You say there haven’t been any reports of accidents with FSD save for one, but I don’t know if that’s true, and evaluating it would require some serious research on my behalf. First, I don’t know the number of people who have a car capable of FSD; you said 500k on the road, but provided no evidence, so I can’t accept that without independent verification. Second, I have no knowledge of how many of those cars actually use FSD. It may be a bunch, but it may not. You don’t say, and I don’t know. Now, there may be far fewer accidents with FSD, but if the number of vehicles on the road in Q1 was 286 million just in the US (https://www.statista.com/statistics/859950/vehicles-in-operation-by-quarter-united-states/ ) and only a small fraction of them use FSD, it would stand to reason that there are far fewer FSD accidents simply because there are far fewer such cars. You also mention that it has become good at detecting objects, and I think it has, but being able to detect objects and being able to avoid accidents when there are 286 million cars on the road using FSD exclusively, every single time the vehicle is in use, are two different things.

                The fact is, I do want FSD to be a thing, but when I see an article written by someone who says that two times they had to take over so the car didn’t kill the driver or others, I start to worry that FSD isn’t ready. And frankly, although there are YouTube channels about electric vehicles that have never brought up accidents, I wonder if they have a reason not to. I’m not sure. Also, I can’t say the big YouTube channels have never talked about this, because I haven’t watched every video they’ve ever posted, and I would have to do that to know if you’re correct.

                I see that you are passionate about FSD, and I think your passion makes you overlook the real discussion going on. People, though certainly not all people, generally want FSD to exist for the reasons you stated, but they want to be sure the cars are safe first. And I get that you take a risk every time you drive a car, but from reading this article I get the sense that FSD isn’t ready to be used by every person with a driver’s license. It sounds like the author knew what to do because he had been driving for some time. If he hadn’t, I think the situation could have been very different.

                You talk about the car not doing exactly what the driver would have done, but in the article’s case it was going to crash. I don’t think anyone would have done that. If the car was able to detect the object, why was it going to crash into it? That is something that needs to be investigated. You argue that people talk about FSD being removed or cancelled because of a feeling that it isn’t good, but I haven’t seen that in droves. I’ve seen several people say that they think FSD needs more testing and a more limited rollout.

                I know I didn’t hit all your points, but they were quite numerous. I want full self-driving, but I want it to be reliable. And if articles like this are being written, I think we just aren’t there yet. Yes, keep it coming, but be real about its current limitations.

      • Chocrates@lemmy.world · 1 year ago

        Self-driving is not there yet, and it may never get there, but you are right: we can save so many lives if we get this right.

        I don’t know if Musk is responsible enough to be the one to get us there, though.

    • sdoorex@slrpnk.net · 1 year ago

      Well, this article is written by FredTesla, who used to mod the TeslaMotors subreddit. Not only did he drink the Kool-Aid, he brewed the damn stuff.

  • sdf05@lemmy.world · 1 year ago

    This is like that show “Upload”; the guy literally gets killed by a car

    • III@lemmy.world · 1 year ago

      You should finish watching that first episode before making such bold statements.

      • Ocelot@lemmies.world · 1 year ago

        I mean, I think it’s still a valid point. The car in the show was sabotaged, and that is definitely something that might become a thing once all cars self-drive, especially once they remove controls like steering wheels.

        There hasn’t been a Tesla FSD hack yet, but it would take spoofing a software update (and spoofing the authentication and certs, etc.)… The attacker would need access to a pretty massive supercomputer to build their own custom self-driving software, and today getting the certs and everything right is next to impossible… but even then it’s only next to impossible, not impossible.
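
The cert check that makes such spoofing so hard can be sketched in miniature (a generic illustration of a pinned-digest update gate, not Tesla’s actual mechanism):

```python
import hashlib

# Hypothetical sketch: the installer refuses any update image whose
# digest doesn't match a manufacturer-pinned value. A real vehicle
# verifies an asymmetric signature (e.g. Ed25519) against a public key
# baked into the hardware, which is what makes forging the certs
# "next to impossible."

TRUSTED_SHA256 = hashlib.sha256(b"official-firmware-image").hexdigest()

def accept_update(image: bytes) -> bool:
    """Install only if the image hashes to the pinned trusted digest."""
    return hashlib.sha256(image).hexdigest() == TRUSTED_SHA256

print(accept_update(b"official-firmware-image"))   # True
print(accept_update(b"malicious-firmware-image"))  # False
```

An attacker who can’t produce a valid signature (or, in this toy version, a matching digest) never gets their custom software installed, which is why the comment frames the barrier as the signing chain rather than the driving code itself.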

          • jabjoe@feddit.uk · 1 year ago

            “Attack surface” is the term you want. Big software means a big attack surface. So keep code lean, for security as well as efficiency.

          • Ocelot@lemmies.world · 1 year ago

            There are still a lot of other layers that would need to be compromised beyond the cert for such an attack to even be possible. Even so, I suspect that when such an attack does happen, it will probably be to steal cars: your car would just wake up in the middle of the night and drive itself somewhere to be cut up for parts. A safety issue is less likely, since it’s so easy to take over control of the car.

        • 8ender@lemmy.world · 1 year ago

          Don’t even need sabotage. You already share the road with cars that someone repaired under a tree with the cheapest parts they could find.

          • reddithalation@sopuli.xyz · 1 year ago

            oh look, more anti-right-to-repair sentiment.

            no, cars repaired by people other than the manufacturer won’t kill you

            • 8ender@lemmy.world · 1 year ago

              I’ve been repairing cars for over 15 years. There’s a massive spectrum of quality for almost any aftermarket replacement part. Literally the same part can range from $50 to $400, and the only difference is quality and durability.

              Sometimes the cheap part is fine, sometimes they cause weird problems. Especially electrical parts.

              • reddithalation@sopuli.xyz · 1 year ago

                yeah sure, but that is unlikely to kill you or someone else, and DIY repair is almost always good for the consumer

    • jabjoe@feddit.uk · 1 year ago

      Well, hold on there: he survived the crash, and would probably have been OK. It was the upload that killed him.

      • sdf05@lemmy.world · 1 year ago

        Yeah, my bad 🤣 I meant that the car technically cut his life short 😔

  • Mockrenocks@lemmy.world · 1 year ago

    Frankly, it speaks incredibly poorly of the NHTSA that this kind of behavior is allowed. “Beta testing” a machine-learning driver-assistance feature on active highways at 70+ miles per hour is a recipe for disaster. Calling it Full Self-Driving while not putting guardrails on its behavior is false advertising as well as just plain dangerous.

  • fosforus@sopuli.xyz · edited · 1 year ago

    I like my Tesla, but there’s no way I’ll be switching that thing on. They’re even calling it a beta; what the fuck do people think that means?

  • megalodon@lemmy.world · 1 year ago

    FFS. He was testing a beta update at 73 miles per hour. Is he really expecting sympathy?

    • spezz@lemmynsfw.com · 1 year ago

      Maybe it shouldn’t be released for real-world use with such major bugs, then. Don’t give me the crap that iTs DiFfErEnT because Tesla is a “technology company” either. It’s a car; safety features on it should work damn near 100% of the time before it is released.

    • SomeRandomWords@lemmy.blahaj.zone · 1 year ago

      I thought all FSD updates were beta updates? Did I miss the announcement of FSD going GA and being stable?

      If that’s the case, then yeah I probably wouldn’t test run a new update on the highway first. But I also have no idea if this issue happens at lower speeds as well.

        • SomeRandomWords@lemmy.blahaj.zone · 1 year ago

          Yes, 100%. Anyone is a fool to use Tesla “FSD Beta” pretty much anywhere. But Tesla markets it as totally safe to use anywhere and everywhere (but especially highways), so there’s a point where you have to stop calling everyone who owns a Tesla a fool and acknowledge that the common denominator is Tesla, not just the owner’s foolishness.

          • megalodon@lemmy.world · 1 year ago

            I didn’t call everyone that owns a Tesla a fool. I questioned whether someone who decides to risk their life to test a feature still in beta deserves sympathy.

  • Ocelot@lemmies.world
    link
    fedilink
    English
    arrow-up
    13
    arrow-down
    71
    ·
    1 year ago

Electrek has a long history of anti-Tesla clickbait. Take this with a grain of salt.

Teslas come factory-equipped with a 360-degree dashcam, yet we never see any footage of these alleged incidents.

    • silvercove@lemdro.idOP
      link
      fedilink
      English
      arrow-up
      38
      arrow-down
      6
      ·
      1 year ago

Are you kidding me? YouTube is full of Tesla FSD/Autopilot doing batshit crazy things.

    • kinther@lemmy.world
      link
      fedilink
      English
      arrow-up
      17
      arrow-down
      2
      ·
      1 year ago

      Wishful thinking that Tesla would publicly distribute footage of an accident caused by one of their cars…

      • Ocelot@lemmies.world
        link
        fedilink
        English
        arrow-up
        7
        arrow-down
        10
        ·
        edit-2
        1 year ago

It's saved onto a thumb drive, and any user can pull it off and post the footage anywhere. It never gets uploaded to Tesla, only snapshots and telemetry.

lol, the anti-Tesla crew will downvote even the most basic facts.

        • kinther@lemmy.world
          link
          fedilink
          English
          arrow-up
          4
          ·
          1 year ago

          But is it technically the user’s data, or is there some clause in Tesla car ownership that says it is Tesla the company’s data?

          Forgive me I’m ignorant of the fine details. I purchased a Chevy Bolt but had been looking into a Tesla as an alternative until Elon tried to be the super-cool Twitter guy.

          • Ocelot@lemmies.world
            link
            fedilink
            English
            arrow-up
            2
            ·
            1 year ago

It's definitely the user's data. There are a few Tesla dashcam channels out there loaded with footage of other drivers acting like idiots.

    • BargsimBoyz@lemmy.world
      link
      fedilink
      English
      arrow-up
      17
      arrow-down
      4
      ·
      1 year ago

      Given your posts and rampant Tesla fanboyism, I honestly wouldn’t be surprised if you’re Elon himself just anxiously trying to save face.

Then again, Elon would just publicly spout misinformation about it all, so it probably isn't. Still, it's surprising that people are so obsessed with Tesla that they can't take the bad with the good.

      • Ocelot@lemmies.world
        link
        fedilink
        English
        arrow-up
        6
        arrow-down
        10
        ·
        1 year ago

All I'm asking for is some evidence of the bad. Nobody can provide it. It really shouldn't be that hard.

          • Ocelot@lemmies.world
            link
            fedilink
            English
            arrow-up
            2
            arrow-down
            4
            ·
            edit-2
            1 year ago

Please let me know where I stated anything inaccurate in my comment about the single incident that has been dug up across 500k FSD cars and millions of miles traveled self-driving.

Also, let's please keep this civil and skip the name-calling. I hate Elon as much as anyone else, and he deserves pretty much all the hate he gets. But that doesn't change the facts. It's not like he wrote even a single line of FSD code, or designed or built any of the cars himself.

    • Astroturfed@lemmy.world
      link
      fedilink
      English
      arrow-up
      13
      arrow-down
      2
      ·
      1 year ago

      Ah yes, there’s no readily available footage of the dead bodies flying into the street or being crushed under the wheels so it’s made up. Of course.

      • Ocelot@lemmies.world
        link
        fedilink
        English
        arrow-up
        1
        arrow-down
        15
        ·
        1 year ago

Not all accidents are that violent. I'd even accept a simple fender bender. Those should be pretty common if FSD is as dangerous as a lot of people are implying, right?

        • Astroturfed@lemmy.world
          link
          fedilink
          English
          arrow-up
          6
          arrow-down
          1
          ·
          1 year ago

          Look man, I don’t like children either but wanting more child mowing cars out on the road is pretty twisted.

    • Dr. Dabbles@lemmy.world
      link
      fedilink
      English
      arrow-up
      13
      arrow-down
      4
      ·
      1 year ago

      Bud, we’ve seen literally thousands of videos of this happening, even from the Tesla simps. You’re seven years behind on your talking points.

        • Dr. Dabbles@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          ·
          edit-2
          1 year ago

          The first public release was much later than the smaller beta, which I had access to. And my reference to seven years was Josh Brown being killed by autopilot in 2016.

      • Ocelot@lemmies.world
        link
        fedilink
        English
        arrow-up
        7
        arrow-down
        11
        ·
        1 year ago

        Can you link a few? Something where FSD directly or indirectly causes an accident?

        • Dr. Dabbles@lemmy.world
          link
          fedilink
          English
          arrow-up
          11
          arrow-down
          3
          ·
          1 year ago

          You’re working very hard in this thread to remain in the dark. You could take two seconds to look for yourself, but it seems like you won’t. Hell, they performed a recall because it was driving through stops. Something it’ll still do, of course, but they performed a recall.

          • Astroturfed@lemmy.world
            link
            fedilink
            English
            arrow-up
            9
            ·
            1 year ago

Elon literally had to hit the brakes manually in a livestream of the self-driving tech because the car was about to go straight through a red light. Like less than a week ago… SOOOO safe, all the news stories of it killing people are fake!

                • CmdrShepard@lemmy.one
                  link
                  fedilink
                  English
                  arrow-up
                  1
                  arrow-down
                  1
                  ·
                  1 year ago

                  And that’s an article about Autopilot which is a completely separate system. For someone with such strong opinions, you sure seem to lack even a basic understanding of the technology that you’re discussing here, but I’m sure you’ll just pull out more insults and keep making references to your current obsession, Musk, as if that makes your argument any more credible or factual.

            • Ocelot@lemmies.world
              link
              fedilink
              English
              arrow-up
              2
              arrow-down
              9
              ·
              edit-2
              1 year ago

The early alpha build that isn't part of any public release? That video? The one with the known regression in the Model S?

That video was a demo of the new FSD beta 12 software, the first build in which a neural network is in complete control of the car, resulting in a massive reduction in code and much smoother behavior overall. Did I mention the part where it's unreleased to the public? Maybe there's a reason for that?

Other than that, the car performed flawlessly for the entire 40-minute drive.

          • Ocelot@lemmies.world
            link
            fedilink
            English
            arrow-up
            4
            arrow-down
            9
            ·
            edit-2
            1 year ago

The recall was most definitely not for “driving through stops”. It was to fix the behavior of doing a “rolling stop”, which is something 99.5% of drivers do, and which is how it learned to do that. Where do you see that it still doesn't come to a complete stop at stop signs?

https://www.forbes.com/sites/bradtempleton/2022/02/01/feds-make-tesla-remove-rolling-stops-its-a-terrible-decision/?sh=67b344722111

I'm not trying to remain in the dark here, I'm just presenting facts. I'm very open to changing my mind on this situation entirely; just give me the facts. You said there were thousands of these videos, and I'm just asking for evidence. Instead I just get downvoted and nobody posts any of it.

I'm an AI professional and have been an FSD beta tester for almost 3 years with tens of thousands of miles logged. How can I possibly be the one “in the dark” here?

            • Dr. Dabbles@lemmy.world
              link
              fedilink
              English
              arrow-up
              7
              arrow-down
              2
              ·
              1 year ago

              “rolling stop”

              Or put another way by someone not desperate for Elon’s attention, not stopping. Driving through stops.

              Where do you see that it still does not make a complete stop at stop signs?

              Signs, lights, it’ll gladly not stop for any of them. Where do I see it? Real life. Actually owning one of these foolish gadgets for 5 years. Where do you see your examples?

              Also, don’t send me brad templeton opinion pieces, he’s a complete hack and has outed himself as such many times. He does have a nice video explaining what he thinks of Tesla stans like you though. Did you watch that one, or do you only link his material when it’s convenient?

              I’m just presenting facts

              No you aren’t, you’re presenting a curated social media marketing campaign. Congrats, you fell for the ad. Do you think that beer is going to make you more attractive, too?

              I’m very open to change my mind on this situation entirely

              Ok. Tell us what evidence it would take for you to completely change your mind on this and realize Elon is a hack, running a dangerous con with low quality software being released to cars in the US and Canada? What evidence would you require to change your mind and accept that Tesla doesn’t properly test releases before they go out to customers?

              I’m an AI professional

              This has absolutely zero bearing on anything except that you’re probably extremely susceptible to Elon’s outright lies.

              have been an FSD beta tester for almost 3 years

              Doubt.

              How can I possibly be the one “in the dark” here?

              The term is “delusion”.

              • Ocelot@lemmies.world
                link
                fedilink
                English
                arrow-up
                2
                arrow-down
                7
                ·
                1 year ago

Let's not resort to name-calling or personal attacks here. You stated there are “thousands of videos” of FSD-related accidents; I only asked for a few examples. Please tell me where you're getting this information. Help me change my mind.

Do you understand what a “rolling stop” is? It's when you don't come to a complete stop at a stop sign: you slow down to 0.5 or 1 mph, check both ways, and move through. It has been studied time and time again that practically NOBODY on the road comes to a full and complete stop at stop signs. That's how the FSD beta behaved in earlier releases, because that's how it learned to drive. NHTSA said the cars had to come to a complete stop, so Tesla fixed it. You again claimed that Teslas still roll through stop signs, and I'm once again asking where you got that information.

“Actually owning one of these foolish gadgets for 5 years”: I'm guessing you're trying to say you own a Tesla? You clearly don't have FSD, because if you did, you'd know that it makes full stops 100% of the time. I also have doubts that you actually own a Tesla, because if someone spent 40-60 grand on something they consider a “foolish gadget”, why on earth would they hold on to it for so long? Just sell it and get something else, move on with your life, and don't bash people who like their cars.

I've now asked you three times to present evidence: video evidence of FSD doing dangerous things. Given that all Teslas have 360-degree dashcams that are constantly recording, and that we live in an age where sharing video is trivial, that really shouldn't be a big ask if FSD is as dangerous as you're implying. These incidents should be happening daily. That is what would change my mind. What would change yours?

I bought my Model Y in 2020 with FSD and emailed Tesla for early beta access based on my engineering experience and the part of the country I'm located in. They granted it almost a year later, and I've been driving with it almost every day since. Why on earth would you doubt that? Do you need some kind of evidence?

                • Flying Squid@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  5
                  ·
                  1 year ago

                  Do you understand what a “rolling stop” is?

                  I sure do. I got pulled over for doing one. Because they’re not legal.

                • Dr. Dabbles@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  6
                  arrow-down
                  2
                  ·
                  1 year ago

                  Lets not resort to name calling or personal attacks here

                  If you’re going to be a liar, I’m going to call you one.

                  Someone has already provided you samples and you contorted yourself trying to deny their existence still. That shows the caliber of person you are.

                  Do you understand what a “rolling stop” is?

                  Do you understand what a red light and a stop sign are? See, traffic control devices are to be obeyed properly, and creating software that intentionally breaks the law is irresponsible at best. Only a clown would attempt to defend this. Meanwhile, you’re ignoring Elon’s own video from a week ago because it instantly disproves your insane position.

                  I’m guessing you’re trying to say you own a Tesla?

                  I did. I learned my lesson and am a proud one-and-done former tesla owner.

                  You clearly don’t have FSD

                  I did. Swing and a miss.

                  it makes full stops 100% of the time

                  Except, you know, the fucking recall proves it didn’t. And it still doesn’t after the recall. And of course, it misses traffic control devices frequently, ignores them at speed, attempts to pull through them when stopped, etc. Please, do yourself a favor and end this now. Lying to me isn’t going to work.

                  if someone spends 40-60

                  2018 P3D with performance package. More like 70+, with EAP from the factory, and the $2k FSD upgrade when Elon was busy being an idiot about pricing. If you have any questions for someone that’s actually owned one, I’d be glad to answer them for you.

                  why on earth would they hold on to it for so long?

                  Waiting for my replacement.

                  don’t bash people who like their cars.

                  I didn’t. I bashed you for being a liar.

                  I’ve asked you, now 3 times now to present evidence.

                  I asked what evidence would change your mind, and I see you entirely dodged that question. Because there is none. There’s nothing that would change your mind, because your mind is made up. It’s religion, and you don’t convince someone their religion is nonsense. I’m not surprised, of course. All liars behave like this- they pretend there’s something that could completely shift their world view, and change a core piece of their identity… like simping for Musk. But deep down, they know. There’s no such evidence. The racism, the sexual assaults, the financial grift, the hard right bullshit, the transphobia and homophobia, none of that changes your mind. The untested nature of AP and FSD, the release of “smart” summon that immediately started crashing into things, the fact they sent engineers down to Chuck Cook’s intersection for three months to program a single behavior. None of that sinks in when you believe in the religion of Tesla.

                  I bought my model Y in 2020

                  lmao, so absolutely didn’t have FSD longer than me. Delightful. Hysterical and delightful.

    • LibertyLizard@slrpnk.net
      link
      fedilink
      English
      arrow-up
      6
      arrow-down
      4
      ·
      1 year ago

      Hilariously I’ve also seen them accused of a pro-Tesla bias. Personally I think they are pretty balanced.

      • Dr. Dabbles@lemmy.world
        link
        fedilink
        English
        arrow-up
        1
        ·
        1 year ago

They are for sure not balanced. Alfred might have become more realistic about Elon and his bullshit once it became clear he'd never get his guaranteed Roadster. That doesn't mean he's balanced.

            • LibertyLizard@slrpnk.net
              link
              fedilink
              English
              arrow-up
              7
              arrow-down
              1
              ·
              edit-2
              1 year ago

Honestly, the quality of journalism in this article is pretty low. Some of the points are valid, but most are just nitpicks about the little opinion pieces at the ends of the articles. I don't find those particularly valuable, and they sometimes contain bad takes as pointed out here, but that's not an issue of factual reporting. So the worst they've identified is a few minor omissions, which, sure, but if you write thousands of articles that's going to happen.

And by the way, this article is making the case that Electrek is deliberately biased towards Tesla, not away from them. So if anything it undermines your point.

I think the scandal about car referrals was pretty suspicious, but again, when you look at their reporting it comes across as pretty balanced. Perhaps you could argue they talk too much about Tesla, but they cover both the good and the bad. And I'd say almost everyone in America has been talking about Tesla too much for quite some time.