• silvercove@lemdro.id (OP)

    Are you kidding me? YouTube is full of videos of Tesla FSD/Autopilot doing batshit crazy things.

      • zeppo@lemmy.world

        Musk just did a 20-minute video that ended with it trying to drive into traffic.

          • zeppo@lemmy.world

            The video ended when he made an “intervention” at a red light. I’m not watching whatever link that is because I’m not a masochist.

            • Ocelot@lemmies.world

              Here’s the specific timestamp of the incident you mentioned, in case you want to actually see it: https://youtu.be/aqsiWCLJ1ms?t=1190 The car wanted to move through the intersection on a green left-turn arrow. I’ve seen a lot of human drivers do the same. In any case, it’s fixed now and was never part of any public release.

              The video didn’t end there; that was near the middle. What you’re referring to is a regression, specific to the HW3 Model S, where it failed to recognize one of the red lights. Now I’m sure that sounds like a huge deal, but here’s the thing…

              This was a demo of a very early alpha build of FSD 12 (the current public release is 11.4.7), which uses a completely new and more efficient way of using the neural network for driving, and the issue has already been fixed. It is not released to anyone outside a select few Tesla employees. Other than that it performed flawlessly for over 40 minutes in a live demo.

              • midorale@lemmy.villa-straylight.social

                Other than that it performed flawlessly for over 40 minutes in a live demo.

                I get that this is an alpha, but the problem with full self-driving is that this level of reliability is way worse than what users want. If ChatGPT gave you perfect information for 40 minutes (it doesn’t) and then huge lies once, we’d still be using it everywhere, because you can validate the lies.

                With FSD, that failure threshold means a lot of people would have terrible accidents. No amount of perfect driving outside that window would make you feel very happy.
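
                To make that concrete, here’s a rough back-of-envelope sketch in Python. Every number in it is a made-up assumption (fleet size, daily driving time, miss rate), not measured Tesla data; it just shows why one intervention per 40 minutes doesn’t scale to a fleet:

                ```python
                # Back-of-envelope: why "one intervention per 40 minutes" doesn't scale.
                # Every number below is a hypothetical assumption, not measured data.

                MINUTES_PER_INTERVENTION = 40   # rate seen in the demo, assumed typical
                FLEET_SIZE = 400_000            # hypothetical number of cars running FSD
                DRIVE_MINUTES_PER_DAY = 60      # hypothetical average driving per car

                fleet_minutes = FLEET_SIZE * DRIVE_MINUTES_PER_DAY        # 24,000,000
                interventions = fleet_minutes / MINUTES_PER_INTERVENTION  # 600,000

                print(f"Expected interventions: {interventions:,.0f} per day")
                # Even if only 1 in 1,000 missed interventions ended in a crash:
                print(f"Crashes at a 0.1% miss rate: {interventions / 1000:,.0f} per day")
                ```

                Even with generous assumptions, that’s hundreds of potential crashes a day, which is why “flawless except once” isn’t good enough for driving.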

                • Ocelot@lemmies.world

                  You realize that FSD is not an LLM, right?

                  If it’s “way worse”, then where are all the accidents? All Teslas have 360° dashcams. Where are all the accidents?!

                  • midorale@lemmy.villa-straylight.social

                    I didn’t say FSD was an LLM; my comment was implementation-agnostic. My point was that drivers are less forgiving of what programmatically seems like a small error than someone who is just trying to generate an essay.