I’m sure you’re just going to downvote this and move on without reading but I’m going to post it anyway for posterity.
First, a little about me. I am a software engineer by trade with expertise in cloud and AI technologies. I have been an FSD beta tester since late 2020 with tens of thousands of incident-free miles logged on it.
I’m familiar with all of these incidents. It’s great that they’re in chronological order; that will be important later.
I need to set some context and history, because many people get confused about the capabilities of Autopilot versus FSD. Autopilot and FSD (Full Self-Driving) are not the same thing. FSD is a $12,000 option on top of any Tesla, and no Tesla built prior to 2016 has the hardware capability to run FSD.
The second historical point is that FSD did not have any public release until mid-2022, with some waves of earlier releases going to the safest drivers starting in mid-2021. Prior to that it was exclusive to Tesla employees and a select few trusted beta testers in specific areas. Any of the issues in this article prior to mid-2021 are completely irrelevant to the topic.
Tesla’s Autopilot system is an LKAS (lane keep assist system), the same kind of feature offered by Honda (Honda Sensing), Nissan (Pro Pilot Assist), Subaru, Cadillac, etc. Its capabilities are limited to keeping you in your lane (via a front-facing camera) and maintaining distance to the car in front of you (via radar, or cameras in later models). It does not automatically change lanes. It does not navigate for you. It does not make turns or take exits. It does not understand most road signs. Until 2020 it did not even know what a red light was. It is a glorified cruise control and has always been presented as such.

Tesla has never advertised it as any sort of “hands-off” system where the driver does not need to pay attention. They do not allow the driver to take their attention off the road in FSD either: it requires hands on the wheel with constant torque, as well as eyes on the road (via an interior camera), in order to work. If you are caught not paying attention enough times, the system will disengage, and with enough violations it will kick you out of the program entirely.
OK, now that being said, let’s dig in:
November 24, 2022: FSD malfunction causes 8-car pile-up on Bay Bridge
I’m from the area and have driven this exact spot hundreds of times on FSD, and I have never experienced anything even remotely close to what is shown here.
“Allegedly” with FSD engaged
Tesla FSD “phantom” braking does not behave like this, and never has in the past. Teslas have 360 degree vision and are aware of traffic in front of and behind them.
Notice at the beginning of the video that this car was in the process of a lane change. This introduces a couple of possibilities as to what happened here, namely:
Teslas do have a feature under Autopilot/FSD where, after multiple ignored warnings for the driver to pay attention, the car will slow down, pull over to the shoulder, and stop. This particular part of the Bay Bridge does not have a shoulder, so it would have stopped where it was. That seems unlikely here, though: the neural networks are very capable of telling a shoulder from an active lane of traffic, and even with Tesla’s massive fleet of vehicles on FSD there are no other recorded instances of this happening anywhere else.
This particular spot on the Bay Bridge eastbound has a very sudden and sharp exit to Yerba Buena Island. What I think happened is that the driver was aiming for this exit, saw that they were about to miss it, and tapped the brake and put on the turn signal without realizing that they had just disengaged FSD. The car then engaged regen braking and came to a full stop.
When a Tesla comes to a full stop automatically (an emergency stop), it puts the hazards on automatically. This has been a feature since the v1 Autopilot days. This car’s hazards do not come on after the stop.
What seems especially weird to me is that the driver let the car sit there at a full stop while traffic piled up behind them. In FSD you are always in control of your own car, and all it would have taken to get moving again is a tap of the accelerator pedal. FSD will always relinquish control of the car to you if you tap the brakes or move the steering wheel hard enough. The exception would be some mechanical issue that brought the car to a stop and prevented it from moving, in which case this is not the fault of the FSD software.
Looking at how gradually the car slowed down, this very clearly looks like regen braking, not emergency braking. I’m almost positive this means that FSD was disengaged completely.
We don’t have all the facts on this case yet, and I’ll be anxious to see how it plays out in court, but there are definitely a lot of red flags that have me questioning what actually happened here. I doubt FSD had anything to do with it.
If my earlier point is true, this is actually an instance of an accident caused by the driver disengaging self-driving. The car would have been much safer if the driver wasn’t even there.
April 22, 2022: Model Y in “summon mode” tries to drive through a $2 million jet
This one is a favorite among the Tesla hate community. Understandably so.
Smart Summon has nothing to do with FSD or even Autopilot. It is a party trick, to be used under very specific supervised conditions.
Smart Summon relies exclusively on the front camera and the ultrasonic sensors.
While Smart Summon is engaged, the user still has full control over their car via the phone app. If the car does anything unexpected, you only need to release your finger from the button and the car stops immediately. The “driver” did not do this and was not supervising the car; the car did not see the jet because the jet sat entirely above the ultrasonic sensors; and, as I’m sure you can understand, the object recognition isn’t exactly trained on parked airplanes.
The app and the car remind the driver each and every time it is engaged that they need to be within a certain range and within eyesight of the car to use it. If you remote-control your car into an obstacle and it causes an accident, it’s your fault, period.
Tesla is working on a new version of smart summon which will make this feature more useful in the future.
February 8, 2022: FSD nearly takes out bicyclist as occupants brag about system’s safety
I suggest actually watching the video here. The claim is highly at odds with what is actually in the video, but the video is just over an hour long, so I bet most people don’t bother watching it.
“It wouldn’t have hit them, it definitely wouldn’t have hit them. Do we need to cut that?” “No, you can keep it in”
If you look at what was happening on the car’s display, it detected someone entering the crosswalk and stepping out into traffic on the left side. The car hit the brakes, sounded an alert, and swerved to the right. There was a bicycle in front of where the car swerved, but at no point was it about to “nearly take out a bicyclist”. It definitely overreacted here in the name of safety, but at no point was anyone in danger.
Relatively speaking this is a very old version of FSD software, just after the first wave of semi-public release.
December 6, 2021: Tesla accused of faking 2016 Full Self Driving video
lol
March 17, 2021: Tesla on Autopilot slams into stationary Michigan cop car
Now we’re getting into pre-FSD Autopilot. See my comments above about the capabilities of Autopilot, and feel free to compare these to other cars’ LKAS systems. You will see that there are still lots of accidents across the board even with LKAS. That is because it is an assist system: the driver is still fully responsible for, and in control of, the car.
June 1, 2020: Tesla Model 3 on Autopilot crashes into overturned truck
Again, pre-FSD. If the driver didn’t see the overturned truck and disengage to stop, then I’m not sure how anyone expects a basic LKAS system to do that for them.
March 1, 2019: NHTSA, NTSB investigating trio of fatal Tesla crashes
This one involves a fatality, unfortunately. However, the car was not self-driving. There is something else very important to point out here:
The feature that allows Teslas to change lanes automatically on the freeway (Navigate on Autopilot) was not released until a year after this accident happened. That means that if AP was engaged in this accident, the driver deliberately instructed the car, by engaging the turn signal, to merge into that truck.
May 7, 2016: First known fatality involving Tesla’s Autopilot system
Now we’re getting way back into the V1 Autopilot systems, which weren’t even made by Tesla. V1 Autopilot used a third-party system from Mobileye and is even less capable than V2 Autopilot.
So, there we go. FSD has been out to the public for a few years now on a massive fleet of vehicles that has collectively driven millions upon millions of miles, and this is the best we’ve got in terms of a list showing how “dangerous” it is? That is pretty remarkable.
so can you provide a link of an accident caused by FSD?
Musk just did a 20 minute video that ended with it trying to drive into traffic.
this one? Where does it drive into traffic? https://youtu.be/aqsiWCLJ1ms?si=D9hbZtbC-XwtxjpX
The video ended when he made an “intervention” at a red light. I’m not watching whatever link that is because I’m not a masochist.
Here’s the specific timestamp of the incident you mentioned, in case you want to actually see it: https://youtu.be/aqsiWCLJ1ms?t=1190 The car wanted to move through the intersection on a green left-turn arrow. I’ve seen a lot of human drivers do the same. In any case, it’s fixed now and was never part of any public release.
The video didn’t end there; that was near the middle. What you’re referring to is a regression, specific to the HW3 Model S, that caused it to fail to recognize one of the red lights. Now, I’m sure that sounds like a huge deal, but here’s the thing…
This was a demo of a very early alpha release of FSD 12 (the current public release is 11.4.7), representing a completely new and more efficient way of using the neural network for driving, and the issue has already been fixed. It is not released to anyone outside of a select few Tesla employees. Other than that, it performed flawlessly for over 40 minutes in a live demo.
I get that this is an alpha, but the problem with full self-driving is that this is way worse than what users want. If ChatGPT gave you perfect information for 40 minutes (it doesn’t) and then one huge lie, we’d still be using it everywhere, because you can validate the lies.
With FSD, that threshold means a lot of people would have terrible accidents. No amount of perfect driving outside of that window would make you feel very happy.
You realize that FSD is not an LLM, right?
If it’s “way worse”, then where are all the accidents? All Teslas have 360-degree dashcams. Where are all the accidents?!
I didn’t say FSD was an LLM. My comment was implementation-agnostic. My point was that drivers are less forgiving of what programmatically seems like a small error than someone who is just trying to generate an essay.
Maybe so, but from where I stand the primary goal should be “better driver than a human”, which is an incredibly low bar. We are already quite a ways past that, and it’s getting better with every release. FSD today is nearly 100% safe; most of the complaints now are about how it drives like a robot by obeying traffic laws, which confuses a lot of other drivers. There are still some edge cases yet to be ironed out, like really heavy rain, some icy conditions, and snow. People are also terrible drivers in those conditions, so it’s not a surprise. It will get there.
it has to perform flawlessly 99.999999% of the time. The number of 9s matters. Otherwise, you are paying some moron to kill you and perhaps other people.
OK, so I’m totally in agreement, but 99.999999% is one accident per hundred million miles traveled. I don’t think there should be any reasonable expectation that such a technology could ever get that far without real-world testing, which is precisely where we are now. Maybe at 4 or 5 9s currently.
If you do actually want that level of safety, which, let’s be honest, we all do, or ideally 100% safety, how would you propose such a system be tested and deemed safe if not how it’s currently being done?
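To make the arithmetic above concrete, here’s a minimal sketch in Python, assuming (my assumption; the thread never pins down the unit) that “performing flawlessly X% of the time” is counted per mile driven:

```python
# Back-of-envelope for "the number of 9s matters", assuming reliability is
# counted per mile driven (an assumption, not something the thread defines).

def miles_per_incident(nines: int) -> float:
    """Expected miles between failures for a per-mile success rate with `nines` nines."""
    failure_rate = 10.0 ** -nines  # e.g. 8 nines (99.999999%) -> 1e-8 failures per mile
    return 1.0 / failure_rate

for nines in (4, 5, 8):
    print(f"{nines} nines -> one incident every {miles_per_incident(nines):,.0f} miles")

# 4 nines -> one incident every 10,000 miles
# 5 nines -> one incident every 100,000 miles
# 8 nines -> one incident every 100,000,000 miles  (the "hundred million" above)
```

On that per-mile reading, the claimed 4 or 5 nines works out to one incident every 10,000 to 100,000 miles, which is at least the same scale as the “tens of thousands of incident-free miles” mentioned elsewhere in this thread.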
deleted by creator
Your posts here show you’re not interested in reality, but I’ll leave a link anyway
https://www.motortrend.com/news/tesla-fsd-autopilot-crashes-investigations/
Excited to see your response about how this is all user error.
Interesting, you wrote an entire dissertation on why you think this is all a false flag about Full Self Driving, but it seems to be mostly anecdotal or what you think is happening. Being a “software by trade” isn’t enough to face the facts that something fishy is 100% going on with Tesla’s autopilot system.
“The last time NHTSA released information on fatalities connected to Autopilot, in June 2022, it only tied three deaths to the technology. Less than a year later, the most recent numbers suggest 17 fatalities, with 11 of them happening since May 2022. The Post notes that the increase in the number of crashes happened alongside a rapid expansion of Tesla’s “Full Self-Driving” software from around 12,000 vehicles to almost 400,000 in about a year”
https://www.caranddriver.com/news/a44185487/report-tesla-autopilot-crashes-since-2019/#
You claim the timeline is important here, and this is all post-2022.
Tbh the other side is also anecdotal. There’s no stats here.
What’s fishy about it? You realize 40,000 people die every year from car accidents, meaning 110 die every single day, and you’re referencing 17 fatalities spread out over a few years as some big crisis. This tech (from any manufacturer) isn’t going to prevent 100% of accidents, and there’s not much you can do when drivers willingly drive their car into the side of a semi just like they did before this technology existed.
I won’t argue that AP, FSD, or any other system doesn’t have its issues, but most of these responses are overblown sensationalism.
I am not a “Software by trade”; that was a typo. Believe it or not, I wrote that entire thing on mobile.
Correlation does not equal causation. Tesla sold far more vehicles in the past two years than ever before. Also, in 2019, 2020, and part of 2021, not a lot of people were driving due to the pandemic.
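To put numbers on that correlation point, here’s a rough back-of-envelope in Python using only the figures quoted from the Car and Driver piece above. This is my reframing, not a safety analysis: it ignores miles driven and time at risk, and it mixes Autopilot-linked fatality counts with FSD install counts, exactly as the quoted passage does:

```python
# Rough normalization of the figures quoted above (Car and Driver / The Post).
# Caveats: ignores miles driven and time at risk, and mixes Autopilot-linked
# fatality counts with FSD install counts, just as the quoted passage does.

before = {"fatalities": 3, "vehicles": 12_000}    # tied to the tech as of the June 2022 report
after = {"fatalities": 11, "vehicles": 400_000}   # fatalities since May 2022, fleet ~a year later

for label, d in (("before expansion", before), ("after expansion", after)):
    per_100k = d["fatalities"] / d["vehicles"] * 100_000
    print(f"{label}: {per_100k:.1f} fatalities per 100,000 vehicles")

# before expansion: 25.0 fatalities per 100,000 vehicles
# after expansion: 2.8 fatalities per 100,000 vehicles
```

On that crude per-vehicle view the raw count rose while the rate per vehicle fell, which is the correlation-versus-causation point being made here; whether that holds up against miles actually driven is exactly the kind of data neither side of this thread has.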
And yes, a lot of what I wrote about the first incident was anecdotal, or what I think is happening. Importantly, though, it is what I think is happening as someone with years and tens of thousands of miles of experience using the FSD beta. I do not have the facts, and, just as importantly, neither do you. I am interested to see what comes out of that court case, but from where I sit I do not think FSD was involved at all.
Please let me know where I have misrepresented facts, I will either correct them or cite sources.
Again, Teslas come with a factory-installed 360-degree dashcam that records all the time. Where are all the videos of these FSD-related incidents?
deleted by creator
deleted by creator
One of many many examples: https://www.businessinsider.com/tesla-stops-tunnel-pileup-accidents-driver-says-fsd-enabled-video-2023-1?international=true
Tesla has a huge problem with phantom braking.
See my huge post about that very accident elsewhere in this thread. Do you have any other “many, many examples”?
Here is more: https://www.motortrend.com/news/tesla-fsd-autopilot-crashes-investigations/
How many do you want?
FFS, that is the exact same article again. Please read my other comment (the huge one) and let me know if anything doesn’t make sense or you find anything factually inaccurate.
deleted by creator