The March 23rd death of a Tesla Model X driver in Mountain View, California prompted the National Transportation Safety Board to probe why the vehicle, driving in Autopilot mode, left its lane and collided with a concrete lane barrier on a clear day. The impact killed 38-year-old Walter Huang, an Apple engineer.
In the wake of the crash, the safety agency booted Tesla from the investigation after the automaker released details relating to the vehicle’s (and victim’s) actions in the moments leading to the crash. We now have the NTSB’s preliminary report on what happened before, during, and after the collision.
As we detailed at the time, Huang’s Tesla was traveling southbound on US-101 in the high-occupancy vehicle (HOV) lane, approaching the State Highway 85 interchange. The exit lane was to Huang’s left approaching the split, and a paved gore area opens up between the two lanes as the SH-85 ramp branches off. According to the NTSB, Huang’s cruise control was set at 75 mph on the 65 mph roadway, with Autopilot functions (traffic-aware cruise control and autosteer lane-keeping) turned on four separate times during the 32-minute trip.
Autopilot was engaged for the last 18 minutes, 55 seconds of the journey.
“As the Tesla approached the paved gore area dividing the main travel lanes of US-101 from the SH-85 exit ramp, it moved to the left and entered the gore area,” the preliminary report states. “The Tesla continued traveling through the gore area and struck a previously damaged crash attenuator at a speed of about 71 mph.”
Here’s the NTSB’s breakdown of what occurred before impact:
- The Autopilot system was engaged on four separate occasions during the 32-minute trip, including a continuous operation for the last 18 minutes 55 seconds prior to the crash.
- During the 18-minute 55-second segment, the vehicle provided two visual alerts and one auditory alert for the driver to place his hands on the steering wheel. These alerts were made more than 15 minutes prior to the crash.
- During the 60 seconds prior to the crash, the driver’s hands were detected on the steering wheel on three separate occasions, for a total of 34 seconds; for the last 6 seconds prior to the crash, the vehicle did not detect the driver’s hands on the steering wheel.
- At 8 seconds prior to the crash, the Tesla was following a lead vehicle and was traveling about 65 mph.
- At 7 seconds prior to the crash, the Tesla began a left steering movement while following a lead vehicle.
- At 4 seconds prior to the crash, the Tesla was no longer following a lead vehicle.
- At 3 seconds prior to the crash and up to the time of impact with the crash attenuator, the Tesla’s speed increased from 62 to 70.8 mph, with no precrash braking or evasive steering movement detected.
The NTSB’s finding that Huang did not have his hands on the wheel for the final six seconds jibes with what Tesla released shortly after the crash. The same goes for the audio and visual warnings.
As this is just a preliminary report, the NTSB isn’t saying why the Tesla’s autosteer moved the vehicle from the HOV lane, where it had slowed to match the speed of the lead vehicle. There’s video evidence, posted to YouTube by another Tesla driver, that suggests the poor condition of the solid white line separating the HOV lane from the gore area may have confused the vehicle’s lane-keeping system.
Once the Tesla moved into the gore area and away from the leading vehicle, it seems the vehicle’s traffic-aware cruise control attempted to accelerate the vehicle back up to its preset speed.
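For illustration only, here is a minimal sketch of that resume-to-set-speed behavior, assuming a generic traffic-aware cruise controller that matches a detected lead vehicle’s speed and otherwise accelerates toward the driver’s preset speed. The names and structure are hypothetical, not Tesla’s actual code.

```python
# Hypothetical sketch of generic traffic-aware cruise control logic.
# Not Tesla's implementation; it only illustrates why losing the lead
# vehicle can cause a car to accelerate back toward its preset speed.

from typing import Optional


def target_speed_mph(set_speed: float, lead_speed: Optional[float]) -> float:
    """Return the speed the controller accelerates or brakes toward."""
    if lead_speed is not None:
        # A lead vehicle is detected: follow it, capped at the set speed.
        return min(lead_speed, set_speed)
    # No lead vehicle detected: resume the driver's preset speed.
    return set_speed


# Cruise set at 75 mph, following a lead car at about 62 mph:
print(target_speed_mph(75.0, 62.0))  # 62.0 -- holds behind the lead car
# Lead vehicle no longer detected:
print(target_speed_mph(75.0, None))  # 75.0 -- controller begins accelerating
```

That pattern is consistent with the NTSB timeline above: the lead vehicle was lost about four seconds before impact, and the car accelerated from 62 toward its 75 mph setting, reaching 70.8 mph at the attenuator.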
While bystanders removed Huang from the wreckage before his vehicle was consumed by flames, the car’s lithium-ion battery proved troublesome for firefighters long after the initial blaze was doused. Such batteries are highly volatile when breached, and extinguishing them often proves difficult.
Per the NTSB report, “Around 4:30 p.m. that afternoon, at the impound lot, the Tesla battery emanated smoke and audible venting. The battery was monitored with a thermal imaging camera, but no active fire operations were conducted. On March 28, 5 days after the crash, the battery reignited. The San Mateo Fire Department responded and extinguished the fire.”
That’s all we have to go on for now. “All aspects of the crash remain under investigation as the NTSB determines the probable cause,” the agency wrote, “with the intent of issuing safety recommendations to prevent similar crashes.”
[Image: Screencap, KGO-TV]

I believe the problem is more conceptual than technical.
Creating a system that is broadly understood to be 95% effective gives the human operator a false sense of security. The chance that the 5% of faulty operation will occur while the human driver is not engaged is simply too high.
Hard to argue with the certain reality of the “false sense of security” created by this only-partial autofunction. Although it may be possible that actual accident rates are still somewhat lower for this than with the typical texting/drinking/road-raging human driver at the wheel 100% of the time.
I was riding my bike this morning when I spotted a Tesla behind me. I was freaking out, made an evasive maneuver, and put a few cars in between us.
As for this guy, it is better that he is gone. Now he will no longer be able to kill anyone else along with himself.
“Freaking out”? “Evasive maneuver”? Good lord, would not a sane, safe lane change have been better? Difficult to see any advantage to hair-raising evasive action – as if the Terminator were after you.
Vehic1,
no shame in being a layman. Here is a quote straight from the motorcycle operator manual. The term “evasive” is used in motorcycle operator manuals all over the US.
“Provide a space cushion around the motorcycle that permits you to take evasive action”
https://www.msf-usa.org/downloads/mom_v16_color_hi_res.pdf
And who said that my “evasive maneuver” was not safe?
“with the intent of issuing safety recommendations to prevent similar crashes.”
Here are the safety recommendations:
1. To drivers: If you expect a Level 2 AV to behave like a Level 5 AV, you (and others) could die, and it will be your fault. So remain attentive.
2. To NHTSA: See to it that Levels 2 and 3 autonomy are banned.
+1
+1 and really point #2 you made needs to be law tomorrow. Unless it’s level 5, get it off the streets!!
You want to ban a technology that has proven able to reduce accidents and fatalities by 40%?
Where was that “proven”?
The “40%” claim comes solely from unreleased Tesla data, and even then it applies only to the Autosteer part of the system. The NHTSA has not made any determination on safety gains related to Tesla’s Autopilot or Autosteer.
autoblog.com/2018/05/03/nhtsa-disputes-tesla-safety-claim-autopilot/
insideevs.com/tesla-told-to-improve-autopilot-release-claimed-worlds-safest-data/
“A U.S. traffic safety regulator on Wednesday contradicted Tesla’s claim that the agency had found that its Autopilot technology significantly reduced crashes, saying that regulators ‘did not assess’ the system’s effectiveness in a 2017 report.”
That’s a far cry from disproved. Are you really staking your position on that autoblog post?
Who said anything about “disproved”?
My “position” is that the 40% reduction line you keep parroting isn’t proven at all. It is a claim being made by Tesla based on data that hasn’t been publicly released or verified by any outside sources.
So you’re basing your opinion on what statistical facts? Surely you’re not basing it on a few hysterical media reports…right?
The only opinion I have on this is that we should wait for independent verification before declaring Tesla’s 40% claim “proven”.
So…idiotic incidents like this are the price we have to pay for this (unproven) benefit?
Gotta break some eggs to make an omelet, right?
@jmo
please cite a non-Tesla source for the 40% reduction figure.
Thank you
These deaths won’t really resonate with the public until a child, or seemingly more important in today’s America…someone’s dog, is killed by one of these four-wheeled drones.
The data shows these cars are less likely to run over kids and dogs than the current batch of human drivers. What logic are you using to formulate your position on this technology?
The fact that a pedestrian has already been slaughtered, and that it’s only a matter of time before it happens again. Nobody has asked for this risk; it’s been foisted upon them by manufacturers and politicians looking to claim that they’re on the vanguard of technology.
What risk? The risk is smaller than the risk that some guy who is too busy texting runs you down.
At least the guy/gal texting can be prosecuted and held accountable. A myriad of sensors provided by a range of manufacturers, all controlled by software written by who knows whom, will just end up as a lawyer’s wet dream.
“What risk? The risk is less than some guy who is too busy texting runs you down.”
Fine. The texting guy that runs down a person will get banned from driving for 5 years.
How about banning Teslas from driving on public roads for 5 years for the errors made by their system?
You don’t need anything more than Volvo’s City Safety to accomplish this.
Jmo, here is my data: a Tesla crashes into a school bus and kills 5 kids… You’ll see them all taken off the road. Autopilot is not protected by the 2nd Amendment.
They resonate mostly in the sense that people don’t like to hear of others being killed. The issue is the 40,000 auto deaths per year in the US, and how do you reduce them?
Well, for starters, you can start paying attention to the road.
Unfortunately, Autopilot encourages people to do the exact opposite.
And since most folks are too busy texting as it is, there is no safety impact as far as we can tell. Indeed, safety improves.
Yes, folks are too busy texting…which Autopilot makes even easier.
@probert:
The problem with solving a social problem (40k road deaths) is that nobody wants to be the one to pay the price for it, unless every one does. Catalytic converters reduce pollution, and everyone pays.
But no family is willing to give up *their* loved one for the cause of reduced road deaths. Besides, everyone thinks they have total control over their fate on the road.
$5,000 per driver’s license with no subsidies for anyone.
Mandatory jail time of one year or more for driving without a license.
That should take enough people off the road to reduce deaths substantially.
And people derisively call Tesla the “rich man’s toy”.
With your rule in place, this car would still have crashed like this.
Well, that would preclude my daughter from getting a job to pay for college.
When I was a kid, like many of my Baby Boomer brethren, I wanted to be a test pilot. Chuck Yeager was a real badass. Now I don’t want to be a test pilot, which is what we’ve all essentially become, at least from a risk standpoint. Public roadways are not the place to test these experimental vehicles. Why doesn’t Musk build a test city, replete with functioning traffic lights and robotic pedestrians? He could test his cars and his fanbois could brag about the city he built, another testimonial in their attempt to beatify their Idol, Elonius Maximus.
I wish a Tesla would kill some important mafia guy. The mafia whacks Musk and some of his friends by running an excavator over his vehicle, and that sends a message to the rest of them. In fact, this should be a new movie scenario based on a true story.
Sounds like a pitch for a Columbo TV movie.
Columbo: “Well… I’m sure Mr. Musk felt the same way until the excavator hit him… Say, you got a light? I seem to have left mine in the car.”
These must be trying times for the Musk shills, who really only signed on to be suicidally stupid in front of people who were paying attention.
“On March 28, 5 days after the crash, the battery reignited. The San Mateo Fire Department responded and extinguished the fire.”
This could be the single greatest sentence ever written. The con man must be seething at that. Why anyone would put one of his POS and dangerous appliances in their garage is beyond me.
All Teslas should be taken off the road. They are a menace to society.
Can’t legislate stupidity away. If the problem is how it is marketed and advertised to the rich masses, hammer Tesla with truth-in-advertising laws. If it’s so bad, why are they allowed to sell them? Why don’t insurance companies refuse to insure them or put almighty rates on them? It’s not the technology; it’s the blind faith some human beings have.
I think the Cadillac-style system could help with the texting issue. When the driver takes their eyes off the road, it cuts the throttle (not the engine) so the car can then pull safely off the road. Each time, it doubles the amount of time before you get back on the road: 1st time, 30 seconds; 2nd time, a minute; 3rd time, 2 minutes. You get the idea.
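Very roughly, the escalation would look something like this (the base duration, names, and behavior are just my illustration of the idea, not Cadillac’s actual implementation):

```python
# Rough illustration of the doubling-lockout idea described above.
# The 30-second base and the function name are made up, not Cadillac's code.

def lockout_seconds(violation_count: int, base_seconds: int = 30) -> int:
    """How long the throttle stays cut after the Nth eyes-off-road strike.

    Doubles with each repeat: 30 s, 60 s, 120 s, ...
    """
    if violation_count < 1:
        return 0
    return base_seconds * 2 ** (violation_count - 1)


for n in range(1, 4):
    print(f"strike {n}: throttle cut for {lockout_seconds(n)} seconds")
# strike 1: throttle cut for 30 seconds
# strike 2: throttle cut for 60 seconds
# strike 3: throttle cut for 120 seconds
```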
Perhaps the guy was having a heart attack or a seizure. Makes sense – foot floors the accelerator while the hands come off the wheel.
I would never let a computer take control of my steering wheel at 60+ MPH near these types of hazards.
An automaker that cannot build quality products in a timely fashion, or within two years of its announced launch goals, should be the LAST company you trust with your life. Evidently they cannot even release a vehicle that can stop much better than a 1950s Nash.
The problem may be with Tesla, but their self-important owners are thankfully being culled because they know better than anyone else.
I don’t see a problem. If Tesla kills its stupid buying base, pretty soon no one will be left to buy one of their disasters.