When is an accident not just an accident? When it involves a Tesla, according to Elon Musk. The electric automaker’s CEO took to Twitter to lambaste the media Monday night for reporting on the high-speed collision between a Tesla Model S and a stopped fire truck in Utah last Friday.
It’s true, a collision resulting in minor injuries usually only warrants a brief mention in local media, if that. However, context is key. When it’s revealed that Tesla’s semi-autonomous Autopilot system was activated at the time of the collision, sorry, that’s news.
On Monday, police in South Jordan, Utah, said the Model S had been under the control of Autopilot when it collided with the rear of a fire truck at a red light. We say “under control,” as the 28-year-old driver claims she was looking at her phone prior to the time of the impact.
The Model S collided with the truck at a speed of 60 mph. According to media reports, the crash occurred during daylight hours, with light rain falling. The driver was treated for a broken foot, while the occupants of the truck emerged unscathed.

Naturally, speculation arose shortly after the crash as to whether Autopilot was involved, and given recent incidents, including two fatalities on U.S. highways (plus one in China), that speculation wasn’t unwarranted. Once the driver revealed she had been using (and misusing) Autopilot, attention rightly turned to why the car’s semi-autonomous system did not attempt to avoid the collision. Witnesses claim the vehicle didn’t brake prior to impact.
“It’s super messed up that a Tesla crash resulting in a broken ankle is front page news and the ~40,000 people who died in US auto accidents alone in past year get almost no coverage,” tweeted Musk.
The CEO quickly added, “What’s actually amazing about this accident is that a Model S hit a fire truck at 60mph and the driver only broke an ankle. An impact at that speed usually results in severe injury or death.”
Many modern carmakers would disagree, as brakes, airbags, crumple zones, high-strength steel, and every other safety aid in existence are available to Tesla’s competitors, too. In a Twitter exchange with Techmeme, Musk responded to a posted report claiming he rejected the use of eye-tracking technology (used by Cadillac’s Super Cruise system to monitor driver awareness) in the interest of cost savings. Musk claimed he rejected the technology because it is ineffective.
“According to NHTSA, there was an automotive fatality every 86M miles in 2017 (~40,000 deaths),” he stated. “Tesla was every 320M miles. It’s not possible to be zero, but probability of fatality is much lower in a Tesla. We will be reporting updated safety numbers after each quarter.”
As Bozi Tatarevic quickly pointed out, the average age of a vehicle on U.S. roads is 11.6 years, which skews the stats further in Musk’s favor. Also, those NHTSA figures appear to include fatal collisions involving motorcycles.
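Musk’s mileage math is easy to check, for what it’s worth. Here’s a quick back-of-the-envelope sketch using only the figures quoted above, with Tatarevic’s caveats noted in the comments:

```python
# Sanity check on the rates quoted above (Musk's figures, taken at face value).
us_miles_per_fatality = 86e6       # one US road death per 86M miles (2017, per Musk)
tesla_miles_per_fatality = 320e6   # one death per 320M miles (per Musk)

ratio = tesla_miles_per_fatality / us_miles_per_fatality
print(f"Tesla's quoted rate is ~{ratio:.1f}x better per mile")  # ~3.7x

# Implied total US mileage: 40,000 deaths x 86M miles/death = ~3.4 trillion miles
print(f"Implied US vehicle miles traveled: {40_000 * 86e6:.2e}")

# Caveats (per Tatarevic): the comparison fleet averages 11.6 years old, and the
# NHTSA figure appears to include motorcycles, so new luxury sedans are being
# measured against everything on the road. Not apples to apples.
```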
Still, Tesla aficionados (to use a polite term) quickly rushed to Musk’s defense, both before and after the revelation of Autopilot involvement in the Utah crash.
It’s true that the company, after touting the self-driving capabilities of its Autopilot system in the feature’s infancy, has taken a more cautious tack in recent years. The company warns drivers to remain aware, to keep their hands on the wheel, and to be ready to respond at any given moment. Under these guidelines, the Utah driver was indeed driving in an unsafe manner. We don’t know how long her attention was diverted, nor what warnings she may have received from the vehicle.
But the question remains — why didn’t the car’s cameras and radar react to the approaching truck and activate the car’s automatic emergency braking system?
[Sources: NBC, Washington Post] [Images: Tesla, South Jordan Police Department via Associated Press]

Who cares that the car didn’t see the fire truck? THE DRIVER DIDN’T SEE THE TRUCK BECAUSE SHE WAS ON HER STUPID PHONE! Lock her up for chrissakes. Teach these idiots a lesson.
We should care very much, since noticing things like fire trucks, concrete walls, semis, etc. is well within what even far more rudimentary systems accomplish with ease. Discovering why that didn’t happen here is hardly an irrelevant question.
Your statement is like saying we shouldn’t worry about why an airbag failed to go off since the driver should have avoided an accident to begin with.
Not what I meant. If she wasn’t on her phone, she wouldn’t have needed the auto brake system that obviously didn’t engage.
But she can be on her phone. The system is called autopilot for a reason.
I REALLY hope you’re joking.
More accurately, any system labelled “Autopilot” will almost certainly lull drivers into a false sense of security that makes them believe they can be on their phones or otherwise distracted from the task of driving.
“I REALLY hope you’re joking.”
To a point. But the system is named autopilot. Is it unreasonable to expect that people will take that to mean the car drives itself?
The accident would have happened whether it was called “Autopilot” or something less benign, like “Cruise Control +.”
“ZURICH (Reuters) – Swiss firefighters said on Monday that the impact in a FATAL accident involving a Tesla electric car may have SET OFF A FIRE IN THE VEHICLE’S BATTERY.”
http://www.reuters.com/article/us-swiss-tesla-crash/tesla-crash-may-have-triggered-battery-fire-swiss-firefighters-idUSKCN1IF2WN
Being cremated in one’s Tesla saves on burial and funeral expenses.
Because gasoline cars never, ever catch fire in a collision. Or even without one.
Hey, leave Yugo out of this! I kid, I kid. They were known to catch fire at random while being driven, though.
Oh, and the girl should have her license suspended for a year for distracted driving resulting in an accident. Doesn’t matter if it’s Autopilot or Volvo’s tech or any of the smarter cruise control options out there. The driver is still responsible. Just like the pilots of AF447.
People are forgetting how many Fords burned their owners’ homes down. People are forgetting how many just lit up during the course of driving down the road.
ICEV fires are NOT uncommon and they certainly don’t need a crash to light up.
Reminds me of the crash we had on US-70 years ago. Guy in a conversion van is cruising on cruise control, gets up from behind the wheel, and walks toward the back of the van to get a drink and a snack.
The van gradually veers off the road and into the desert. The guy gets jostled and bruised, but learns a valuable lesson when the van is finally stopped by a gulch.
Reminds me of that classic WKRP episode…”God as my witness, I thought turkeys could fly.”
youtube.com/watch?v=lf3mgmEdfwg
Yes, I remember! That was a really funny episode from a long, long time ago, in a TV-universe far, far away.
GE and Westinghouse made the best fans in the world back in the day. Hell, I’ve got a couple of antique fans in the basement, wondering who wants them for their collection. Then the realities of manufacturing moved fan production to Asia. Dyson tried to stem the tide, but has failed in terms of numbers.
Musk wanted to change all that – bringing the best and brightest fan title back to the US. And so far he’s succeeding. His Internet Fans make more noise than Dyson ever quieted. Maybe he can hire Dyson to get him out of this mess. But I’m betting on a golden parachute from India or China, scooping up his bankrupt enterprise after the death crash.
“We say “under control,” as the 28-year-old driver claims she was looking at her phone prior to the time of the impact.”
Um like the car was supposed to drive itself. Like I was supposed to pay attention? As if!
“Um like the car was supposed to drive itself.”
It was on “autopilot”.
Which is 100% of the argument that a class action lawsuit for misleading and dangerous advertising should be filed against Tesla.
People should drive their cars, and are responsible. However, there is a parallel and separate problem when Tesla’s marketing videos and claims all say “Autopilot” and “autonomous” when the system isn’t even fully certified beyond Level 2 adaptive cruise control.
It’s obviously a result of saying in one breath “autopilot” and in another “drive attentively with hands on the wheel.” Which is why I don’t get why a suit hasn’t already been filed.
“Which is why I don’t get why a suit hasn’t already been filed.”
Probably because of the terms and conditions a user of the system has to agree to.
“If y’all crash and stuff y’all can’t sue Tesla LLC b/c Autopilot is not Auto or Pilot 4u. Click OK tho.”
-lawyers
Yep, and Elon Musk is happy as hell he has them right now.
With Level 2 autonomy, no lawyers are needed. The user presses a button agreeing to remain attentive, so they own it.
Level 2 should be banned.
Legally, they may be off the hook, but the morality of all this is a different story.
I don’t know how you live with selling a feature called “autopilot” that drives the car by itself, and then throwing up your hands and saying, “hey, I told you so” when drivers get into trouble. It’s like tobacco companies selling a toxic product, and then saying, “hey, you knew it was dangerous… do you want some more?”
Not my point, they advertise it as autonomous.
It’s a truth-in-advertising and deceptive-trade-practices suit, not a case about legal ownership of fault in the system.
Recall the “Red Bull gives you wings” settlement? One of those kind of suits.
Initially, airplane autopilot was level 2. Technically, airplane autopilot can still be merely a level 2. Tesla is level 2.
It turns out this specific car’s emergency braking system may have operated. Note how level the car is sitting and that the bumper is untouched. This could mean a nose-dive as it tried to brake before hitting the truck, at which point the actual impact speed may have been less than half the originally reported speed.
Again, not enough data to state anything for certain.
Uh, no. Ever look at the back of a fire truck? Their rear bumpers are most certainly NOT at the same level as they are on the average automobile. They’re MUCH higher, which is why the Tesla’s front end underrode the truck. My bet is the auto-braking feature did nothing.
Then explain why the car stopped at the front of the windshield, considering that there was almost no effective mass available to stop the car before beheading the driver.
And don’t go blaming the truck’s axle… again, there’s no damage to the bumper of the car. Had it hit the axle, there’d be obvious damage.
This latest crash happened about 3 miles from my house. The takeaway from this crash should also include that the driver only has a broken ankle from a 60 mph crash into a parked truck. Do that in a Toyota 4Runner or Corolla and that driver would probably have a broken pelvis and a head injury.
If they were in a Volvo 240 the fire truck would have a volvo shaped hole through the middle.
Very well-made vehicles, the 240s. I owned one and had a run-in with a 1980s-vintage Oldsmobile 98. The 240-shaped hole is sort of what happened.
Well, more like the engine in the Olds became a front passenger seat occupant, and I drove my 240 home after the crash, bought some new tail lights, and drove it for another year without any other repair.
I’m convinced Volvo was supplied by aliens in the ’80s. It’s the only explanation for how tough the steel is on those things.
I remember the commercial where they drove off the second (or third) floor of a parking ramp.
Any modern sedan in the Tesla’s price class provides at least as much crash survivability as a Tesla S does. Have you looked at crash test data for a Mercedes S-class lately?
S-class is more like a tank.
Have you looked at the crash test and pics of the 2018 4Runner? The door frame didn’t even hold up. The dash is practically in the front seat.
http://www.iihs.org/iihs/ratings/vehicle/v/toyota/4runner-4-door-suv/2018
We are only going on the word of the driver; they haven’t fully examined the car yet. If it was the autopilot system that failed, people should get used to it. Computer software and hardware are hardly failure-proof. The more complex the system, the higher the rate of failure.
I still use a slide phone and that thing works right like maybe 20% of the time. Electronics, as cool as they are, sure aren’t always reliable.
“why didn’t the car’s cameras and radar react to the approaching truck and activate the car’s automatic emergency braking system?”
Why didn’t autopilot see the truck and apply the brakes?
There. I just saved an internet tree.
Even better question: why didn’t the driver of the vehicle realize that her car really can’t drive itself, and pay attention?
Because she’s an idiot. Plain and simple.
Of course she’s an idiot… but the other piece of this is that she was sold on the idea that she could BE an idiot and get away with it.
I’d say it takes two to tango idiotically here.
That dance looks like a dry heave set to music.
I was actually just rewriting an overly wordy/complicated sentence.
But, yeah. Why anyone, with all that has been in the news, would let the car drive for them while they tend to other matters is beyond me.
Automatic emergency braking is something different from Autopilot. It is supposed to be active at all times and send the car into an emergency stop when an imminent collision is detected. This is the same feature you now get on Civics and Imprezas.
The emergency braking system not seeing a fire truck is a massive failure. It may not have been able to avoid a crash at 60 mph vs. a stationary truck, but it should have applied the brakes and mitigated the impact severity.
“The emergency braking system not seeing a fire truck is a massive failure. It may not have been able to avoid a crash at 60 mph vs. a stationary truck, but it should have applied the brakes and mitigated the impact severity.”
— If you look at the car, it seems to have done exactly that. Please note that the bumper is not deformed and the car is sitting level. At 60 mph it would have taken much, much more damage, and the “driver” might have been decapitated.
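For readers wondering what “seeing” the truck actually involves: forward-collision systems generally work off time-to-collision (TTC) thresholds. Here’s a minimal sketch of that logic; the thresholds are invented for illustration and are not Tesla’s (or anyone’s) real calibration:

```python
# Minimal sketch of forward-collision mitigation logic (illustrative thresholds only).

def time_to_collision(gap_m: float, closing_speed_mps: float) -> float:
    """Seconds until impact if neither vehicle changes speed."""
    if closing_speed_mps <= 0:
        return float("inf")  # not closing on the target
    return gap_m / closing_speed_mps

def aeb_command(gap_m: float, own_speed_mps: float, target_speed_mps: float) -> str:
    ttc = time_to_collision(gap_m, own_speed_mps - target_speed_mps)
    if ttc < 0.6:
        return "full braking"       # crash likely unavoidable; shed as much speed as possible
    if ttc < 1.4:
        return "partial braking"
    if ttc < 2.5:
        return "forward collision warning"
    return "no action"

# A stopped fire truck 50 m ahead, closing at 60 mph (~26.8 m/s):
print(aeb_command(50.0, 26.8, 0.0))  # TTC ~1.9 s -> "forward collision warning"
```

The hard part, of course, isn’t the braking logic; it’s reliably classifying a stationary object as a threat rather than as radar clutter, which is exactly where these systems tend to stumble.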
Because she was told she didn’t have to. By Tesla.
Maybe she was travelling at 100 mph and autobraking did engage and got her to 60. Or maybe the impact was at 20 mph, not 60 as written, due to autobraking engaging. Facts from the data recorder would be nice to have. Doesn’t look like 60 mph to me – looks like “brake diving.”
Thumbs up. The fact that the car’s bumper is untouched and the bumper step of most fire trucks is at automotive bumper level (which is where ALL truck bumpers belong) strongly suggests brake diving.
I was next to a fire truck this afternoon. The back bumper is about knee high. There was a helper step that was lower but very flimsy.
I think the bumper underrode the fire truck, and substantial parts of the chassis, like the strut towers, are what stopped that Tesla.
It seems pretty clear to me that either truck bumpers are too high or car bumpers are too low. As I recall, there is a Federal mandate that truck and car bumpers all be at the same height and since it’s difficult to raise a car’s bumper, truck bumpers need to be lowered. Yes?
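To put rough numbers on the braking speculation above: assuming hard braking at about 0.9 g (a typical dry-pavement figure; the light rain reported here would lower it), the standard kinematics relation v² = v₀² − 2ad gives the speed remaining after a given braking distance:

```python
import math

MPS_PER_MPH = 0.44704
G = 9.81  # m/s^2

def impact_speed_mph(v0_mph: float, brake_distance_m: float, decel_g: float = 0.9) -> float:
    """Speed left after braking over a distance, via v^2 = v0^2 - 2*a*d."""
    v0 = v0_mph * MPS_PER_MPH
    v_squared = v0**2 - 2 * decel_g * G * brake_distance_m
    return math.sqrt(max(v_squared, 0.0)) / MPS_PER_MPH

# Starting from 60 mph, how much speed does full braking shed?
for d in (10, 20, 36):
    print(f"{d:>2} m of braking -> impact at ~{impact_speed_mph(60, d):.0f} mph")
# 10 m -> ~52 mph, 20 m -> ~43 mph, 36 m -> ~20 mph
```

So a 20 mph impact, as speculated above, would require roughly 36 metres of full braking before contact; the data recorder should settle whether the car got any of that.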
And while we’re at it, “why didn’t the car’s cameras and radar react to the approaching truck”? The Tesla was doing the approaching; the fire truck was parked. :P
Musk always wants it both ways.
1) Tesla technology avoids a collision: Thank you Tesla.
2) Tesla technology crashes a car: Driver error.
This.
This?
With Level 2 autonomy, it *is* both ways. It doesn’t actually have to work.
Weird. Con man Musk is trying to defend his “AUTO pilot” system again after a colossal failure.
Tesla needs to be shut down. They build low quality garbage.
Don’t hold back; tell us how you really feel.
I have to hold back. I can’t use the language I’d like to use to describe Musk.
What, the guy dumped your sister or something?
Seriously, I don’t get the hate.
My feelings on “autonomous driving” and its limitations are well documented, and I won’t restate them here.
But at some point, you have to wonder how many mishaps the users of these systems have to be exposed to before they learn you can’t just screw around on your phone while you’re driving. Will it take some idiot ramming a school bus full of kids, and doing time and/or getting sued for millions of dollars, to make people finally pay attention?
Far as Musk is concerned, I’d say it’s time for him to stop trying to deflect blame, and start imploring the people who drive his vehicles to pay attention to the f*cking road.
You say your feelings are documented. Is there a link you could post?
Take a look at the other posts on Tesla autopilot on this site. Safe to say I’m not sold on the tech.
@FreedMike: Looks like the crappy AP1 single-camera system. I’m just not a fan of single-camera setups. I’m even using multiples/stereo on my FLIR sensors.
If she really hit the truck at 60 mph, she benefited from the fact that the Tesla underrode the truck’s rear end, reducing the deceleration of the occupant compartment to survivable levels.
I don’t think there’s a vehicle sold today that could protect a driver or front passenger from death in a 60 mph head-on crash into a solid barrier. (And no, not the old Volvo 240 either.)
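The survivability point comes down to one formula: the cabin’s average deceleration is v²/2d, where d is the distance over which it stops. A rough illustration; the stopping distances here are guesses purely for the sake of the math:

```python
MPS_PER_MPH = 0.44704
G = 9.81  # m/s^2

def avg_decel_g(speed_mph: float, stop_distance_m: float) -> float:
    """Average deceleration, in g, to stop from a speed over a distance (a = v^2 / 2d)."""
    v = speed_mph * MPS_PER_MPH
    return v**2 / (2 * stop_distance_m) / G

# Rigid barrier: the cabin stops in roughly the front end's crush depth (~0.7 m assumed)
print(f"60 mph into a wall:        ~{avg_decel_g(60, 0.7):.0f} g")  # ~52 g
# Underride: the car sheds speed over several metres of ride-down (~3 m assumed)
print(f"60 mph with 3 m ride-down: ~{avg_decel_g(60, 3.0):.0f} g")  # ~12 g
```

Stretching the stopping distance is the whole game, which is why underriding the truck, for all its gruesome intrusion risk, can paradoxically soften the deceleration the occupant feels.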
Sadly, there was a fatal accident just down the highway from me the other day, where a car crossed the line and hit an SUV traveling in the other direction. Speeds were probably 55-60+. A tale of two vehicles:
“According to Aaron, the investigation shows that Burgin was driving a 2009 Hyundai Accent northwest on Highway 96 at 4:20 a.m. when, for reasons undetermined, the car crossed over the double yellow line and collided with an oncoming 2017 Toyota Rav4 SUV. The Hyundai rolled over and caught fire. Burgin died at the scene.
“The driver of the Toyota, Ashley Lewis, 26, of Fairview, was seriously injured and was taken to Vanderbilt University Medical Center for treatment.”
https://franklinhomepage.com/victim-identified-in-friday-pre-dawn-crash-north-of-the-natchez-trace-bridge/
That’s the classic big vs. little collision, the post-crash rollover and fire aside. Little almost always loses. The RAV4 driver didn’t experience a 55-60 mph delta-V; probably more like 35-40 mph, considering how tiny an Accent of that generation was.
Agreed.
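The delta-V estimate is straightforward momentum conservation: in a roughly plastic head-on collision, each car’s velocity change is the closing speed scaled by the other car’s share of the combined mass. A sketch with assumed curb weights (both figures are rough guesses, not spec-sheet numbers):

```python
def delta_v_mph(m_self_kg: float, m_other_kg: float, closing_speed_mph: float) -> float:
    """Delta-V of 'self' in a perfectly plastic head-on collision (momentum conservation)."""
    return closing_speed_mph * m_other_kg / (m_self_kg + m_other_kg)

ACCENT_KG, RAV4_KG = 1_100, 1_600   # rough curb weights (assumptions)
closing_mph = 100                    # assumed closing speed, if some speed was scrubbed

print(f"RAV4 delta-V:   ~{delta_v_mph(RAV4_KG, ACCENT_KG, closing_mph):.0f} mph")  # ~41
print(f"Accent delta-V: ~{delta_v_mph(ACCENT_KG, RAV4_KG, closing_mph):.0f} mph")  # ~59
```

The lighter car always takes the bigger velocity change, which is the point above: little almost always loses.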
The posted speed might be near 60, but that’s not a 60 mph collision. I’d say maybe 25, since the upper layer of the hood took all the impact while the car slid underneath.
Remember this: the cost of progress is always a few human lives here and there. If it’s yours… oops! Sorry! (Well… not really sorry.) If it’s theirs, then something might be done. This whole autopilot thing is literally… outta control!
I’m no fan of government intervention, but I actually think they need to order the disablement of this feature. Consider the combination: technology that’s not quite ready; infrastructure designed (and neglected, in terms of maintenance) for human drivers, not autopilot; and a lack of driver education around what the system is and what it isn’t. I know he wants to change the world and all, but the risk is too high to be selling these things commercially. You need roads made for it, better tech, and specific driver training paired with an education on liability. None of that has happened. The lowest common denominator unfortunately applies here: if the dumbest guy on the road can’t handle it, then it’s not ready.
Whether you argue for government intervention or against it, there’s no arguing that when a company makes false promises, and in so doing endangers the public, the government has every reason to intervene.
I’m hoping they do so before some idiot rams himself into a bus loaded with kids.
Agreed on this, but not just for Tesla. Level 2 and 3 autonomy should be banned.
“Being cremated in one’s Tesla saves on burial and funeral expenses.”
And with 7 fatalities in 14 fires, you indeed have a 50% chance of that happening to you. In ICE cars that number is… wait, 0.25%.
PT Barnum-Musk just loves stats.
Oh, yeah – we need to go back to human-controlled horses-and-buggies ASAP! With humans doing all the driving, everyone’s safety is guaranteed – what could go wrong?
Yeah, unlimited range too. Just stop at the side of the road and let the horse(s) chow down. No range anxiety. Also, level 5 autonomous capability built right in – if you’re headed home.
That would be level 4. :)
You know, he turns into like a teen girl sometimes on Twitter or whatever.
“That’s super messed up!”
On Wednesdays we wear pink.
“But the question remains — why didn’t the car’s cameras and radar react to the approaching truck and activate the car’s automatic emergency braking system?”
— Honestly, that’s the only question here that should involve Tesla. The driver clearly admitted she was at fault for Distracted Driving.
Isn’t this the second instance of a Tesla on autopilot ramming into the back of a fire truck specifically? Something similar happened here in LA on the 405 recently. It’s like Autopilot is blind to big red fire trucks.
There are TONS of Teslas here in LA. Not gonna lie, whether it’s justified or not, I’m extra, EXTRA cautious around all Teslas on the freeway, especially in heavy-but-moving-at-full-speed traffic.
It seems a bit irresponsible for the car to be operable in Autopilot mode when traveling 60 mph through a traffic-light-controlled intersection. If Autopilot can’t reliably detect fire trucks and tractor-trailer rigs in its path, it shouldn’t be operable on non-restricted-access roads at all.
Q: “…attention rightly turned to why the car’s semi-autonomous system did not attempt to avoid the collision”
A: It should have, but it doesn’t have to, since it’s only a Level 2 system.
If mfrs are so eager to push autonomy on us, they should be focusing only on Levels 4 and 5, while the government bans Levels 2 and 3 from the roads.
Then, when a Level 4 or 5 car crashes into a firetruck, deer, or a pedestrian, only the mfr is to blame. Unless another human driver hits you.
“Then, when a Level 4 or 5 car crashes into a firetruck, deer, or a pedestrian, only the mfr is to blame.”
I think you just hit on the reason why Levels 4 and 5 aren’t what the purveyors of this tech want – it puts them on the hook.
Would a new Volvo run into a stopped fire truck at 60? I would assume its autonomous braking system would slam on the brakes and at least reduce the impact speed. Does Tesla not have anything like this in their cars? This is the second recent accident where a Tesla rammed something stationary without slowing down.
It has a similar feature, at least on paper. In reality it appears to not function very well.
That ought to be where Tesla is focusing its attention. Doesn’t matter how far the car can go, how fast the car can go – or really how well it steers (just tell the driver to assume control b/c the computer can’t cope).
The automatic brakes ought to be the best in the business before the autonomous steering is advertised.
Counting the Swiss accident, two more lives lost, for a total of five.
The Traffic Sign Recognition system on a new Accord I test drove less than two weeks ago failed to “see” two speed limit sign changes in a row as I accelerated from the city onto a four-laner. It had the 50 km/h from the city, but missed the change to 70 and then to 90 before finally showing 100 when the limit changed for the last time. I mentioned it to the salesman well before that, while in the 90 km/h zone, and showed him the displayed 50. Shrugs: “Lots of traffic, maybe they blocked the view.”
I was familiar with the changing speed regime, as I travel the roadway frequently. No doubt the radar cruise control, if it had been engaged, would draw its speed limit input from the TSR system, and you’d perhaps have had a real loiterer among the sharks heading out of town.
Interesting feature I had not heard of.
So where’s the public outrage? So far these crash test dummies have only done damage to themselves plus others’ property. Teslas have it in for blunt objects, but once unmanned Teslas turn on innocents, that’ll change.
I don’t understand why crashing while on “Autopilot” (when it’s clearly your fault) isn’t considered a crime equal to a DUI. Isn’t it just as irresponsible and reckless?
Good question, but I’d put this on the same level as playing on your phone – a pretty serious ticket, but not DUI serious.
A “drunk driver” can go to jail (charged up to a felony) for their danger/harm to the public, but an Autopilot driver sending a totally unmanned car down the road is just a simple infraction at worst?
A texting/crashing driver (sober; no Autopilot, autonomy, Super Cruise, etc.) can argue it was a momentary lapse in judgment, no different than checking out a hot babe or a supercar for too long, or fumbling with a CD or a drink, and that they never climbed into the car and hit the start button with malice or mayhem in mind.
If an Autopilot user is unaware of the dangers of totally “checking out” while driving, that’s not a true defense.
Distracted driving charges are gaining teeth.
I’m no attorney, but I’d say there’s a big difference between not paying attention as much as you should, and drinking so much that you’re literally unfit to drive.
But, yeah, I’d like to see stronger penalties for distracted driving – not DUI-level, but stronger.
Jerk their license for 30 days. That’ll get their attention better than some fine. Better yet, impound the vehicle for 30 days.
Sure, that’ll leave a bunch of formerly distracted people in a bind but then, that’s the idea.
You may as well send a large missile down the highway. Same danger, same results during an impact.
Does anyone know if this was the old single-camera system, or the year of the car? I’ve been looking at the photos, but can’t figure it out one way or another.
With the older version of “autopilot”, I wouldn’t be surprised at the collision. I’m curious to see if it’s the new version.
Taking a closer look at the images, I’m thinking version 1, but I could be wrong. Not a fan of single camera systems.
“The driver was treated for a broken foot, while the paint on the truck remained unscathed.”
Fixed that for you.