There’s a study you should read, and it delivers black eyes to both Tesla and the National Highway Traffic Safety Administration.
You probably remember the fatal crash of a Tesla in Mountain View, California last March, a crash that occurred as the victim’s car cruised along in Autopilot mode. Without warning, the vehicle steered itself out of its lane and struck a highway divider at high speed. Once again, the effectiveness and safety of Tesla’s Autopilot system came under scrutiny as Tesla scrambled to defend itself. The automaker pointed to the findings of a 2017 NHTSA report released in the wake of a fatal 2016 crash. That study claimed the automaker’s Autosteer system, introduced as part of the Autopilot suite of automated features, lowered Tesla crash rates by 40 percent.
Don’t believe everything you read, says R.A. Whitfield, director of Quality Control Systems. Whitfield filed a lawsuit and waited nearly two years to get to the bottom of that 40 percent figure.
As the NHTSA didn’t release the dataset behind the study, Whitfield requested it.
“Extraordinary claims ought to be backed by extraordinary evidence,” he told the Los Angeles Times.
Rebuffed, he filed a lawsuit under the Freedom of Information Act against the U.S. Department of Transportation. The data came into his possession in late November of last year.
After looking at the data, Whitfield discovered a serious problem with the methodology behind the controversial 40 percent figure. It has to do with the miles driven before and after Autosteer installation in data provided to the NHTSA by Tesla. The data covers a period from 2014 to 2016.
The report issued this month by Quality Control Systems is a long document (you can read it here), but it breaks down the problems in the federal road safety agency’s calculations. Information about the number of pre-Autosteer miles traveled by certain vehicles in the data pool is missing, and other vehicles carry only vague information about exactly when Autosteer came online.
The NHTSA’s determination, the Maryland firm claims, was made “by examining the sums of the miles driven prior to Autosteer activation, miles driven after Autosteer activation, airbag deployment events prior to Autosteer activation and airbag deployment events after Autosteer activation for all of the subject vehicles.” That’s a quote from an NHTSA investigator.
In the report’s preamble, the firm states that, based on the incomplete data, it “recognized that NHTSA’s summarization of ‘miles driven prior to Autosteer activation [and] miles driven after Autosteer activation’ might not actually include all of the miles driven before or after Autosteer activation.”
After breaking the data down into different groups of vehicles (based on quality of reported mileage), the firm slammed the NHTSA. “The Agency’s treatment of missing or unreported mileage data in its calculations of exposure mileage as though the mileage were non-existent is not justifiable,” it says in its report.
The vast number of vehicles with missing mileage “results in the inflation of the overall ‘before Autosteer’ airbag deployment crash rate reported by NHTSA, but to a degree that can’t be known with certainty,” the report concludes.
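The denominator problem the report describes can be sketched with a few lines of Python. The numbers below are entirely invented for illustration: when a vehicle’s pre-Autosteer mileage goes unreported and is treated as zero, its airbag deployments still count in the numerator while its miles vanish from the denominator, pushing the “before Autosteer” crash rate upward.

```python
# Each tuple: (pre-Autosteer miles, or None if unreported;
#              pre-Autosteer airbag deployments).
# All figures are made up for illustration only.
vehicles = [
    (10_000, 0),
    (None,   1),   # crash counted, but miles unreported
    (8_000,  0),
    (None,   1),   # crash counted, but miles unreported
    (12_000, 1),
]

def rate_treating_missing_as_zero(fleet):
    """Crash rate when unreported mileage is counted as zero miles."""
    miles = sum(m or 0 for m, _ in fleet)
    crashes = sum(c for _, c in fleet)
    return crashes / miles

def rate_complete_records_only(fleet):
    """Crash rate using only vehicles with fully reported mileage."""
    complete = [(m, c) for m, c in fleet if m is not None]
    miles = sum(m for m, _ in complete)
    crashes = sum(c for _, c in complete)
    return crashes / miles

# The missing-as-zero rate is strictly higher: same-or-more crashes
# spread over fewer miles.
print(rate_treating_missing_as_zero(vehicles))
print(rate_complete_records_only(vehicles))
```

With these toy numbers, treating missing mileage as zero triples the apparent “before” crash rate, which is the direction of bias the report alleges; the true magnitude, as the report stresses, cannot be recovered from the data NHTSA used.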
In other words, the 40 percent crash-reduction figure doesn’t have a leg to stand on. Given the incomplete data, exactly how much Autosteer increased or reduced the crash rate can’t be known. The report concludes with a warning for lax agencies and automakers who hope to coax the public into supposedly safe self-driving vehicles.
“A very substantial fraction of the public simply doesn’t trust autonomous driving technologies. Given the scarcity of scientifically reliable, publicly available data about the safety of these systems, why should they?”
[Image: Tesla]

You must bow at the altar of Elon! Blasphemer!
We need to build a loooong wall. And then put some people against it.
I suspect the end for Tesla will come at the hands of tort attorneys. The Elon Musk hubris won’t play well to juries.
Not until they claim Autopilot is Level 4 or 5 autonomy.
Until then, Tesla’s Level 2 system doesn’t even have to work because the driver accepts all responsibility for its engagement.
Personally, I don’t trust or want a self-driving option in my car.
Radar cruise and lane keep assist are both nice features. Honestly, they’re less necessary than they start to seem once you’ve had a car with them for a while (kind of like all new car features), but for someone who drives 16-hour days, with half of those hours spent with one’s head stuck in a cheese puff bag, both are very nice to have.
I read the original report by these guys… It looked to me like they cherry-picked the data, since they only ACCEPTED data from a total of 5,7xx cars and not the entire block.
Exactly. I thought the same thing. My question is why did this guy decide to go through all of that effort and expense to get that data? What was the motivation? The company seems to consist of just one guy. Who paid for the legal expenses? Follow the money.
You guys remind me of a Trumpcore comments page when some outlet lays bare another of his broken promises.
There are rumors that he was paid to do it. Whether or not that is true, or by whom, we may never know, but it does seem strange that he would single out a mere 5700 or so out of some hundred-thousand-plus Model S cars equipped with that first-gen Autopilot. His reasoning for such paring seems… specious.
Like I said above, our attorneys are waiting to take your call.
I do have one question about the stats myself. Autopilot is usually going to be engaged when conditions are good and disengaged when conditions are questionable or bad. In other words, there will be fewer accidents in good driving conditions than when things are dicey. You’d need geographic, time-of-day, and weather data to do a proper analysis. I’d say ignore both interpretations of those stats; there’s insufficient data to draw any real conclusions.
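The selection bias this comment describes is a classic confounding problem. A toy Python sketch (all numbers invented) shows how a raw pooled comparison can flatter Autopilot even when it is no safer within any given condition, simply because its miles concentrate where driving is easy:

```python
# Invented (miles, crashes) figures by road condition and driving mode.
# Autopilot miles skew heavily toward good conditions.
data = {
    "good": {"autopilot": (90_000, 9), "manual": (50_000, 5)},
    "bad":  {"autopilot": (10_000, 5), "manual": (50_000, 25)},
}

def rate(miles, crashes):
    """Crashes per mile."""
    return crashes / miles

# Within each condition the crash rates are identical...
for cond, modes in data.items():
    assert rate(*modes["autopilot"]) == rate(*modes["manual"])

# ...yet the pooled comparison makes Autopilot look roughly twice as
# safe, purely because its mileage concentrates in good conditions.
ap_miles = sum(m["autopilot"][0] for m in data.values())
ap_crashes = sum(m["autopilot"][1] for m in data.values())
mn_miles = sum(m["manual"][0] for m in data.values())
mn_crashes = sum(m["manual"][1] for m in data.values())
print(rate(ap_miles, ap_crashes))
print(rate(mn_miles, mn_crashes))
```

This is the same structure as Simpson’s paradox: without condition-level exposure data, a pooled before/after or with/without comparison can point in any direction.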
73.6% of all statistics are made up right? I thought pretty much every company did this, picking the data that proves whatever point they are trying to make. You can save 30% with product XYZ. Maybe… but I could save 100% by not buying it.
Coincidentally, I just finished watching Iron Man 2. Elon Musk has a cameo with Tony Stark – that’s cool.
I can start/stop/’rewind’ the movie whenever I want – very cool.
But my digital playback device locked up once during the movie (no response to user input from the remote control) and I had to reset it….
(What am I saying? You tell me what I’m saying.)
“What am I saying? You tell me what I’m saying.”
That you’re new at parables?
Lies, damn lies, and statistics!
Yup!
We are a long way from this technology being anything more than a clunky fail-safe for tired or distracted drivers…it cannot turn your commute into leisure time.
It is inexcusable that our regulators haven’t extended the strict framework for what is and isn’t allowed on our public roads to include autonomous technology…
Why do people want more government? The people involved can never be trusted with the power leftists want them to have. AVs suit their agendas, so we’re going to be told they’re ready whether or not they are and then the world will be diminished to the level of their actual capabilities.
Off topic. It doesn’t matter why, what matters is the result. Think about it.
As the commenter above aptly put it, lane keep assist is an aid for tired or distracted drivers, not an invitation to take your hands off the wheel. Tesla still gets three raspberries from me for sticking with the “Autopilot” brand name, but in fairness they’ve made it much clearer to drivers of late that they must keep their hands on the wheel and take primary responsibility for driving, or the car will scream at them and turn the feature off.