
There’s no shortage of safety-minded autonomous technology on Tesla vehicles, but a video suggests some features could say “forget it” when asked to work.
YouTube user Kman recently posted a video showing real-world testing of the collision avoidance abilities of the Autopilot feature in a Tesla Model S 90D — tests that nearly got his friend splattered across the pavement.
In the video, first discovered by the EV website Electrek, the two friends test the vehicle’s low-speed Summon mode, as well as its Traffic Aware Cruise Control and Automatic Emergency Braking systems.
The Summon test shows the Model S’s sensors detecting a collision when a friend stands in front of the vehicle, and when he walks into its path. A frontal collision warning signal lights up in the vehicle’s gauge cluster and the Model S stops in time to prevent tears (and lawsuits).
Things get much hairier when the Model S has to avoid the same human while driving at the lowest possible speed for the Autopilot feature to work — 18 miles per hour. In these two tests, where the trusting friend jumps in front of the moving vehicle, the Tesla recognizes the roadway object but doesn’t do anything to avoid the collision.
In the first test, performed on a residential street, the system “failed to do anything but warn me, the driver, both audible and visually that I was going to collide with a (sic) object,” stated Kman. “The Collision avoidance system failed to stop the Tesla.”
The second test, on a more sparsely populated road, was worse:
The Collision Avoidance system while under autopilot waited even LONGER to alert me of a potential collision. I did have the distance setting at it’s (sic) maximum of 7, and gave the collision avoidance system as much opportunity to attempt to slow or stop the car yet again, yet again, the collision avoidance system came up shore (sic) on the Tesla S and only gave a (sic) Audible and Visual warning.
The video’s creator said he’s seen that particular vehicle’s system prevent collisions with other vehicles in the past, so he knows the feature works. Automatic emergency braking was added to Tesla vehicles last year.
There’s no shortage of media reports of Tesla vehicles avoiding accidents thanks to their collision avoidance features, just as there are many unsubstantiated claims that a driver’s accident was the result of the system not working. It’s impossible to say why the vehicle in the video didn’t brake during the last two tests, even after detecting the impending collision, but it’s a good reminder not to leave all the big decisions to your car’s electronics.
Eyes on the road and hands on the wheel.
Just a bit more proof that we’re (engineering-wise) in no way ready for autonomous cars.
Retitled “Future Darwin Award winner submits application video”.
Reminded me of this: https://www.youtube.com/watch?v=ZO9Kt9-MFgM
Timmys are disposable.
I’m not surprised at this. Remember that Elon has only landed a booster once out of several tries with a lot more software to help out.
SpaceX has successfully landed 4 boosters:
https://pbs.twimg.com/media/CkT0GpCUoAE1L0E.jpg:large
There’s more to it than “it works.” There are also many limitations due to how it works, external factors, etc.
What is that clicking noise? Is that a normal Tesla thing?
baby/kid on back seat maybe?
(and the Darwin award comment is spot on).
All testing, especially tests where someone might be killed, should be conducted with a small child in the car.
because exactly what would happen to a child in a child seat if he’d hit his friend at 18 mph ??????
Right. What could possibly go wrong ????
Aaarrrggghhhh!!!!!
I agree, there is a 100% chance nothing at all will happen to a child. The Gs he’ll experience won’t be any higher than going over a bump.
Testing with a small child in the car will flush out the sarcastically-impaired readership here in comment-land. That’s what will happen…
It sounded like the kid was operating an adding machine.
My grandma has one of those, which she prefers to a calculator. I think it may be the most well-made machine I have ever seen. It has to weigh 30 pounds.
It’s this one!
https://img1.etsystatic.com/013/0/7718043/il_570xN.459491017_kxws.jpg
Maybe it was an adding machine/giant calculator on the car salesman’s desk. You know, the one they use to madly mash random buttons to make it look like they’re writing down meaningful numbers on the “four square” worksheet. It’s part of the “car buying experience…”
Ziggy says there’s a 34% chance we’re going to splatter your best friend on the hood, and a 2% chance Musk adds an “autonomous pedestrian scraper” to the windshield of next-gen models.
Ask Ziggy why I haven’t leaped yet?
Was the driver of this car that self-learning “Promobot” in Russia (the one from the strange news, the one that got away and trundled down the city street)?
That looks like a GM parts bin steering wheel – one of the nicer ones but still.
Is that his vacation house he’s at? Because the houses across the street look like they cost less than a Model S.
I’m sure Tesla will blame the owner.
If the Tesla’s collision avoidance systems have any weaknesses, these two are not the ones I’d want to have designing the test cases that expose them.
There, I said it.
So you all seem to think there is a fault in the car’s software. Think again.
Don’t forget that Musk believes we are most likely living in a simulation; the car’s AI may be optimized to score the most points by killing pedestrians.
You call it a bug, Musk considers it a feature.
Hey, I used to be pretty good at the original Carmageddon video game…
It has become self-aware:
https://www.youtube.com/watch?v=AiaHrPwSG-Y
So the car can definitely ‘see’ a person standing still or walking out in front of it, but reacts differently in self-driving mode versus summon mode.
I can’t really judge beyond that, because I don’t know, for example:
– does the car sense whether a person is in the driver’s seat?
– or if the driver is touching the wheel? Driver’s eye position?
– how did he select the speed of 18mph, and would that affect the car’s response? What happens at 30 or 45mph?
– was he really not touching any controls during the approach?
Finally, his test method was extremely irresponsible, especially on the first go where he didn’t seem to have planned to ensure that the car and the ‘target’ didn’t both try to swerve in the same direction.
Does the car’s software account for different types of behavior and target maneuvering? Not all animals run straight across the road. Deer usually do that but bunnies usually run halfway across, quickly reverse, and run back the way they came (I think they are instinctively trying to get away from a predator). It took me a couple of messy bunnies to learn about that. And so far my only contact with a deer was when the deer ran into the side of my car… definitely his fault and not mine… and he bounced off and ran away. I strongly suspect that that deer was uninsured.
I wonder (because of your comment) what the size threshold for avoidance is. Squirrel? Bird? Cat?
I don’t want to get rear-ended because there was a stupid squirrel in the road. Just run it over.
It’s funny how the sum total of sensors that see much more than the human eye and ear (if you were a BSG fan, think of Cavill’s “I don’t want to be human” speech) still can’t seem to add up to the perceptiveness of even a fairly dumb human brain with only visual and aural input to work with.
You and I can instantly judge whether to run over a squirrel, dog etc or to swerve to avoid a human even if we know we’ll crash as a result. The computer working with sonar returns can’t really even be sure whether it’s a dog or a child or a person bending down to tie their shoe. I think it’s ultimately going to take AIs far more sophisticated than anything we have today to make self-driving cars a viable proposition.
“You and I can instantly judge whether to run over…”
Do Teslas dream of electric sheep?
I foresee new law firms forming up for a new specialty in “autonomous vehicle injury.” The TV commercials will be funny to watch.
Human drivers have to pass a driver’s test; autonomous systems should be held to minimum standards of operation.
As long as tests are independently administered that will work. If auto manufacturers are able to self certify their vehicles we will have robot-gate.
I don’t really understand: “Human drivers have to pass a driver’s test; autonomous systems should be held to minimum standards of operation.”
Are you trying to say that water is watery?
and this one: “As long as tests are independently administered that will work”
are you saying that butter is buttery?
Just wanted to add something really valuable: drivers must use mirrors so we don’t have mirror-gate.
Unless the tests cover all possible scenarios, known and unknown, it is the easiest thing in the world to game them. Humans can reasonably be expected to extrapolate when faced with unforeseen situations. AIs, not so much, at least not as of yet.
Let’s see, the guy’s pants look like they’re blended into the shadow of the road and his shirt looks like part of the background object… so there’s no one in front of the car.
This is why relying only on image recognition is dangerous for self-driving: if someone wears camouflage, you will run him over for sure.
That’s fine. Tesla Autopilot only brakes for handsome, well-dressed, high-net-worth individuals anyway, and the working class are the ones wearing camouflage in public.
I try to run over people wearing camo whenever possible, just to prove to them how effective their attire is.
I’m waiting for the first .460 Weatherby versus rogue Tesla videos.
I got suckered into touching off one of those once. Per my front-of-the-shoulder estimate, the recoil would be about equal to a Model S impacting at between 45 and 50 :)
Yeah, I’ll leave that to the bwana sahibs out there. Since even the angriest Tesla can only charge at 18 mph, I’ll rely on my feet and adrenaline.
Without a grille, the Tesla loses its menace. Did someone write it a script for Abilify?
I also foresee a new movie for the Star Trek series. It’s called “The Wrath of Elon.” The year is 2035… Tesla autos have become the de facto automobile and mode of transportation for all plebeians on a planet called Euphemism. After the success of SpaceX, Elon Musk conquered his first planet and is in sole control of planet Euphemism. All criticism of his automobiles is met with swift death, whereby the owner is run over, trapped in their car and driven off a cliff, or crashed into a tree. It is up to the crew of Star Trek to return order to planet Euphemism and stop the evil Elon from the genocide being wrought on planet Euphemism.