July 30, 2016

2016 Tesla Model S

Tesla CEO Elon Musk’s drive to develop and market new driving technology is well known, but former employees say he brushed aside their concerns about the safety of the company’s Autopilot system.

Several employees, including a former Autopilot engineer, told CNNMoney that their concerns fell on deaf ears, as Musk always fell back on a “bigger picture” position on safety.

The automaker’s semi-autonomous driving system came under scrutiny in the wake of a fatal May crash. Musk claims that although the Autopilot didn’t recognize a transport truck in that case, the system makes roads safer. He’s pledged to do more to educate owners on how to properly use Autopilot, but has no plans to stop offering the system.

Musk told the Wall Street Journal “we knew we had a system that on balance would save lives.”

Speaking to CNNMoney, ex-Autopilot engineer Eric Meadows claims he was pulled over by police in 2015 while testing Autopilot on a Los Angeles highway, a few months before the system’s release. The Tesla had difficulty handling turns, and the police suspected him of being intoxicated.

Meadows was later fired for performance reasons, but he claims his worries about Autopilot’s safety — especially the possibility that owners would “push the limits” of the technology — grew over time.

“The last two months I was scared someone was going to die,” he said.

The report mentions a former Tesla executive who worked closely with Musk, and claims the CEO was frequently at loggerheads with “overly cautious” employees. Tesla’s self-parking feature went ahead as planned, another source claims, despite worries that sensors wouldn’t function properly if the vehicle was near the edge of a steep slope. Again, the greater good of preventing driveway deaths overruled these concerns.

The employee mix at Tesla falls into two categories — younger, data-driven employees and seasoned automotive industry types. The report cites multiple sources who claim that data is the guiding factor in Tesla’s decisions, meaning slight risk is allowed if it means a greater potential for overall safety.

While this bothers some engineers and consumer safety groups, even the agency investigating the May crash sides with Musk’s views on safety. Recently, National Highway Traffic Safety Administration administrator Mark Rosekind said the industry “cannot wait for perfect” when it comes to marketing potentially life-saving autonomous technology.

[Image: Tesla Motors]


145 Comments on “Musk Pushed Back Against Tesla Employees’ Autopilot Concerns: Report...”


  • avatar

    BMW, GM, Toyota, etc, aren’t stupid. If they could get this to work for the lowest common denominator customer they would. I think we are one full generation away from this working…compare your flip phone to the smart phone. Tesla is the Brick Phone….it will get better over time.

    I love tech, but still would not trust it with my life in this situation, yet….

    • 0 avatar
      Russycle

      A few years ago you could argue that if it were possible to build a battery-powered performance sedan BMW, GM, or Toyota would have done it. And you would have been wrong. I’m not saying Auto Pilot is great, but the fact that other companies haven’t implemented it doesn’t mean it can’t work.

      • 0 avatar
        derekson

        “A few years ago you could argue that if it were possible to build a battery-powered performance sedan BMW, GM, or Toyota would have done it.”

        The argument is: If it were possible to build a BEV performance sedan AT A PROFIT, one of those manufacturers would have done it. And that remains true.

      • 0 avatar
        EBFlex

        “A few years ago you could argue that if it were possible to build a battery-powered performance sedan BMW, GM, or Toyota would have done it.”

        No, that is not even remotely analogous.

        • 0 avatar
          pragmatist

That’s exactly the point. Tesla is not profitable and would be even much worse off without corporate welfare.

There is no magic in a high-performance electric; one-offs have been tearing up drag strips for years. Tesla took government money, a strong aura of mystique, and an attractive design (at a time when most electrics were butt ugly) and convinced a lot of wealthy people that this was a status symbol (like many other high-priced boutique products).

          That’s not bad, simply marketing. But there is nothing technologically ground breaking. Musk in no way violated the laws of physics or even current levels of technology.

          What GM, BMW, Toyota know, however, is that the legal climate for automobiles is very different from cellphones. As someone deeply involved in the computer world myself, I’m all for driver assist safety technology, but computers are not ready for prime time when it comes to taking over all functions of driving.

          • 0 avatar
            VoGo

            Tesla took much less money than other domestic automakers, and paid it all back, early.
            Where’s our $10B from GM?

          • 0 avatar
            probert

What most car companies know is that the “legal climate” accepts that in the US alone, between 30,000 and 40,000 automobile-related deaths per year is acceptable.

            Tesla is the largest selling luxury car in Europe, and very successful in the US. With millions and millions of on road miles, it has proven to be one of the safest cars in the world.

            That Tesla is unique, and far ahead of the curve, is the very reason one related death, sad as it is, becomes newsworthy.

          • 0 avatar
            Pch101

            Tesla loses money. Not exactly successful.

          • 0 avatar
            JimZ

            “Tesla loses money. Not exactly successful.”

            they don’t lose money. they re-invest their profits into R&D.

            ;)

      • 0 avatar
        tedward

Russycle

The problem, I think, is that all companies have very similar systems. Tesla was simply the boldest in its advertised claims and set theirs up to encourage hands-free driving (hands off the wheel for minutes at a time when allowed). None of the systems, including Tesla’s, are ready for true hands-free operation.

        I find it interesting that the article implies the influence of younger, more data driven and assuredly risk tolerant, employees were the driving force behind the decisions. I can see how mr. Musk would be open to that influence given his resume and ambitions, but I think he may have little concept of the scale of liabilities and consequences when it comes to the distribution of personal battering-ram like devices. Making street cars might best be left to the old fogies.

        • 0 avatar
          TonyJZX

          You know what? I’m ok with this. Sometimes you need to push forward your dream or you would be stuck with everyone else burning gasoline and driving yourself.

          If the technology costs some people dying while watching Harry Potter while speeding then yeah… I salute you sir for dying for the betterment of all of us. Notice all the ‘ex employees’?

          All I’m seeing here is a bunch of faceless suits slash labcoats saying why we cant and Musk saying why not? And they got fired for it which is what they deserve.

          • 0 avatar
            Kenmore

            “Sometimes you need to push forward your dream…”

            A famous farty guy expressed similar sentiments to his timid generals in 1939.

          • 0 avatar
            Carrera

“Dying for the betterment of others” is a personal choice, but when one of these missiles takes out a minivan because the driver of the Tesla was watching Harry Potter or whatever, that I am not a fan of. We already have plenty of idiots on the road already. We don’t need imperfect technology to help them.

          • 0 avatar
            WheelMcCoy

            @Kenmore – “A famous farty guy expressed similar sentiments to his timid generals in 1939.”

            You mean he was pushing his farts forward? I’d be timid too. Have you smelled them?

          • 0 avatar
            WheelMcCoy

            @TonyJZX – “All I’m seeing here is a bunch of faceless suits slash labcoats saying why we cant and Musk saying why not? And they got fired for it which is what they deserve.”

            I don’t think that’s what the labcoats are saying. They’re saying it can be done, but not in the given time frame, using current technology, and within the existing budget.

            I hope Musk is not pulling a VW. If he is, then he would come under investigation, and that’s what he would deserve.

          • 0 avatar
            WheelMcCoy

            @Carrera – “We already have plenty of idiots on the road already. We don’t need imperfect technology to help them.”

            Tesla already has “Ludicrous” driving mode. Maybe they need to rename Auto-Pilot to “Idiot” driving mode.

            It might shame people into using it carefully. Then again, people might embrace it the same way they have with the “Dummy” and “Idiot” series books.

          • 0 avatar
            JimC2

            “Tesla already has “Ludicrous” driving mode.”

            They’ve gone to plaid!!

          • 0 avatar
            mchan1

            “If the technology costs some people dying while watching Harry Potter while speeding then yeah… I salute you sir for dying for the betterment of all of us. Notice all the ‘ex employees’?

            All I’m seeing here is a bunch of faceless suits slash labcoats saying why we cant and Musk saying why not? And they got fired for it which is what they “deserve”.”

            What arrogance!
Are you a millennial?! Sounds like it!

            People don’t ‘deserve’ to DIE to fulfill some auto person’s (whose company isn’t profitable and is existing on government funds!) view that auto pilot cars are worth it in the end, despite how many human lives may be ‘sacrificed’.

You better hope that none of your relatives or friends wind up in an accident because of any Tesla autos, you arrogant pr#ck!

          • 0 avatar
            Pch101

            Neither the laptop nor a DVD player also found in the vehicle was running after the crash, said Sergeant Kim Montes of the Florida Highway Patrol. Montes said investigators could not determine whether the driver was operating either at the time of the accident.

            The car was equipped with a computer stand, but the laptop was not mounted on the stand when investigators recovered the laptop, Montes said.

            http://uk.reuters.com/article/us-tesla-autopilot-idUKKCN0ZN1XX
            __________________

            Not mentioned in that article was an eyewitness who claimed that no video was playing at the time of the crash.

            It would be helpful to remember that the Harry Potter accusation was made by the trucker who made the illegal left turn that caused the crash and killed a man. Not exactly a credible source — first he kills a guy, then he tries to blame the victim.

          • 0 avatar
            DenverMike

            A DVD player was found in the car. Facts on this depend on the source. Except it’s very unlikely the trucker would fabricate a complete lie, right down to a specific movie.

            But it’s irrelevant what movie the Tesla driver was watching or not watching. His attention definitely wasn’t on the road.

            autonews.com/article/20160701/OEM11/160709985/dvd-player-found-in-tesla-car-in-florida-crash-authorities-say

          • 0 avatar
            Pch101

            The facts are in the crash report.

            There was plenty of rumor in the press prior to the crash investigation. You might want to notice that the rumors reported on July 1 were subsequently addressed on July 7 by the Florida Highway Patrol.

            The Harry Potter rumors were started by the truck driver. The same truck driver who failed to yield as required by Florida law and who then subsequently killed a man because of his illegal turn.

            It’s funny how you like this guy — you know, the one who killed somebody — so much. What is it about his driving style that gives you this emotional connection?

          • 0 avatar
            DenverMike

You can’t kill someone who committed suicide. It’s no different than if a walking Tesla dude had dove from the curb into the path of a truck. Pedestrians always have the right-of-way, so any vehicle that hits any pedestrian failed to yield. That doesn’t make it manslaughter. And as you know, no charges are being filed against the trucker.

Except you’re not even clear on what an “illegal turn” is. An example of an illegal turn is running a red left-turn arrow.

When the execution of a turn that’s clearly legal to make happens to be in question, that doesn’t make it an illegal turn.

          • 0 avatar
            DenverMike

Yes, it’s a rumor that the movie Harry Potter was playing in the DVD player moments after the crash. The FHP didn’t see that happening when they arrived, but that hardly means it didn’t happen. By the time the FHP arrived, the movie could’ve finished playing, or the car was out of power.

            Either way, it’s a bold accusation by the trucker, that he knew could be checked out.

            The FHP could’ve easily dispelled the rumor by checking what movie, if any, was in the player. But it’s not their job to do so.

If there was a Harry Potter disc in the DVD player, but it wasn’t playing, the odds of the trucker accurately guessing what movie was in there are astronomical.

          • 0 avatar
            Pch101

            So when you violate the law by failing to yield, cutting off cars and causing crashes, you blame the other guy by claiming that he was committing suicide.

            You spew the sort of rhetoric that I would expect from a sociopath.

          • 0 avatar
            DenverMike

There’s no violation of the law when it’s legal to make a left turn, provided it’s safe to do so. It’s yet to be determined whether it was unsafe for the trucker to turn, in regard to where, or how far back, the Tesla was at the start of the turn, or how the Tesla’s exceeding the speed limit contributed to or caused the crash.

          • 0 avatar
            DenverMike

The trucker must be guilty of something other than simply turning left. The overly simplistic FHP crash report only states who had the initial right-of-way, without investigating whether the Tesla driver contributed to or caused the crash. Thanks for nothing.

            We can look beyond the FHP, mailed-in crash report, because we can think. The NHTSA is looking beyond its laughable simplicity too.

            We now know the Tesla driver was speeding and not watching the road, both contributing heavily to the crash, if not causing it entirely.

          • 0 avatar
            Kenmore

            “simply turning left”

            Lee Harvey Oswald simply flexed an index finger.

          • 0 avatar
            JimZ

            “All I’m seeing here is a bunch of faceless suits slash labcoats saying why we cant and Musk saying why not? And they got fired for it which is what they deserve.”

this is the absolute stupidest thing I’ve read today. The “labcoats” also told Ferdinand Piech that they couldn’t do what he wanted, and now VW is paying out $15 billion because THE “LABCOATS” WERE RIGHT!!!

            What is it with this attitude that we have to p!ss all over realists? Why do Muskophiles see engineers as some sort of obstacle to be steamrolled over? When has St. Elon ever built an actual, tangible *thing* in his life?

        • 0 avatar
          Pch101

          I provided you with the text of the Florida statute, which says the opposite of what you are saying — left turns at uncontrolled intersections ALWAYS must yield to oncoming traffic. So no, that says you’re wrong.

          I provided you with a link to a legal website that explains why left turn crashes at uncontrolled intersections are almost always the fault of the turning vehicle. So no, that says you’re wrong.

          You managed (much to my surprise) to find the crash report that notes that the truck failed to yield while the Tesla did not contribute to the crash. So the police report says that you’re wrong.

          Three strikes and you’re out. You’re wrong, wrong, wrong. There is absolutely nothing to support your position, while everything legal and factual says the opposite of what you’re saying.

          I really have to know: How many people have you injured or killed with your driving? I’m pretty sure at this point that the answer is greater than zero.

          • 0 avatar
            Zykotec

Though I agree that the truck was at fault, as the police say, I think making a statement by charging into the truck at 75 mph is kinda stupid.
I can almost imagine you standing in a zebra crossing while a car or truck comes straight at you, feeling all smug about him having to go to jail after killing you, and refusing to walk any faster or even jump, because you have the right of way…

          • 0 avatar
            Kenmore

            “I’m pretty sure at this point that the answer is greater than zero.”

            What else could possibly be the engine driving his obsessive defense of Baressi?

          • 0 avatar
            285exp

            The “driver” of the Tesla didn’t contribute to what should have been a non-event due to his gross negligence? He was speeding on a non-controlled access highway, turned control over to a semi-autonomous system that couldn’t detect a tractor-trailer rig in front of him in broad daylight, and didn’t maintain a reasonable watch, so he let the car drive into the side of trailer without touching the brakes? How long do you think is reasonable for a Tesla driver to not look out the front window in those conditions?

            In the real world these situations happen all the time, and people make minor adjustments to their driving instead of plowing into the side of an easily avoidable object just because they have the right of way.

            Maybe it’s because I’ve been operating boats for longer than I’ve been driving a car, but the rules of navigation require you to keep a proper look out at all times, and the fact that you have the right of way does not allow you to blunder into another vessel if you can safely avoid it. If it makes you feel good to blame it all on the truck driver, go right ahead, there’s certainly some blame there, but the real reason Mr. Brown is dead is that he was a fool.

          • 0 avatar
            Kenmore

            You know dang well that people here are like alkies pointing at a first-time Amish wine drinker and shouting “YER DRUNK!” when they call, what, 9-over “speeding”.

            Nothing changes the fact that Baressi put the obstacle in the road. It’s like a kid committing murder or suicide because the old man left a loaded gun lying around. Who’s most at fault?

          • 0 avatar
            Pch101

            Left turns have to yield to oncoming traffic, period.

            There is a contingent of you who are dedicated to not understanding basic driving rules. Unfortunately, that sort of stubbornness gets people killed, which is the moral of this story.

            Perhaps I should explain the concept of “yielding right of way.” That means that the driver who is turning left is responsible for being sure that he can complete the turn safely. Starting the turn does not eliminate the legal obligation to yield — oncoming traffic ALWAYS has priority.

            If there is a crash, then guess what? The turn wasn’t completed safely. The action speaks for itself — you weren’t supposed to crash, and the blame goes to the driver turning left.

            The best that Baressi the truck driver can hope for is that the Tesla ends up sharing ***some*** negligence due to the speed. But there is absolutely no way that the truck will be held 0% at fault, given that the truck has the legal responsibility to avoid turns that will result in it getting hit.

          • 0 avatar
            285exp

            Pch,

            I’m so surprised that you refuse to address the gross negligence of the Tesla driver and stick to your simplistic failure to yield mantra. Nobody is saying the truck driver is 0% responsible, you’re arguing against a strawman.

            You support using autopilot the way Brown was using it, and he had no duty to watch where he was going, and he was blameless in the crash?

          • 0 avatar
            JimC2

            “what should have been a non-event”

            Well… you and I agree that he should have maneuvered to miss the bad truck and got on with his life. But I wouldn’t say it should have been a non-event.

My opinion, but if you have to change your speed and lane every time someone turns across your path, I wouldn’t call it a non-event, not if you were driving only 9 over the posted limit. Although that happens all the time if you drive in Florida — some buffoon makes a left turn across oncoming traffic and forces you to hit your brakes, or some other buffoon makes a right turn into fast-moving traffic and ambles along, making no effort to accelerate and match the prevailing speed of the traffic that had the right of way — so I suppose it is, in a sense, a non-event. But it still ticks me off when I have the right of way and I have to maneuver around the other guy who did not.

            Anyway, I’d rather be wrong and alive than right and dead. If getting cut off makes me mad enough, I can honk, I can call the 1-800-how’s my driving number on the truck (“doo doo doo, this number has been disconnected”), call the highway patrol (good luck), get in front of the person and spray them with my windshield washers (most satisfying and keeps the bugs off my windows), or get over it and move on with my life.

          • 0 avatar
            Pch101

            The obtuseness is unbelievable.

            Left turns ALWAYS have to yield.

            You should have learned this when you were 16 years old. How did you get a driver license without understanding something this simple?

          • 0 avatar
            tedward

            Pch

There’s also a convention or cultural norm that drivers of large vehicles are to be paid close attention to and that a driver may be expected to yield to them in abnormal situations (included in state DMV training booklets, even). I can’t speak to the road conditions in this accident, not having seen them first hand. But if a truck was making a lengthy left in front of a police officer, and a car with right of way came in hot and had to panic brake, there’s every chance the trooper would consider reckless or inattentive driving citations for the right-of-way driver, especially when it was traveling 9 over (which normally would not constitute highway speeding on many roads).

            In terms of assigning blame for insurance pay outs, juries can actually have quite a bit to do with it. The state guidelines are used as reference points, but the percentage of blame doled out can vary widely depending on the circumstances and quality of arguments presented.

          • 0 avatar
            Pch101

            I’ve already posted an article from a credible legal publisher that explains why those making left turns are almost always at fault.

            The police report is consistent with that discussion. It found the truck driver to be at fault, and did not find the Tesla driver to be at fault. To a cop investigating the scene, this would be blindingly obvious, as the left turn must ALWAYS yield to oncoming traffic. Not sometimes, but ALWAYS.

            I hope that all of you are seeing why driver education doesn’t help to improve safety. When you guys believe things that aren’t true, no one is going to get you to change.

          • 0 avatar
            285exp

            Pch,

I agree about the obtuseness in this thread; you’re certainly contributing to it. I know very well who has the duty to yield, and I also know that they don’t; it will be small consolation if I kill myself because I assumed everyone out there would yield when they’re supposed to. Florida traffic law requires that a driver maintain awareness, that they not take their eyes off the road for more than a few seconds, and that they do everything they can to avoid a collision. Mr. Brown was breaking the law too, and he was a fool because he knew that the system was not designed to be used in the way he used it. If the truck truly had pulled out suddenly in front of his car, I’d have no issue with your assigning all the blame to him, but we all know that’s not what happened.

            As for his speeding, if he had been driving at the posted speed while not paying attention to where he was going, he and the truck would never have ended up occupying the same space at the same time.

          • 0 avatar
            Pch101

            You really don’t understand the concept of yielding the right of way. If you did, you wouldn’t be spewing that nonsense.

            It makes no difference whether Brown was distracted. The truck driver was obligated to yield the right of way, while the Tesla was not.

            That’s the meaning of the left turn ALWAYS having to yield. You can’t blame the oncoming driver for the crash because the oncoming car had a right to be in that intersection, while the turning driver did not.

          • 0 avatar
            285exp

            Pch,

            Just so I understand your position, the Tesla driver was not negligent in operating the car, and he in no way contributed to his own death? Yes or no, none of the “you don’t understand the concept of yield” nonsense. Yes or no.

          • 0 avatar
            Pch101

            Ironically, you keep yammering about the Tesla driver’s need to pay attention, when you won’t pay attention on these threads.

            Have you noticed how what I’m telling you matches (a) the law and (b) the police report?

            You ought to be noticing that of the two of us, I am the only one who can actually provide support for his argument. And that’s because I’m not really arguing, I’m just explaining the law to a bunch of guys who are clueless about the law (and who have no business being behind the wheel.)

            The police report found the truck driver to be 100% at fault. When you figure out why that is, then all of this will become much clearer to you. Anyone who understands the law would have known that the crash report would fault the trucker, because left turns ALWAYS have to yield.

          • 0 avatar
            285exp

            A simple yes or no, not three paragraphs of why I am unqualified to drive, yes or no.

          • 0 avatar
            Pch101

            You don’t know basic road rules, even though the rules are simple enough for a teenager to understand.

            Why should you have a license when you appear to be incapable of understanding the rules?

          • 0 avatar
            accord1999

            Why would you need to make evasive maneuvers? A slow moving truck that still managed to get 40+ feet of itself off the highway after going through a wide median and the two lanes of highway would have been spotted by an attentive driver 10+ seconds away.

            Unless he had a medical problem, what the Tesla driver did by not paying attention is no different than if he were to have blindfolded himself before driving and was going to either kill someone or be killed by something eventually.

          • 0 avatar
            Pch101

            The cars that you cut off have no legal obligation to make evasive maneuvers.

            A crash that is caused by your failure to yield will be your fault.

            If that crash doesn’t happen because the other driver responded in time to your stupidity, that does not change the fact that you failed to yield in violation of the law.

            Other drivers have no legal obligation to save your dumb backside with their superior driving. It would be nice if they do, but it is on you if they don’t.

            The law does not require anyone to be smarter than you are. You’re just fortunate that they are.

        • 0 avatar
          JimZ

          the problem is that “younger, more data driven” set has no experience working on things where defects or design flaws lead to someone dying. Almost every other automaker has had experience being hauled in front of Congress for something of theirs which has killed people.

          • 0 avatar
            DenverMike

Pch took the stance early on, reminding us there’s absolutely nothing the Tesla driver could’ve done to avoid the crash. Now he has egg on his face and is retreating to the “failure to yield, turn in your license, and how many innocents have you killed(?)” pathetic mantra and ad hominem combo.

          • 0 avatar
            Kenmore

            “Pch took the stance early on, reminding us there’s absolutely nothing the Tesla driver could’ve done to avoid the crash.”

            Purest BS, Mike. He never said or implied that because it’s irrelevant to the issue of fault here. There shouldn’t have been a broadside semi chugging across the road to attempt avoiding in the first place.

            BTW, after several promptings you still haven’t given a simple “No, I haven’t.” to speculation about whether you’ve also caused a crash by turning across traffic.

            PS: If you did and I missed it because your comments are so infuriating I now only skim, apologies.

          • 0 avatar
            Pch101

            I really would like to know how much blood that Mike has on his hands.

            I mean, this shouldn’t be that hard.

          • 0 avatar
            DenverMike

At least I’m willing to ask the questions beyond the simple-minded and superficial. Articles like the ones you’re commenting on don’t stop before the story begins.

            Except if that’s the way you prefer it, feel free to submit your own articles.

            You simpletons were ready to form a lynch mob before the facts were in. And all the facts still aren’t in yet.

            Here’s Pch spouting off, even before the hack report by the Florida Highway Patrol was in:

            “There would have been no evasive maneuvers. The truck is longer than the width of the lanes that it was blocking.”

Why doesn’t Pch101 state, or reaffirm, that now??

          • 0 avatar
            Pch101

            I still want to know how many people you have injured or killed behind the wheel.

            And your questions are not particularly bright.

            With a 53 foot trailer, the truck would have been about 73 feet long.

            The two eastbound lanes, plus right-hand shoulder, were about 29 feet wide.

            A typical lane width of that type of road is 11-12 feet.

            It doesn’t take a genius to figure out that a truck that is as long as the width of about six lanes would have been blocking both eastbound lanes, plus the shoulder, at the time of the crash. There would have been no escape route.

            And the Tesla was not legally obliged to escape. Some of you folks need to learn the basics of traffic law — when you screw up your left turns, it is not the job of oncoming traffic to compensate for your screw up.

          • 0 avatar
            DenverMike

            Thank you. The combination wasn’t parked “blocking” the lanes. It was fast moving, and the FHP agrees at about 35 mph. Except the Tesla was moving faster, much much faster. Over the speed limit, and too fast for its own good, especially when totally relying on a defective system to do all the driving.

The gap, or break, in traffic was also much greater than the typical car needs to get across safely. We know this because the Tesla didn’t hit the front of the combination.

The trailer would’ve cleared the intersection if the Tesla had been doing the speed limit. That’s safe to say based on what we know, and the hundreds of feet of gap/break in traffic involved here.

            The Tesla driver only had to let off the accelerator enough to drop down to the speed limit he was violating.

            As drivers, all we have is our perception. We’ve all been caught when executing a legal turn, where an unlawful speeder closes the gap faster than we perceived at the point of starting the turn. That doesn’t make it an “illegal turn”, far from it.

But every time you comment on driving situations, it seems like you’ve spent zero time behind the wheel and have nothing to relate it to when it comes to the physics involved. But that sure doesn’t stop you from commenting.

          • 0 avatar
            Pch101

            So how many people have you killed or injured with your driving, Mike?

            This is not a rhetorical question. You now have me convinced that the answer to that question isn’t zero, and I’d like to get some details about your body count.

          • 0 avatar
            DenverMike

            Wow. The ad hominem. Thanks, real shocking coming from you.

            If that’s all you’ve got, Good Day Sir.

          • 0 avatar
            Pch101

            It must be embarrassing for you to admit that your driving style has hurt other people.

  • avatar
    JimC2

    Kinda ironic, all this fuss about the autopilot not being perfect, considering the licensing standards for human drivers…

    Nonetheless, it might be interesting to learn more about Meadows’ story (his side of the story and other people’s sides of it).

    • 0 avatar
      ToddAtlasF1

Are you one of those people who shouldn’t be trusted with the keys? Why do you think you can make others who aren’t hand over driving responsibility to fallible programmers? Do you really think that a huge percentage of drivers know they are worse than a machine that might behead them? It’s easy to say other people are unqualified accidents waiting to happen, but how do you convince said people to give up their heads…I mean their control over their cars?

      • 0 avatar
        JimC2

        You’re reading a whole lot into my comment and putting words into my mouth, doncha think?

        Notice I didn’t say *what* we should or should not do, nor what we should do differently.

      • 0 avatar
        Vulpine

        Would you believe there are people out there who believe software programmers are perfect–that software is so extensively tested in the lab that it is impossible for said software to do something unexpected simply because it wasn’t designed to do it?

        • 0 avatar
          JimC2

          @Vulpine- I do believe it. That’s one of the differences between a software programmer and a software engineer. (That’s not to say that all programmers believe their software is perfect; rather, it’s to say that it’s part of the software engineer’s job to plan against “What Ifs?”)

          • 0 avatar
            mcs

            I’ve co-designed aviation collision avoidance systems, so I have some first-hand experience. We were obsessive about the code. Lots of long discussions about every small section of code and potential problems with it. We even had the FAA looking over our shoulders.

            The MobilEye/Tesla system had/has a major flaw. It saw what it thought was a highway sign or an underpass, but failed to make the “can I fit under the overpass or sign” calculation.

A big problem with the current automotive systems is that the designers seem to think you can just sit back and depend 100% on being able to react to every situation as it occurs. You need to anticipate problems. An autonomous vehicle system should attempt to build multiple models of what each section of road might look like when the vehicle passes through. For example, if an autonomous vehicle is headed towards an intersection and a vehicle coming in the opposite direction starts to slow down on its approach, you have to start calculating the odds of it turning left without signaling into your path. If the vehicle keeps moving and the wheels start to turn into your path, increase the odds. If it stops, decrease the odds. If it’s determined there is a high probability, start to slow down and prepare to stop. A system that simply reacts won’t start reducing speed until the other vehicle starts to head into the path of the AV, and then it might be too late to stop.

In the case of the Tesla crash, a more intuitive system might have seen the truck slowing for the intersection, predicted a potential turn, and maybe scrubbed some speed. It also could have started to track the truck in its models of potential scenarios, and when the truck turned into its path it would have known it was a truck and not an underpass or sign, and even compensated if the sensors were blinded. There was a truck beginning to cross my path, now my sensors are blinded, but my last calculation would put the truck in front of me – crap – hit the brakes.

            It takes a lot of computing power, but I’m predicting they will replace MobilEye with NVidia’s technology which will give them the horsepower they need.
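A minimal sketch of the anticipate-rather-than-react loop described above might look like the following. The update weights, threshold, and speed policy are invented purely for illustration; they are not from Tesla, MobilEye, or any real collision avoidance system.

```python
# Toy version of the predictive loop: keep a running probability that the
# oncoming vehicle will turn across our path, and shed speed early instead
# of waiting until it actually enters our lane. All numbers are illustrative.

def update_turn_probability(p, slowing, wheels_turning, stopped):
    """Nudge the estimated odds that the oncoming vehicle turns across our path."""
    if slowing:
        p = min(1.0, p + 0.2)   # decelerating on approach: raise the odds
    if wheels_turning:
        p = min(1.0, p + 0.3)   # front wheels angling into our lane: raise more
    if stopped:
        p = max(0.0, p - 0.3)   # stopped and yielding: lower the odds
    return p

def planned_speed(current_mph, p, threshold=0.5):
    """Start scrubbing speed once the turn probability crosses the threshold."""
    if p >= threshold:
        return current_mph * (1.0 - p)   # slow in proportion to the risk
    return current_mph

# Oncoming truck slows for the intersection, then its wheels start to turn:
p = 0.1
for slowing, wheels, stopped in [(True, False, False), (True, True, False)]:
    p = update_turn_probability(p, slowing, wheels, stopped)

print(round(planned_speed(65.0, p), 1))   # braking begins before paths cross
```

The point of the sketch is the design choice mcs describes: a purely reactive system would return `current_mph` unchanged right up until the truck occupied the lane, while the predictive version has already committed to slowing.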

          • 0 avatar
            JimC2

            Thank you, mcs. You explained it thoroughly and with a lot of relevant detail.

I expect future “defensive driving” software to use some strategies that motorcycle riders use (look which way the steering wheels are pointed, watch to see if the wheels are rotating or not, as that is a good “tell” that the vehicle has started to move). Those kinds of things might take a lot of computing horsepower, as you describe, not for a clear, sunny day, but when the vehicle is obscured by obstacles, glare, weather, etc.

          • 0 avatar
            DenverMike

            OK, the MobilEye missed seeing the trailer, but did it also not see the tractor the Tesla narrowly missed? What about the set of dual-tandems moving fast towards the Tesla’s lane? If there’s two vehicles or large objects passing right in front of it and right behind it, does it think that’s perfectly normal?

      • 0 avatar
        stuki

        ” but how do you convince said people to give up their heads…I mean their control over their cars?”

        By providing a big screen for them to watch movies on…..

    • 0 avatar

      Exactly!!

While Musk’s “Bigger Picture” view is accurate IMHO, sometimes it doesn’t pay to be ‘right’ all the time. The employee concerns hopefully had more to do with what the media and public would do with the first fatality, just as occurred with the first battery fire.

      Musk weathered the storm of battery fires, primarily by adding better titanium protection.

Fixing the software systems for autonomous driving isn’t a physical fix and is more difficult to verify. Regaining public trust after the media blitzkrieg may take a lot more Twitter posting than occurred with the battery fires.

      I hope Musk is able to weather the media blitzkrieg once again.

  • avatar
    Felix Hoenikker

NHTSA should require a warning sticker on Tesla cars stating “Use of the Autopilot may result in severe injury or death. Have a nice day”.

  • avatar
    CH1

    Phrases such as “cannot wait for perfect” and “don’t let perfect be the enemy of good” are just aphorisms. They are useless for determining whether the correct balance was struck between safety and getting stuff to market quickly, since all man-made systems are imperfect.

    Every reasonable person agrees that we “cannot wait for perfect.” Rosekind is merely stating the obvious and isn’t necessarily agreeing with the specific decisions Tesla made.

    • 0 avatar
      NickS

      This. I shuddered when I read Rosekind’s statements. I’d hope he understands the differences between driver assists and replacements and the need to fence them so that drivers don’t misuse one as if it is more than it can be.

      A driver needs to be safety-conscious if they are expected to have ANY involvement in the task (with or without assists). At a minimum they need to train all owners before enabling their AP, or disable it after x close calls. Ideally they need to make sure that the nannies are not easy to defeat or fool. Humans are imperfect and if they require human attention (even only for legal reasons) they’d better make sure they are getting it.

      • 0 avatar
        JimC2

        “At a minimum they need to train all owners before enabling their AP”

        Not trying to argue with you, NickS, I just don’t see this as feasible.

Most people don’t even read their owner’s manual when they buy a new car (a lot of them don’t know such a thing exists). Early adopters might be more willing than average to accept training or at least read up on the AP, but I think most consumers’ mentality is a huge obstacle to this. We live in a society where any schmuck with enough cash can drive a 700hp car…

        (Maybe the car would not enable the autopilot until the driver has a pass code they can only get from passing a rigorous knowledge exam and practical exam. Heheh, that would go over well in the marketplace.)

        • 0 avatar

          Wait a minute here. What kind of a suggestion is that? We need to train drivers how autopilot works? Seriously?

          If training helped to avoid accidents we wouldn’t need autopilot at all.

At some point we have to come to grips with reality. There are a lot of stupid people running around. If you’re too stupid to drive a car, you’re way too stupid to be trained on avoiding accidents while your autopilot is activated.

If autopilot exists in a vehicle, it enables and encourages people who are otherwise incapable of operating the vehicle safely.

That’s what it is. That’s what its existence is for.

          You can’t fix stupid.

          • 0 avatar
            JimC2

You can’t fix stupid, but that car has already left the garage. Like it or not, driving has evolved from being a privilege to being a right, enjoyed by all, including stupid people.

            It’s why we have things like mandatory computerized tire pressure sensors (check the tires for stupid people) and moving towards mandatory computerized vehicle stabilization (preventing stupid people from rolling over their obviously top-heavy vehicles).

            Darwin was wrong…

      • 0 avatar
        NickS

        I was talking about how to make the *existing* cars out in the wild just a little safer. It’s not a good solution, just a rubber band – that’s what I meant by bare minimum.

        Jim, no disagreement whatsoever. Everyone who already has AP on their Tesla will revolt and call up their lawyer if they are now told they need to certify or lose it.

        @WR – it’s not an auto-pilot that removes driver involvement completely, according to Tesla. The problem is that Tesla is playing very dangerously around the issue of how much involvement they want their drivers to have. The Tesla system has weaknesses, and they also need to enforce attentiveness. They have to find a way to remedy both of these: knowledge and understanding of the weaknesses, and more nannies and stricter access rules.

        Airline pilots have to be trained on and be certified to use AP.

        In a fully autonomous vehicle you shouldn’t need training. The Tesla is nowhere near fully autonomous.

        As an engineer I shy away from calling the end user stupid (especially someone who lost their life). If an untrained person is misusing a product it’s usually an indication of very poor/cheap design, or lack of training.

Google and many other OEMs have taken a far more responsible stance on autonomous tech. I work in the valley and Musk isn’t top of mind with everyone here.

      • 0 avatar
        Big Al from Oz

        NickS,
I agree with you in relation to the owner’s manual scenario. Most people feel they are more than adequate to operate a vehicle, just as most overrate their driving abilities.

        This education move by Elon Melon is to cover his ass. The higher up managers where I work have the same approach to safety.

I do believe they genuinely want a safer workplace, but their action with the so-called awareness training is to reduce their responsibility if any event occurs, i.e., cover their asses. This is what Musk is doing by making his statement about providing training for the use of his autopilot feature … covering his ass.

Musk, like all of us, wants a safer driving environment, but his ego and Tesla outweigh the logical step of ensuring the autopilot system is safe before letting it loose on the public.

        Not all who operate a vehicle should.

        • 0 avatar
          tedward

          I have a friend taking delivery of a car today equipped with adaptive cruise, self parking, lane correction and all that. I didn’t help him pick his car, but I did read him the riot act about letting the salesman complete a walk through before he left with his lease. Normally there’s no way in hell I’d bring that up to a guy who drives everything well, but these systems are potentially dangerous to know a little bit about.

  • avatar
    Kenmore

The rats are already leaving. Sooner than I thought it’d happen.

  • avatar
    dtougas

    I am not saying the engineer was wrong, but, being a (former) engineer myself I will say that no system is going to be absolutely without fault. Life has too many variables to safely deal with every possible situation. No matter how good you make your autopilot system, there will be accidents, and there will always be somebody somewhere saying “I told you this would happen”.

    • 0 avatar
      TrailerTrash

Then it is not an autopilot system.
If it cannot pilot, then it is simply an assist.

      Why is this so hard for folks here to understand?
I know why Musk cannot… because he is a power-driven fool. Full of himself, his ascension, and his self-appointed Holy Quest For Perfect Green Celebrity.

      He is a super hero in his own mind.

Anybody who has studied the human brain and injuries such as stroke knows the true incalculable powers of the mind. And they understand its powers to gather, store, calculate, anticipate and relate.

The human brain’s power to see new, never-before-seen situations and words, yet calculate what they are or might be or mean in relation to what was and what is current. The incredible power of the mind: how it sees the present, gathers and stores, and is always relating to the past and anticipating the future.

      There ain’t a programmer or programmers alive that can duplicate this. Not at Tesla or anywhere else.
      Never. At least not in Musk’s time.

So stop, please, calling this an auto system that can replace the human skill and mind behind the driving of cars.

This system might be what the calculator was to the slide rule… but it isn’t even in the same ballpark as the mind.

      • 0 avatar
        JimC2

“Then it is not an autopilot system.
If it cannot pilot, then it is simply an assist.”

        Not sure you’ll get anywhere with that argument. It’s semantics and it’s just one of those things where the driver is supposed to be smart enough to still pay attention.

The most basic airplane “autopilots” do little more than make the airplane go straight (they do nothing with the engines and nothing to make it climb or descend). And those are considered autopilots in the correct sense of the word. The really nice ones fly to waypoints, climb, descend, speed up, slow down, and even land. But none of that absolves the human pilots from paying attention.

        Cruise control in a car only makes the car stay at one speed. To me, the act of “cruising” involves more than just maintaining speed. Maybe that component of my car should be called the “speed governor” since that’s the only thing it controls when I’m using it to assist me in cruise.

    • 0 avatar
      JimZ

      “I am not saying the engineer was wrong, but, being a (former) engineer myself I will say that no system is going to be absolutely without fault.”

yes, but the problem is you have a duel between people who know the consequences of those faults and are advising caution vs. people who don’t understand the life-or-death implications.

  • avatar
    CH1

I want to see the detailed analysis justifying Musk’s claim that the system would on balance save lives, and showing the number of driveway deaths that remote parking would prevent.

    The stats that Musk and Tesla quoted, going back to the first claim a few months ago about Autopilot reducing crashes by 50%, pretty much destroyed any credibility they might have had.

  • avatar
    NickS

    I’ve worked with a few CEOs who had a grand vision, but rammed some stupid manifestation of it down the line and the people who actually knew a thing or two about the technology left for better opportunities.

    Unless we have personnel records it’s hard to know why someone left. I have seen stellar performers being forced out or dismissed for ludicrous reasons or for speaking truth to power (and often the power was on their way out with a golden parachute anyway).

  • avatar
    LS1Fan

    Has anyone actually seen human beings drive?

    Oh sure, WE don’t jack things up; but we’re the automotive version of the flat earth society.

The rest of the driving public is eating Taco Bell while flying down the road at mile-a-minute speeds, while simultaneously taking Tinder selfies and commenting on some BFF’s Instagram account.

Yes, I wholeheartedly believe Tesla’s data when they say Autopilot will and has saved lives. Whether the customer understands how it’s supposed to work versus actual application is frankly irrelevant. Even a half-baked beta-test automated driving computer is way better than Average Human Being 1.0.

    Especially if said human is drunk, high, distracted, or all three. The fact that a seasoned engineer doesn’t have the long term vision to understand that might be why he got canned in the first place.

    • 0 avatar
      CH1

Tesla has yet to provide any data that substantiates their claims. Worse, Tesla has made clearly invalid statistical comparisons, such as concluding that Autopilot reduces crashes by 50% by comparing crash frequencies when Autopilot is active versus inactive. Apparently the fact that Autopilot can’t be used in the situations where most crashes occur is of no consequence.

      Autopilot (lane keeping plus ACC) has no mechanisms to reduce crashes by 50%. The main safety benefit of Autopilot is lane keeping. Unintentional lane departure is one of the lowest causes of accidents – roughly 2.4% of all crashes. The potential number of crashes that could be avoided or mitigated by FCW/AEB, side view assist (blind spot monitoring), lane departure and adaptive headlights combined is only 32% of all crashes, mostly due to FCW/AEB (20%).

      http://www.iihs.org/iihs/sr/statusreport/article/45/5/2

      Then Tesla moved on to comparing the fatality rate when Autopilot is active to the overall fatality rate in the US and even third world countries. On one hand you have a recently designed, heavy, mid-size premium sedan in relatively low risk situations (limited access highways, good weather, etc) while on the other you have vehicles of all ages, types, and sizes under all road, traffic and weather conditions.

      Any company that makes those kinds of absurd statistical comparisons cannot be trusted.
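The arithmetic behind CH1’s objection can be laid out in a few lines, using only the shares quoted in the comment (fractions of all crashes each technology could at best avoid or mitigate, per the linked IIHS article). This is a back-of-envelope illustration of the argument, not an independent analysis.

```python
# Shares of all crashes quoted in the comment above (IIHS-derived figures):
LANE_DEPARTURE = 0.024   # unintentional lane departure crashes
FCW_AEB = 0.20           # crashes addressable by forward collision warning / AEB
ALL_ASSISTS = 0.32       # ceiling for all four assist technologies combined

claimed = 0.50           # Tesla's claimed crash reduction

# Even granting Autopilot full credit for its two main mechanisms,
# the ceiling falls well short of the claimed 50% reduction:
ceiling = LANE_DEPARTURE + FCW_AEB
print(round(ceiling, 3))        # upper bound from Autopilot's own mechanisms
print(claimed > ALL_ASSISTS)    # the claim exceeds even the combined ceiling
```

In other words, on the quoted figures the claim would require Autopilot to prevent more crashes than all four assist technologies together could plausibly touch.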

      • 0 avatar
        CH1

        In addition to the points raised in my post above, it should be noted that Autopilot cannot drive the car by itself. The stats Tesla has put out so far are really failed attempts at comparing driver plus Autopilot to driver alone. Even if the stats were adjusted to control for the other variables I mentioned, they still wouldn’t tell how good Autopilot is by itself compared to a human driver.

    • 0 avatar
      JD23

“Yes, I wholeheartedly believe Tesla’s data when they say Autopilot will and has saved lives. Whether the customer understands how it’s supposed to work versus actual application is frankly irrelevant. Even a half-baked beta-test automated driving computer is way better than Average Human Being 1.0.”

      You’re awfully credulous. Scientific investigation involves forming a hypothesis and conducting experiments to support or disprove the hypothesis, not blind faith.

  • avatar
    islander800

    Should it be left up to Musk to play God with lives of citizens?

    He is essentially saying, sure, some people are going to get killed by my imperfect self-driving mode, but it’s all in the greater public good because in the grand scheme of things, my system is going to save lives.

    I don’t think we should be leaving it up to messianic, crusading entrepreneurs to make these choices. If his system can kill people, it isn’t ready for prime time. Full stop. To claim that the deaths caused by his faulty system are acceptable because they are more than offset by lives saved is a monstrous calculus of false equivalencies. The fact that its imperfections lead to deaths of users is enough, standing on its own, to disqualify the system from current public release.

    Just as GM and Ford are held responsible for faulty designs in their vehicles, so should Musk. His statements concerning “the greater good” are irresponsible at the least. He should be held personally responsible for any deaths caused by his autonomous drive system, given his public pronouncements, since he is in effect saying these “sacrifices” are worth it.

    • 0 avatar
      LS1Fan

      In that case, let’s hold Ford and GM responsible for everyone who dies in an accident involving a Ford Mustang, Chevy Corvette, Camaro, or SS.

      Porsche and BMW can join the fun too. Since a car capable of achieving speeds fast enough to kill the operator would constitute a design flaw, they all should be held accountable for that.

Technological perfection is impossible to achieve. There’s always an idiot better than any system designed to defeat them. Mandating perfection as a condition of public release means the end of technological progress as we know it.

Without “crusading entrepreneurs,” we’d still be hunting mammoths with sticks.

      • 0 avatar
        CH1

“Technological perfection is impossible to achieve. There’s always an idiot better than any system designed to defeat them. Mandating perfection as a condition of public release means the end of technological progress as we know it.”

        False dichotomy wrapped in a straw man. All man-made systems are imperfect, but that doesn’t mean they are equally imperfect or minimally acceptable.

        • 0 avatar
          NickS

          CH1 is spot on.

          Musk is very shrewd. He is a bombastic narcissist who uses the flaws in his product to extract publicity. When people get killed from misuse of a flawed product he doubles down on what a great product he has.

          Tesla needs to be run by an adult.

          • 0 avatar
            tedward

            NickS

            +1
I’m actually excited by Tesla’s rise on a number of levels, and I’m very intrigued by the way he launched his product line. However, I think Musk himself is a liability to the company from here on out. You need “that guy” to start the company and get the PR and lobbying ball rolling, but life-and-death product decisions are best left to boring and guarded corporate stooges. There is a very real reason why they exist wherever massive revolving credit lines are found.

        • 0 avatar
          stuki

          But what is minimally acceptable riskiness to a guy who wants to climb into the nosecone of a rocket and light the fuse for Mars, may be very different from what Ralph Nader would settle for.

          So, as long as there are cars on offer which either don’t have a Tesla built autopilot at all, or provide a reasonably reliable off switch for it if they do, there really isn’t much of a problem, is there? The Elons, Bambrogans and battle hardened Seals of the world get to watch movies on the road, while the Naders get to feel smug whenever they see one of them plastered up against a truck side. And, over time, as the technology improves, even Nader gets to benefit from the learnings the Bambrogans sacrificed so much to volunteer as guinea pigs to obtain.

          Chances are, Nader may have been a bit concerned about hopping aboard the contraption of Wright brothers’ fame as well.

          • 0 avatar
            NickS

stuki, it doesn’t have to be this way for Tesla. No one wants them to fail, myself included. They can fix this, improve upon it, and make it better, without all the drama. This is all Musk and his penchant for controversy. He loves attention in whatever way he can get it, and the negative kind is the easiest to get. He needs to see his name in the headlines.

            Another CEO would have dealt with this in the seriousness it deserves.

  • avatar
    Pch101

    I’ve used this analogy before: If I save three people from a burning building but shoot your mother, my good deeds aren’t going to mitigate what I did to your mother.

Musk’s rhetoric suggests that he doesn’t get it. The operative paradigm should be aligned with the Hippocratic Oath: do no harm. Odds are part of the equation, but not the only criterion.

    • 0 avatar
      VoGo

      It’s a crappy analogy. It’s saying that we can’t reduce highway deaths by 30,000 because the cure would kill 10,000.

      I’ll take the saved 20,000 lives.

      • 0 avatar
        Pch101

        It’s a useful analogy, as opposed to your strawman.

        Why would you presume that one must inflict harm in order to save lives? It’s fortunate that doctors have to take an oath that requires them to do otherwise.

        Make a better product, don’t oversell it.

        • 0 avatar
          TriumphDriver

          I think the analogy is rather closer to the truth than you might assume.
          Your mother was probably quite comfortable being in pch101’s company because of the way he saved those people in the terrible fire. Mr. Musk and his friends told her all about that when she agreed to meet pch101. It must have been a big surprise that pch101 turned out not to be such a gentleman and pulled a gun and shot her when the meeting was going so well.

To continue a little with the analogy, I’m not convinced we have a good basis for assuming that pch101 really did save those three people; they might have been quite capable of walking out of the building on their own. Musk’s claims for the system are not particularly well-supported by sound statistics.
          Is the system being oversold? It depends on who you’re listening to, but there seems to be evidence that some drivers have expected more from the system than it can currently deliver.
          Previous posters have said both that people need to be trained on this technology and that this will not happen. I suspect they are both correct.
          Those who wish to go as fast as possible down this path might want to spend 20 minutes or so watching this video:
          https://www.youtube.com/watch?v=pN41LvuSz10
There are limits to how far you can compare aviation to automobiles, but I suspect the human factors involved will end up being somewhat similar.

        • 0 avatar
          VoGo

          PCH,
          Chemo kills people. It does. It saves many lives, but some people simply don’t make it.

          Do we ban chemo because some die, or do we weigh the positive against the negative?

          • 0 avatar
            Pch101

            The alternative to chemo under those circumstances is probably death. So as far as your lifespan goes, you don’t have much to lose by using chemo if you are in a situation in which it is being offered to you; it might prolong your life.

            It’s highly unlikely that you will die in a car wreck on a given day. You don’t want to have a technology that just stops working or malfunctions just because, and kills you as a result.

            You have heard of caveat venditor, right? If stuff is produced that doesn’t work or hurts people, then the seller has a problem. Your analogy doesn’t really work — the issue here is of a manufacturer making misleading claims about its products.

          • 0 avatar
            VoGo

We’re still talking past each other, which is common on this blog.

            I suspect that if Tesla had named it “Super awesome cruise control” then 99% of the criticism would disappear, and only the truly committed Luddites would oppose “autopilot”

          • 0 avatar
            Pch101

            The FTC has forced companies to change their product names because the names were misleading, such as margarine/ spreads that had names that suggested that they were good for your cardiac health.

            Companies can’t sell defective products or make misleading claims about them. Tesla seems inclined to do both.

            We’re not talking past each other. You simply refuse to acknowledge that Tesla screws up.

          • 0 avatar
            VoGo

            Just to be clear: I am certainly a fan of Tesla; and I also see their errors. Calling their semi-autonomous steering system “autopilot” was stupid. Predicting the release of a new vehicle with serious design challenges before you have them resolved is overly optimistic and lessens your credibility. And focusing on a twitter war over AV when you should be focusing on getting the Model 3 into production is poor use of scarce resources.

I also suspect that Musk’s ego – often an asset – can be a liability at times. We’re all human, even Ironman.

          • 0 avatar
            Pch101

In that case, one should accept the notion that innovation does not require killing or injuring one’s customers.

            The customers should get a trustworthy, reliable product. If it isn’t at that stage, then it should not be released until it can be tested by professionals who can ensure that it works properly before it is sold to the public.

            There is a difference between selling inadequate gear that crashes your computer and inadequate gear that crashes your car. When legitimate automakers hold back, it isn’t because they are boring morons who are lacking talent and creativity, but because they have to be careful when selling consumer products that can kill people. Tesla likes to claim that it is cutting edge when it is just jumping the gun.

    • 0 avatar
      stuki

      Building a convenience feature (honestly nothing more than a cruise control switch for the born-to-believe-the-hype crowd) that people can choose to use or not, is hardly shooting your mom.

A more accurate analogy would be to ban automatic sprinkler systems in skyscrapers because it just might happen that some mom manages to find a way to drown on account of one. Or, sticking to cars: ban airbags, since, well, Takata.

      • 0 avatar
        Pch101

        The FTC does not allow producers to oversell the benefits of products.

        If Tesla had given it a different name and reduced the level of autonomy, then I wouldn’t be saying anything. You don’t hear me grousing about adaptive cruise control, lane departure systems, etc. because the legitimate automakers aren’t writing checks that can’t be cashed.

        • 0 avatar
          stuki

          Every salesguy on earth oversells his product by the standards of most less enthusiastic observers. It’s a competitive necessity. And not even malevolent. Elon, like most of his kind, is just a glass-half-full kind of guy.

          If Tesla’s target customers were all under the age of 5, or medically incompetent, it could arguably make some sort of sense to assume they were unable to sort the wheat from the chaff. But grownups really ought to be able to make their own judgment calls wrt what is a sales pitch and what is not.

          • 0 avatar
            tedward

            Stuki

            I think it’s easy to agree with your sentiment that adults should understand how these dangerous devices (cars) work and take responsibility for that. On the other hand, we all have an Aunt Ethel or Uncle Bob who we just know won’t do that. To be fair to the Ethels and Bobs of the world, no one is raised to fear the beast anymore. We only know this stuff because we love the beast ourselves.

        • 0 avatar
          285exp

          It’s interesting how you blame everyone except the driver, who was not an unsophisticated user. He knew the limitations of the system, and even stated in the comments to one of his videos that it was the responsibility of the driver to monitor the situation and be prepared to take control if it failed to respond to hazards. Yet he didn’t.

          That’s not to completely absolve Musk of overselling Autopilot; he even tweeted out this guy’s video, the one where he didn’t kill himself, bragging about how well it worked. They make sure they have all the nag screens they need to cover their backsides legally when people like Brown misuse the system, but lack of training isn’t what caused him to over-rely on the system; it was over-confidence and poor judgment.

    • 0 avatar
      WheelMcCoy

      When seat belts were first introduced, they saved lives. But at high speeds, some people suffered internal injuries to the abdomen. Three-point belts addressed that. And Mercedes is working on an inflatable three-point belt that reduces chest injuries.

      When ABS came out, they saved lives. But some people didn’t know they had to stand on the brakes and that it would pulse. Old school drivers pumped their brakes, and needed retraining.

      When airbags were introduced, they saved lives, the Takata issue notwithstanding. In some cases, people broke their noses. In another, a young lady broke her finger because she was stupidly holding a phone. Now modern airbags have multi-stage deployment based on the weight in the seat.

      Tesla’s autopilot can save lives, but can also unintentionally injure or kill people. It’s not perfect, but neither are the safety features above. I don’t like Musk’s juvenile and undisciplined behavior, but maybe it’s what’s needed to build disruptive technology.

      • 0 avatar
        Pch101

        There’s no comparison because nobody was lying about what their seat belts could do.

        Musk wants to have the luxury of hyping his product when it suits him, only to rely upon the warning labels when the hype backfires. Sorry, but you don’t get to have it both ways.

      • 0 avatar
        brandloyalty

        I find it inexplicable that ABS advice never mentions extremely slippery roads. If you hammer the brakes on very slippery surfaces, all four wheels stop rotating at the same time. The vehicle systems interpret this as being stopped, and it makes sense to not release the brakes when stopped. So the brakes are not released and the vehicle skids/slides out of control.

        Why does this seem such a mystery when, even if one hasn’t experienced it for themselves, everyone has seen videos of cars that presumably have ABS, sliding all over the place and great distances on icy roads?

        So the old-fashioned advice to pump the brakes still holds in the circumstance of extremely slippery conditions.

        • 0 avatar
          WheelMcCoy

          You’re right about ABS. I intentionally tested them on a wintry day on a slight decline. With ABS, the car shuddered — relatively violently and sloppily — and stopped. Circling around, I repeated the test, moderating the brakes before the point of wheel lockup. The car stopped smoothly and earlier.

          ABS, however, does do a better job if I have to swerve to avoid something.

          Even with the test under my belt, in an emergency, I doubt I would have the wherewithal to adjust my braking to suit the situation. Hence, the standard advice of standing on the brakes with ABS.

        • 0 avatar
          stuki

          Newer ABS systems take deceleration into account as well. Slam on the brakes at 60, and even if all your wheels lock up, the car knows you didn’t go from 60 to 0 in zero time. At very low speeds, what you say may still hold true.
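
          The deceleration cross-check described above can be sketched as a toy model. This is purely illustrative (the function names and the 12 m/s² threshold are invented here, not taken from any production ABS controller): even if every wheel sensor reads zero, the controller’s own speed estimate can only fall as fast as physics allows, so it can tell “wheels locked while still moving” apart from “actually stopped.”

          ```python
          # Hypothetical sketch of an ABS lockup check; names and values invented.
          MAX_PLAUSIBLE_DECEL = 12.0  # m/s^2, beyond what any tire/surface can deliver

          def update_vehicle_speed_estimate(prev_estimate, wheel_speed, dt):
              """Return a new vehicle-speed estimate (m/s).

              The estimate follows the wheel sensors when plausible, but never
              drops faster than MAX_PLAUSIBLE_DECEL permits over the timestep dt.
              """
              floor = max(0.0, prev_estimate - MAX_PLAUSIBLE_DECEL * dt)
              return max(wheel_speed, floor)

          def wheels_locked(estimate, wheel_speed):
              """All-wheel lockup: sensors read ~0 while the car is plausibly moving."""
              return wheel_speed < 0.5 and estimate > 1.0
          ```

          At 60 mph (about 26.8 m/s), wheels suddenly reading zero would leave the estimate near 25.6 m/s after 0.1 s, flagging lockup; at crawling speed the floor reaches zero almost immediately, which matches the caveat that very low speeds remain the hard case.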

      • 0 avatar
        DC Bruce

        I’m not seeing how Autopilot can save lives, unless you assume the existence of an extremely impaired or incompetent driver. And even then, as the truck crash shows, it doesn’t always work (and I’m assuming, for the sake of discussion, that the Tesla driver was, in fact, watching a Harry Potter video).

        The problem with the use of the term “autopilot” for Tesla’s system is that, to the ordinary person, it suggests that the car can drive itself, at least to some degree, freeing up the driver’s attention for doing other things. That appears to be what happened with the guy who T-boned the truck.

        • 0 avatar
          WheelMcCoy

          @DC Bruce – “I’m not seeing how autopilot can save lives, unless you assume the existence of an extremely impaired or incompetent driver.”

          I can see Autopilot helping if the driver passes out or has a heart attack. Or, in a heavy downpour, Autopilot can see the lane markers where a human would struggle. In thick fog, Autopilot’s radar can detect vehicles ahead while a human would be almost driving blind. Lastly, maybe the driver had a long day, is fatigued and cranky, and knows better than to drive angry. An R2D2 droid pilot would be most welcome.

          That said, I agree with you and many others who think auto-pilot is a poor name for the current state of the tech. It’s really still driver assist.

          • 0 avatar
            CH1

            @WheelMcCoy – Autopilot in its current form can only be used under good weather and lighting conditions. It requires the driver to be fully alert and ready to take control in an instant.

            Manual steering input deactivates Autopilot. A drunk or impaired driver will very likely deactivate the system early on because they lack the control to hold the steering wheel without steering.

            The crash prevention scenarios you mentioned are really only applicable to fully autonomous systems.

            Current Autopilot is just ACC plus lane keeping assist. ACC has no significant safety benefits. Accidents due to unintentional lane departure are only 2.4% of all crashes. Furthermore, Tesla models are fitted with a lane keeping aid that’s active even when Autopilot is not in use.

            The bottom line is the potential safety benefits of the current Autopilot system are quite small.

      • 0 avatar
        JimZ

        “And Mercedes is working on an inflatable three point belt that reduces chest injuries.”

        which puts them three years behind Ford.

  • avatar
    Corollaman

    Hard to believe that there are people out there who fall for this Tesla “autopilot” BS hook, line, and sinker. Plus they get mad at those who say it’s just a bunch of crap.

  • avatar
    EBFlex

    Musk is such a con man.

    His entire company is a house of cards producing low quality, electric fashion accessories that kill people.

    God forbid the company makes a profit. I guess profit and sustainability are less important than “autopilot”

    • 0 avatar
      tedward

      Con man might be strong. I agree with the negative sentiment, but guys with blinders on really can drive those around them to useful extremes. That’s probably exactly what was needed to actually create a new and growing domestic auto brand.

      The problem with guys like this is that if they aren’t moneybags with a vision, they are exactly who you don’t want to share a workspace with. They need to be the boss, or otherwise be the guy who makes money off of one of those pyramid-scheme sales operations. I think we’ve probably all met a few.

      • 0 avatar
        JimZ

        “Con man” is definitely unfair. I think he’s just too “cavalier.” There are very good reasons he had experienced engineers advising caution, and that’s because cars aren’t smartphones or PayPal. There’s only a small set of things you can fix on a car via an over-the-air update, but a much larger set of things that can fail and harm someone which can’t be fixed remotely. I get the idea there are a lot of people within Tesla who don’t quite get that latter part.

    • 0 avatar
      cornellier

      So much hate. EBFlex have you ever driven or even been in a Tesla?

      • 0 avatar
        EBFlex

        Me riding in one of these electric fashion accessories will not change the fact that Musk is a con man, that he cannot produce a profit, and that Autopilot is extremely dangerous; no responsible automaker would let its customers beta test such a system.

        It was bad enough when Ford had their customers beta testing My Ford Touch but this level of carelessness and recklessness is unfathomable.

      • 0 avatar
        JimZ

        I’ve been in two. I won’t be nearly as vicious, but I did get the impression they weren’t really that well screwed together. They rattled and squeaked like crazy inside, and there were various (different) trim misalignments throughout.

        But the car does something no one else does, which overshadows such things.

  • avatar
    cornellier

    Look at the TTAC article that followed this one, on red light cameras. If I understood correctly, it’s being put forward that it’s “un-American” to be told what to do at an intersection. I for one would rather negotiate an intersection with an autodrive Tesla than with the Brodozing plebs.

    • 0 avatar
      EBFlex

      Ignorance is bliss I guess…

    • 0 avatar
      SP

      So when the sleeping drunkard plows through the intersection and kills 5 people including himself, will you feel better because a ticket was mailed to his former address?

      (Or when the texter ignores the “NO TURN ON RED” signs and pulls out into traffic, will that ticket make you feel better?)

      Because that’s pretty much how red light cameras work. They don’t stop accidents from impaired or inattentive drivers, because those drivers won’t notice or care that a red light camera exists. They can only encourage alert drivers to be more alert, through threats of harsh financial punishments.

  • avatar
    brandloyalty

    Certainly seatbelts and airbags are accepted by any reasonable person, despite the fact that some will be injured or killed by seatbelts and airbags in circumstances where they would otherwise be unscathed or survive. In this context, the question is how many lives are saved by current automotive autopilot systems compared to how many die. Implicit in the question is how positive this ratio is. CH1 has addressed this question earlier.

    When first exposed to them, I thought backup sensors were superior to reverse cams. Reverse cams cause you to use your eyes to look at the monitor, while you forget about front end swing and overall vehicle positioning. Using one’s ears to monitor backup sensors frees the eyes for these other tasks. Then I realized that backup sensors won’t protect you from backing into a ditch or over a cliff. This sounds like a problem with the Tesla self-parking. You’d think it would be easy to fix. Even old Roombas are smart enough to not fall down stairs.

  • avatar
    DAC17

    Sounds to me an awful lot like the VW scandal. Various employees say something can’t be done, but overbearing boss doesn’t want to hear about it. Of course, Musk seems to get a bye for almost everything, because of the company’s “green cred”, so maybe he’ll skate through this one.

  • avatar
    VoGo

    If anyone read a credible news source* on the issue, they would see that the Tesla autopilot feature had nothing to do with the deadly crash. Autopilot doesn’t control the brakes, only the steering.

    Which means that 90% of the comments on this and all the other related articles are completely wrong.

    *which could include NYT, CNBC, UPI,…

    • 0 avatar
      Pch101

      Tesla is considering whether the radar and camera input for the vehicle’s ***automatic emergency braking system*** failed to detect the truck trailer or the automatic braking system’s radar may have detected the trailer but discounted this input as part of a design to “tune out” structures such as bridges to avoid triggering false braking…

      http://www.reuters.com/article/us-tesla-autopilot-congress-idUSKCN10928F

      • 0 avatar
        DenverMike

        There’s the invisible trailer, but the tractor plus landing gear crossed its path a half second before impact. Were they invisible too? Or did it calculate a ‘miss’, along with the trailer’s dual tandems?

      • 0 avatar
        mcs

        Just speculating, but Mobileye is a monovision system. Monovision doesn’t cut it. A stereo-camera based system could have calculated the distance between the ground and the bottom of the trailer and realized it couldn’t drive under it. I also don’t buy the white trailer against the white sky theory. The system should have seen the tires of the truck. The space between the bottom of the trailer and the ground would have shown up as well. I do believe the theory that it thought it was a highway sign or bridge – a monovision system will make those kinds of mistakes.
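
        The stereo argument above comes down to simple geometry. As an illustration (the functions and numbers here are invented for the example, not Mobileye’s or Tesla’s actual pipeline), two cameras a known baseline apart turn pixel disparity into distance, and distance plus the feature’s pixel height turns into height above the road, which is exactly what you’d need to decide whether a trailer’s underside clears the car’s roof:

        ```python
        # Illustrative pinhole stereo geometry; not any real production system.
        def stereo_depth(focal_px, baseline_m, disparity_px):
            """Distance (m) to a feature: Z = f * B / d."""
            if disparity_px <= 0:
                raise ValueError("feature must shift between the two views")
            return focal_px * baseline_m / disparity_px

        def object_height(depth_m, focal_px, px_above_horizon, camera_height_m):
            """Height above the road (m) of a point imaged px_above_horizon
            pixels above a level camera's optical axis."""
            return camera_height_m + depth_m * px_above_horizon / focal_px
        ```

        A single camera can recover the angle to the trailer’s underside but not the range, so it can’t compute that height without guessing the object’s size, which is why a monovision system can confuse a trailer with a distant overhead sign or bridge.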

    • 0 avatar
      Kenmore

      “Autopilot doesn’t control the brakes, only the steering.”

      Then given that “piloting” a car can reasonably be expected to involve an occasional necessary use of the brakes, how is Autopilot not a criminally egregious misnomer?

    • 0 avatar
      tedward

      C’mon, VoGo. If you were to substitute “suite of driver assist programs” for Autopilot, it would be fine. Tesla choosing to spin off Autopilot from the rest in their terminology is a difference that should only matter to Tesla. Besides, if anything, it shows how misleading a name like Autopilot is in the first place.

    • 0 avatar
      accord1999

      “Autopilot doesn’t control the brakes, only the steering.”

      Sure it does. On their website:

      “Autopilot enables Model S to match your speed to traffic conditions” which requires braking.
