July 17, 2015


Three people were injured when a car rear-ended Google’s self-driving Lexus on July 1 in Mountain View, California, The Detroit Bureau is reporting. It’s the 15th crash for the self-driving car and the first with injuries.

Three people suffered “minor whiplash,” Google’s Director of Driverless Cars Chris Urmson wrote, and the driver of the car that rear-ended the Lexus appeared to be at fault.

“Our self-driving cars are being hit surprisingly often by other drivers who are distracted and not paying attention to the road,” he wrote.

The robots will not look kindly on our inattention.

According to Google’s monthly report, the fleet of autonomous cars has traveled more than 1 million miles without human piloting, and the cars are averaging around 10,000 miles traveled each week.

Google says the autonomous cars have not been at fault in any of the 15 recorded accidents so far. Testers say the vehicles appear to be hit more often than national figures would suggest, in part because minor accidents in general are widely under-reported.

Earlier this month, Google sent two Lexus RX450h vehicles to Austin, Texas, for mapping and testing.


86 Comments on “Google’s Robot Car Crashed, Humans At Fault...”


  • avatar
    FAHRVERGNUGEN

    Great. All we need now are robot lawyers and robot juries.

  • avatar
    APaGttH

    /tinfoil hat on

    Would it be a stretch to say that Google bolsters its argument every time a distracted driver crashes into one of its robot cars?

    /tinfoil hat off

    • 0 avatar
      CJinSD

      15 crashes in a million miles sounds like a pretty high number. Human drivers average 185 crashes per 100 million vehicle miles traveled. That’s 1.85 crashes per million miles. Google’s autonomous cars are 8.1 times as likely to crash as human driven cars, no matter who Google wants to blame.

      http://www.caranddriver.com/features/safety-in-numbers-charting-traffic-safety-and-fatality-data
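      The arithmetic in this comment can be checked with a short script. The figures below are the commenter’s and the article’s (15 crashes, roughly 1 million autonomous miles, 185 crashes per 100 million human-driven miles), not independently verified data:

      ```python
      # Crash-rate comparison using the figures cited in the comment above.
      google_crashes = 15
      google_miles = 1_000_000  # autonomous miles claimed in the article

      human_crashes_per_100m_miles = 185  # commenter's cited national figure
      human_rate = human_crashes_per_100m_miles / 100  # per million miles -> 1.85

      google_rate = google_crashes / (google_miles / 1_000_000)  # 15 per million
      ratio = google_rate / human_rate

      print(f"Google:  {google_rate:.2f} crashes/million mi")
      print(f"Average: {human_rate:.2f} crashes/million mi")
      print(f"Ratio:   {ratio:.1f}x")  # ~8.1
      ```

      The 8.1x figure does follow from those inputs; whether the inputs are comparable is exactly what the replies below dispute.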

      • 0 avatar
        Lack Thereof

        False comparison.

        Per-mile accident rates are massively higher in city driving than in highway driving.

        You can’t compare the national average accident rate of humans driving mostly on freeways, to the accident rate of autonomous cars that spend all day navigating city streets.

        Find some more specific accident statistics and try again.

        • 0 avatar
          CJinSD

          You’re making up strawmen. You know nothing of the average driver’s driving cycle and nothing about Google’s rolling accident scenes’ driving cycles, and if you do you’ve done nothing to support your assertion.

          • 0 avatar
            ckb

            “You know nothing of the average driver’s driving cycle and nothing about Google’s rolling accident scenes’ driving cycles, and if you do you’ve done nothing to support your assertion.”

            Do you? You asserted the Google cars match the average driver well enough for a direct comparison. If that’s true, they’ve done a hell of a job recreating the experiences of 150M drivers with just a dozen cars.

          • 0 avatar
            Pch101

            It’s no secret that freeways/motorways are the safest road design that we have. No cross-traffic, few head-on crashes, with clear lines of sight and few curves. That’s one reason why we build them: there are fewer opportunities to hit each other and to lose control.

          • 0 avatar
            CJinSD

            And how do you know what the percentage of overall miles recorded on limited access highways are compared to the percentage of Google’s miles recorded on limited access highways?

          • 0 avatar
            Pch101

            What we know is that the Google cars have had no liability in the crashes.

          • 0 avatar
            CJinSD

            We left coincidence behind almost a dozen collisions ago. If the average autonomous car has a claim, and so far it would, then fault won’t save them from astronomical rates.

          • 0 avatar
            Pch101

            Google has had 14 crashes over 1.8 million miles

            In the Virginia Tech naturalistic study, in which 100 cars had all of their activity recorded, the drivers covered 2 million miles and had 82 crashes, of which only 15 were reported to the cops.

            I’m sure that this data will not cause you to rethink your position, since you never thought about it in the first place.

          • 0 avatar
            CJinSD

            Only 1,057,962 miles have been covered in autonomous mode, and nowhere does it say that all of them were on public highways. The 15 crashes we know about were on public highways though, so the truth is probably much worse than has been reported.

            As for VTTI, here’s what they said about your 82 crashes: “15 police-reported and 67 non-police-reported crashes which included low “g” events such as struck or ran over curbs and parking blocks (some vehicles involved in multiple crashes)”

            In other words, they had half the accident frequency as Google’s autonomous cars in a best case scenario. Realistically, lots of those 1 million miles were racked up on closed test tracks, making the accident rate on public roads even higher. Would Google report accidents that happened outside the public eye? Maybe, maybe not.
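            The per-million-mile rates being traded in this sub-thread can be laid out explicitly. These are the commenters’ own figures (82 total and 15 police-reported VTTI crashes over 2 million miles; 15 Google crashes over the 1,057,962 autonomous miles cited above), taken at face value:

            ```python
            # Rates implied by the numbers traded in this thread (commenters' figures).
            vtti_miles = 2_000_000
            vtti_total = 82           # all recorded crashes, incl. minor "low g" events
            vtti_police = 15          # police-reported crashes only

            google_miles = 1_057_962  # autonomous miles cited above
            google_crashes = 15

            def per_million(crashes, miles):
                """Crashes per million vehicle miles traveled."""
                return crashes / (miles / 1_000_000)

            print(per_million(vtti_total, vtti_miles))        # 41.0  all VTTI incidents
            print(per_million(vtti_police, vtti_miles))       # 7.5   police-reported
            print(per_million(google_crashes, google_miles))  # ~14.2 Google
            ```

            On these inputs, Google’s rate sits between the VTTI police-reported rate and the all-incidents rate, which is why the two sides of this argument each claim the comparison favors them.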

          • 0 avatar
            Pch101

            Your inability to understand stuff that you read explains a lot.

          • 0 avatar
            Lack Thereof

            If you read the articles linked in the post, you’d know that Google is averaging “10,000 self-driven miles a week, mostly on city streets.”

            You would also see mentions of the high incidence of accidents on city streets vs. highways… something that everyone is taught in Driver’s Ed (or at least was 20 years ago), is frequently mentioned in wear-your-seatbelt PSA’s, and is all over insurance companies’ safe-driving promotional materials.

            I wouldn’t think I’d need to provide that info.
            But you don’t need to read the linked materials to troll, so why bother? Reading the information would slow down your post rate.

      • 0 avatar
        WheelMcCoy

        In the linked post from Google’s director, the article mentions that crash statistics are calculated from police reports, so the actual number of crashes is under-reported. There are a lot more fender benders happening.

        The 15 crashes into Google’s cars were *not* on police reports. If one were to go by police reports alone, then crashes involving Google autonomous vehicles would be 0.

        That said, I know at least 2 drivers who tend to leave their trunk mounted bicycle racks on, even when they aren’t carrying bikes. That seems to force other drivers — even inattentive ones — to maintain a respectful distance.

        • 0 avatar
          Pch101

          The Virginia Tech naturalistic study was revealing because it could capture everything. And that study made it quite clear that most incidents are never reported to the police (which shouldn’t surprise anyone.)

  • avatar
    Pch101

    There is a lot of data on serious crashes. We know quite a bit about fatals because they are investigated.

    The minor brushes are surely under-reported by wide margins. The little things do not involve the police and may not even involve insurance companies.

    The NHTSA-Virginia Tech naturalistic study was interesting because everything that the participants did was documented. As it turns out, some people have a real talent for minor crashes and near-misses.

  • avatar
    28-Cars-Later

    Those pesky humans are always causing trouble, why do we put up with them again?

  • avatar
    SCE to AUX

    I don’t see how autonomous cars can succeed:

    1. If the mfrs are held responsible for crashes, they will never build them.

    2. If the occupants are held responsible for crashes, they will never buy them.

    • 0 avatar
      Lack Thereof

      The solution is to make sure autonomous cars do not cause crashes. Then there will never be a liability problem.

      • 0 avatar
        ckb

        “The solution is to make sure autonomous cars do not cause crashes.”

        So far so good. Of all the stories I’ve seen with some variation of “Autonomous car crashes X times” in the headline, the second paragraph always reveals it was a human at fault.

        Case in point, this story’s headline should read “Human driver crashes into Google Robot car – Again” instead of making it sound like the car’s fault.

    • 0 avatar
      Prado

      Insurance rates. That will be a major determining factor in success. If it comes down to self-driving cars being a lot less likely to be liable for an accident than a human, the rates will reflect that and people will gladly move over to the passenger seat to save a few hundred a year.

      • 0 avatar
        CJinSD

        Insurance companies aren’t that clueless. Even if simple fault is assigned to the other driver, their statistical analysis will tell them that there is something about the autonomous cars that has a causal relationship to an accident rate over eight times the norm.

        • 0 avatar
          jmo

          Yes, but that will just increase the rates on non autonomous cars.

          • 0 avatar
            CJinSD

            Not so. Insurance companies look at average claims by vehicle. Fault is only a factor in assigned risk. If you insure a car that has 8 times the claims expenses as any other car, your rate will reflect that, no matter who paid the claims on similar cars. All insurance companies care about is what costs them money, and these cars look likely to cost insurance companies unprecedented sums.

        • 0 avatar
          Pch101

          You need to go to actuary school.

      • 0 avatar
        319583076

        The price of man in motion is the occasional collision – the only way to ensure no crashes is to eliminate motion. Good luck.

    • 0 avatar
      Pch101

      Point number 1 is the essence of the problem: Why would OEMs want to own the liability for crashes?

      • 0 avatar
        jmo

        Because they can do so profitably?

        Adverse selection and all that: it might cost $250/year to insure a self-driving car and $2,500/year to insure a human-driven car. Increase the price of the autonomous car to pay for offloading the liability onto a consortium of insurers and you’re good to go.

        • 0 avatar
          Pch101

          As an OEM, I would want to own zero liability.

          There will be crashes. They get absolutely nothing from taking responsibility for them.

          • 0 avatar
            jmo

            Why?

            If you can offload it onto Swiss Re at a profit?

          • 0 avatar
            jmo

            “They get absolutely nothing from taking responsibility for them.”

            Higher selling prices as the cost to insure a self driving car, as a consumer is $50/month, and the cost to insure a traditional car is $250/month. So, they can sell their cars for at least $12k more.

          • 0 avatar
            Pch101

            It’s a matter of marginal cost. They’re increasing their costs without any benefit. They aren’t going to sell any more cars or be able to charge any more for them.

            In any case, this sounds like a pipedream. I suspect that this will turn into a form of super cruise control. A sober licensed driver will still need to be behind the wheel and will be expected to intervene when the systems fail, as they will on occasion.

          • 0 avatar
            jmo

            “or be able to charge any more for them.”

            Sure they can, they can charge more for them because the cost to insure them will be so much lower. Most people are only looking at the monthly cost anyway.

          • 0 avatar
            Pch101

            You’re really going out on a limb with that one.

            This technology, if it works, will become a commodity in no time. And as noted, there is not much reason to expect that drivers will ever be let off the hook for liability — if your car strikes another, “wasn’t me” won’t cut it as a defense.

  • avatar
    DenverMike

    Robots can’t calculate risk like making a legal turn where the flow of traffic from behind has to come to a stop for you. No law against abrupt stops either. I know a gal that was rear ended 5 times in ten years of driving, permanently disabling her, but never her fault.

    • 0 avatar
      ckb

      Sure they can. Just use the rear-facing camera/radar that’s probably already on the car to calculate the speed of drivers behind, like they probably already do, add in the estimated reaction time, and there’s your calculation. Seems it’s your friend who can’t calculate that risk.

    • 0 avatar
      ajla

      That’s an interesting point.

      If someone is tailgating me I can signal earlier and brake lighter/earlier before a turn. And if someone is drafting like Talledega I can cancel the turn altogether. Can the Google car do the same things? I assume it has some way to sense the situation behind it.

      • 0 avatar
        Pch101

        If you had a world full of autonomous cars, then there would be no tailgating.

      • 0 avatar
        brickgeek

        It can certainly sense that. Looking at the sensor visualizations from the linked stories, it can be seen that they are building a 360° model of the environment around the car.

        It would be trivial to determine the following distance of the vehicle behind you and react accordingly. Not only that, but I expect that they would be able to determine vehicle mass and reaction speed accurately based on the past behavior of the following vehicle. As such, they could probably calculate in a conservative stop distance parameter for the following vehicle. All of this would be fed into the decision of whether to make the turn. That decision is probably reverified and altered as needed many times a second.

        • 0 avatar
          DenverMike

          You may be right. And that’s way more calculation than I could ever do. Except I passed the million-mile marker many years ago, yet nothing’s come close to nailing me from behind. A lot of near misses in most other types of accidents, though. Except I have much less control over what’s coming at me from the sides and head-on.

          • 0 avatar
            krhodes1

            @Denver Mike

            You are just lucky, not skilled.

            About 10 years ago I turned onto my street to find a cop with a car pulled over on my side of the road about 1/8th mile from the intersection. There was oncoming traffic, so I stopped a couple car lengths behind him, as the cop was half in my lane, as they generally do to protect themselves. While sitting there, a teenage girl turned onto the street and drove the 1/8th mile straight into the back of my Saab. Which, mind you, had the hazard lights on and was sitting two car lengths from a fully marked city cop car with blue lights flashing. “She didn’t see me.” Uh, she obviously wasn’t paying the slightest bit of attention to what she was doing.

          • 0 avatar
            DenverMike

            I’d say “luck” plays a minor role. I know I watch what’s going on behind me, sometimes more than what’s ahead. Maybe I would’ve been hit by that teen driver. Probably I would’ve steered it into the ditch instead. I try not to put myself in “harm’s way” as much as possible and leave a ‘gap’ enough between me and what’s in front of me, to jump out of the way if I’m about to be slammed. I’m always thinking that way. Call me paranoid.

            Yesterday on the freeway, the slow lane was coming to a stop from a backed-up off-ramp. Not my problem, I’m traveling in the 2nd lane in (#4), but my ‘spidey sense’ was *off the charts*. I start to move over to the #3 as I hear a car sliding (in the rain) in a full panic stop. I check my rearview once in the #3 and I was being passed (?!) on the right by a minivan not seeing its lane was stopped. The van was sideways and had managed to stop barely in time. If there was an accident and I was still in the #4, I could’ve been caught up in it.

            I just go out of my way to reduce risk. Obsessed by it really.

          • 0 avatar
            krhodes1

            No ditch. Granite curb to the side – you do NOT drive into those here, they will shred your tires, and too high to go over in a Saab anyway. Oncoming traffic in the other lane. Cop car dead ahead. I pulled forward a car length when I saw her coming, but it didn’t do much good. Sometimes you are just the target… Only $2K in damage, and it was a salvage title car anyway, so not that big a deal.

  • avatar
    Dan R

    I never get tired of Google’s regulatory and legal naivete.
    Just another window dressing project.

  • avatar
    RideHeight

    Smart kid in a bad school syndrome.

    Look out when the robots start breaking bad.

  • avatar
    White Shadow

    I can often influence the behavior and/or driving style of a car behind me simply by changing my own driving style according to what I see happening in my rear view mirror. Can an autonomous car do the same?

    Seriously, I’m almost always very aware of my surroundings while driving and I’m always scanning my rear view for that very reason. It’s part of being a defensive driver.

    And FWIW (knock on wood) I’ve never been rear-ended in my 30 years of driving.

    • 0 avatar
      Motorhead10

      ‘Seriously, I’m almost always very aware of my surroundings while driving and I’m always scanning my rear view for that very reason. It’s part of being a defensive driver.’

      Not trying to cause trouble, but while you are ‘almost always very aware’ (and even if we were ALWAYS aware), the machines are capable of seeing 360 degrees, for a greater distance, every millisecond the vehicle is powered on. As V2X capabilities increase (and they will) and adoption proliferates, the amount of data processed will far exceed what a human can do. And the machines don’t get distracted, tired, frustrated, hungry, angry, or reckless.

      I love all my cars, and driving in any form is a pleasure for me. But I personally believe the human is the problem. When driving, I am always trying my best to be safe and responsible, and stuff happens anyway. Most other drivers I see aren’t even trying.

      So if you ask ‘can an autonomous car do the same?’ I say – No. It will do it much better.

      • 0 avatar
        SCE to AUX

        “Listen, and understand. That terminator is out there. It can’t be bargained with. It can’t be reasoned with. It doesn’t feel pity, or remorse, or fear. And it absolutely will not stop, ever, until you are dead.”

        -Kyle Reese

        • 0 avatar
          brickgeek

          Or perhaps more pointedly:

          “Watching John with the machine, it was suddenly so clear. The Terminator would never stop. It would never leave him. It would never hurt him, never shout at him, or get drunk and hit him, or say it was too busy to spend time with him. It would always be there. And it would die to protect him. Of all the would-be fathers who came and went over the years, this thing, this machine was the only one that measured up. In an insane world, it was the sanest choice.”

          -Sarah Connor

      • 0 avatar
        White Shadow

        I was talking about today….not the future.

    • 0 avatar
      Pch101

      Roads full of autonomous cars won’t do the dumb things that people do.

      Assuming that they work, their main advantage is that they won’t behave like humans. That’s a good thing.

      • 0 avatar
        ClutchCarGo

        “Roads full of autonomous cars won’t do the dumb things that people do.”

        And I suspect that any possible higher incidence of accidents/mile for autonomous cars is due to this fact. Humans operating cars keep acting as though the cars around them are operated just as irrationally as their own cars, while the autonomous cars operate rationally.

        • 0 avatar
          Pch101

          The numbers would indicate that they crash less than average.

          And one would expect that a device or human who has 0% responsibility for the wrecks in which it/he/she was involved would be rear-ended much of the time, as that’s the one type of crash for which the striking party is almost always at fault and for which the struck vehicle almost always has zero fault.

          The Google car does not behave in ways that would get it involved in very many other types of crashes. It isn’t a human that is distracted, tired, irritated, in a hurry, embittered, involved in an argument or drunk.

    • 0 avatar
      krhodes1

      @White Shadow

      You are an aberration. The average driver might as well not even have rear-view mirrors for all the use they get.

      What seems to get missed around here is that self-driving cars don’t need to be perfect, they just need to be better than the average moron with a license. That is a SPECTACULARLY low standard. I drove in heavy summer weekend traffic home to Westbrook, ME from my office in Waltham, MA this morning. The amount of stupidity I saw in 2.5hrs makes me wonder how people manage to not just crash constantly. I’m not perfect, but at least I attempt to pay attention to what I am doing behind the wheel. Many don’t even try.

      The sooner self-driving cars get here the better. I will then take my Spitfire out on some lonely back roads in Maine when I want a manual driving fix, and do something more productive with my time when I have to slog home from the office. I stayed over an extra night rather than deal with FRIDAY summer traffic – Google was predicting 3.5hrs for the 129 miles home at 4pm yesterday. F that noise…

  • avatar
    mcs

    As Google starts cost-cutting, the autonomous car project will probably come under the scrutiny of the accountants. I think the program is on thin ice. How long before this thing makes money for them? Auto-pilot-type features are already trickling into the market from various auto manufacturers, so no sale there. Full autonomy is much further out, and automakers may very well arrive there first. It’s going to be a tough sell to Ruth Porat to keep the project going.

    http://www.wsj.com/articles/google-takes-stricter-approach-to-costs-1436827885

  • avatar
    Volt 230

    The next version of the Google robo car will have defensive bazookas installed in the rear, and will fire upon a car that is approaching too fast.

  • avatar
    TheEndlessEnigma

    I find it interesting these Google cars are being rear ended, what did the car do to contribute to this?

    When was the last time fault for an accident that involved one car rear-ending another was assigned to the driver that was hit? *VERY* rarely, yet the actions of the driver, or computer, of the car that is hit can set up the situation that leads to the accident.

    Example, and this happened to me. I was driving on a road marked 50 MPH, about 5-6 car lengths behind a Ford Explorer traveling at the same speed. Along this stretch of road there are no driveways, side roads, or safety shoulders, and there is a clear view of approximately 50 yards on both sides of the road; you may see where this is leading. The driver of the vehicle ahead of me, out of the blue, stood on their brakes, coming to a dead stop in the road. I hit the Explorer while attempting to stop and maneuver to avoid impact. It turns out the driver of the Explorer did this because they dropped their cell phone and it landed on the floorboard by their feet; in trying to reach over and pick it up they locked up their brakes. I was doing the speed limit and following at a safe distance, but I was assigned responsibility for the accident and issued a citation for reckless driving.

    I fought this in court and the judge changed the accident judgement to “no fault,” tagging both of our insurances. However, my point is this: it is very, very rare that anyone other than the driver hitting the car in front of them is assigned responsibility for the accident. Just because blame has not been assigned to Google does not mean their cars haven’t set up the conditions to create the collisions.

    • 0 avatar
      RideHeight

      As an old guy I admire your ability to stretch like that.

    • 0 avatar
      rpn453

      It was reasonable that both drivers share fault in that situation given that the other driver was stupid enough to admit that they slowed for something other than an animal crossing the road, but you really should make a habit of using more than 1.3 seconds of following distance. I would not feel safe driving that close behind someone.
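      The “1.3 seconds” figure appears to come from converting the 5-6 car lengths described above into a time gap. A rough check, assuming a car length of about 15-16 feet (that assumption is mine, not the commenters’):

      ```python
      # Following-time check for "5-6 car lengths at 50 mph".
      MPH_TO_FPS = 5280 / 3600       # 1 mph = ~1.467 ft/s
      CAR_LENGTH_FT = 15             # assumed average car length

      speed_fps = 50 * MPH_TO_FPS    # ~73.3 ft/s at 50 mph
      for lengths in (5, 6):
          gap_ft = lengths * CAR_LENGTH_FT
          print(f"{lengths} car lengths: {gap_ft / speed_fps:.2f} s")
      ```

      That works out to roughly 1.0-1.2 seconds of following distance, in the neighborhood of the 1.3 seconds cited, versus the 2-3 seconds commonly recommended.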

      • 0 avatar
        RideHeight

        I think it’s reprehensible that with good road conditions the rearendee was held even partially responsible.

        • 0 avatar
          rpn453

          I think TheEndlessEnigma deserves most of the fault but I think there may be some contributory negligence involved. I’d at least ticket the Explorer driver for driving without due care and attention.

          We can probably at least agree that the “no fault” judgement is reprehensible.

          • 0 avatar
            RideHeight

            OK, you’ve done a better job of identifying my core concern. Assignment of fault in good road conditions should always go to the driver who did the rear-ending.

            That the car in front did a meathead maneuver should be a separate and distinct judgment.

          • 0 avatar
            Pch101

            You’re always supposed to maintain enough braking distance so that you can stop. If you hit it, you own it.

    • 0 avatar
      Pch101

      “I was…following at a safe distance”

      If you had been following him at a safe distance, then you wouldn’t have hit him.

      It really doesn’t matter why he stopped. You’re still supposed to have enough braking distance to react to locking brakes ahead of you.

    • 0 avatar
      Counterpoint

      You got lucky with that judge. It was your fault. Leave enough space to stop next time.

      • 0 avatar
        Pch101

        He really did get lucky. Even though the other driver was not a model driver and probably violated a separate statute that does not allow one to stop unsafely, the lead car that braked had 0% fault for the crash.

  • avatar
    wmba

    Personally, I wouldn’t be surprised if the Google cars drive like Granny in her Corolla, because no doubt they obey all the rules programmed into them to a T and drive a mph or two under the limit, thereby generating single-file backed-up traffic on two-lane roads.

    Most of us have seen such behavior and fumed. Some people try to pass anyway, sometimes with bad consequences. Hence the rule in our Highway Code to pull over when traffic builds behind you. Of course, nobody pays any attention to that rule because nobody can remember it, or the one that forbids riding bicycles on the sidewalk for that matter, so will Google? At least they are actually continuously looking in the “rear-view mirror” and could implement it.

    Google blames the accidents on humans in other cars, but offers no proof that their procedures don’t affect the way other people react to them. That would indeed be bad for business, so analysis of that type is studiously avoided. No, it was just human error so far as they are concerned. And Granny hung up her reins at 86 never having caused an accident either, while leaving mayhem behind her everywhere she went. We have such a lady some doors down the street. A local legend.

    It would be fine to granny-drive if everyone else also drove like a neutered drone, but that is unlikely to happen for however many decades it will take for this autonomous stuff to come to full fruition. Until then these robots may well impede traffic rather than assist flow.

    At the point of full automation, drivers will still have to take over and drive when the car signals them to, and, being completely out of practice, will be useless at the skill anyway. We’d better have vehicles where tires never blow out, driving “apps” never require updating, and electronic failures never occur. As if.

    Of course, if Pch101 is to be believed, and a fine public service technocrat he would be, the driving schools/education of today are useless because they don’t turn out better drivers than just letting any-old-body have a go with instructor Uncle Fred holding forth on the basics. That’s the statistic. Whatever.

    For me, autonomous cars spell the end of differentiation between models and manufacturers. A plastic acrylic bubble with crap suspension and lousy brakes will suffice as it blindly reads the terrain, treating each drive as if it were the first one on that route. No need for traffic updates, but maybe the V2V will report on potholes. Whoop-de-do. Driving in winter? Not allowed.

    If ever there was a potential societal shift like this which seems to have had so little brainpower expended as to consequences of all sorts, but which the great multitude of the common lesser spotted prole has issued a great hurrah! for, I’m not aware of it.

    • 0 avatar
      chris724

      I thought they were limited to 25mph by law? Glad they’re not clogging up the roads around here.

    • 0 avatar
      Pch101

      Driving schools can teach you how to steer and other basic mechanical skills if you don’t already know how to do those things.

      Driving schools can’t teach you to not want to tailgate, get drunk, be overconfident or act aggressively. Bad driving is often a function of apathy, drowsiness, bad attitudes and/or intoxication, not a lack of talent for steering and braking. The autonomous car has the benefit of not acting like an idiot.

    • 0 avatar
      RideHeight

      “they obey all the actual rules programmed into them to a T and drive a mph or two under the limit, thereby generating single file backed-up traffic”

      This!

      They had it coming. Goody two-shoes bastards.

  • avatar
    ixim

    I don’t agree that most human drivers are bad. Most of us do at least one dumb thing every time we drive. Most of those mistakes/errors in judgment/failure to understand the road lead to zero consequences. These autonomous Google pods may have some utility as rather inelegant means of mass transit for commuters. But how will they meet the needs for anytime anywhere freedom of movement that the personal motor vehicle offers? Not to mention the pleasure of operating those machines? Cue the conspiracy theories starring Google/insurance companies/misguided politicians/corporate ogres.

  • avatar
    rudiger

    “The robots will not look kindly on our inattention.”

    And I, for one, welcome our new electro-mechanical overlords.

    I am quite curious how they’ll do at the helm of that Viper ACR that Baruth recently tested…

  • avatar
    ja-gti

    I figure autonomous cars will need electronics that function perfectly to drive perfectly.

    I doubt that the hardware will be able to perform as needed if it requires any active maintenance from the owner.

    My current 60k mile vehicle has had the automatic climate control go haywire (blowing max hot at max fan in the middle of the summer) and my automatic transmission wouldn’t shift past fourth because an ABS sensor ring has corroded. If this car was autonomous, it would have defaulted to stop mode (safety first!) and left me stranded who knows where. Yeah technology!

    Anybody have a computer that’s working as well now as it was ten years ago? Cause that’s what autonomous cars’ level of performance will have to be. After sitting outside all those years. In the winter.

  • avatar
    05lgt

    Do the self-driving cars do what we do and try to take the likely reaction of the car behind us into account when driving? I brake hard early, then reduce braking to wake up the car behind me when faced with unexpected slowing. Is that behavior built into these auto-drivers? It’s nice to declare results you don’t like invalid because someone else’s results are not accurate, unless you are actually trying to get better. This is disappointing.

    • 0 avatar
      sgeffe

      My Accord Touring (in the avatar) is equipped with Adaptive Cruise Control, and while it works as intended, some elements of it show some “v1.0-ness,” primarily, aggressive braking when a slower car cuts in. You have to have some awareness of what’s behind you, and give just enough throttle to override the ACC to gradually slow the car.

      I’ve heard that Collision-Mitigation Braking (autobrake) systems sometimes misread things as well, so in traffic, if your car is so-equipped, you may have to give the throttle the same nudge to override the braking.

      Imagine..another reason to hang up and drive! ;-)

  • avatar
    doublechili

    “Which is safer” discussions are pointless. I’m sure the Pods will eventually be safer once there is nothing but Pods. The Pods will = a de facto mass transit system. Of course it will be safer – like a Disney ride all over the world. In the meanwhile I suspect there will be a somewhat sloppy transition period.

    To me, the better discussion is whether we really want a world where individuals can’t drive. We’ll be safer, but will we be somehow less human?

  • avatar
    ixim

    Continuously check your mirrors. Constantly memorize every vehicle around you. Be aware that at any time, one of them might get in your way. Always have an escape route should you need it. The vehicle in front of you has the greatest influence over your progress. Stay far enough away from him so you can safely escape if necessary. Constantly plan for the road ahead. Do all of this automatically while maintaining your chosen speed. It can be done. Happy motoring!
