
(In keeping with our promise to share thought-provoking fodder with our readers, we sometimes run articles published by TTAC’s sister sites. This look at recent crashes involving self-driving Chevrolet Bolts, penned by GM Inside News head honcho Michael Accardi, touches on a number of themes we’ve explored in these pages. Are humans really to blame for all of the accidents involving “perfectly safe” autonomous vehicles, or is the real picture murkier than that? Read on.)
The autonomous Chevrolet Bolts that GM’s self-driving startup has running around San Francisco were involved in 22 accidents during 2017 – none of which were the software’s fault (legally, that is).
Cruise Automation has been using a fleet of self-driving Chevrolet Bolts to log autonomous miles in an urban environment since GM purchased the company for more than $1 billion in 2016. When you’re trying to disrupt personal transportation as we know it and develop a new technology standard, there are bound to be a few incidents.
But this hybrid model of humans and algorithms sharing the road is more complex than simply apportioning blame based on the law, isn’t it? None of the 22 incidents involving GM’s Cruise fleet were serious, but a majority of them were caused by a fundamental difference in the way autonomous and human drivers react.