For the first time, a Google self-driving car is at least partly at fault for causing an accident.
On Valentine’s Day, a Lexus RX 450h equipped with Google’s self-driving technology edged a little too close to a city bus in Mountain View, California. The two vehicles collided at low speed.
Google has acknowledged at least 17 other accidents involving its cars, but this is the first time it has admitted at least some fault.
This news comes as a new AAA survey claims 75 percent of drivers wouldn’t feel safe in a self-driving car.
Automakers and tech companies are scrambling to make autonomous cars a reality, but are they forcing the issue before anyone is really ready for them?
Regarding the accident with the bus, Google said the car was in the right lane of a city street, about to turn right. After moving to the right side of the lane, it moved back to the center of the lane to avoid sandbags that had been placed around a storm drain. The bus, coming from behind, hit the left side of the car.
In a statement, Google said,
This type of misunderstanding happens between human drivers on the road every day. This is a classic example of the negotiation that’s a normal part of driving — we’re all trying to predict each other’s movements.
That’s where self-driving cars will have trouble. Programming them to respond like people is no simple task, but then again, humans aren’t exactly foolproof when it comes to avoiding accidents either.
The difference is that people tend to trust their own reactions more than they can trust a computer’s.
As with any new technology, it will take time for people to get used to self-driving cars. About 75 percent of drivers say they’re fearful of letting a computer take control, according to AAA. The organization surveyed 1,800 people and found that the majority of them feel that self-driving technology is too new and unproven to be reliable and trustworthy.
CNN ran a story that said,
John Nielsen, AAA’s managing director of automotive engineering and repair, said tests suggest drivers may be overestimating their own abilities. He also believes they will be more likely to trust self-driving cars as they become more familiar with features such as automatic braking or parking.
I can vouch for that one. I remember the first time I drove my now-wife’s 2013 Subaru Legacy with EyeSight and adaptive cruise control. Trusting that the car would sense the vehicle in front of me and slow down without my foot on the brake pedal was borderline terrifying. Today it’s a feature I’ve grown to trust, and I use it worry-free on a daily basis.
The same thing will happen with autonomous cars. I believe a human driver should always be at the wheel to take over in case of a malfunction, but those malfunctions will become few and far between.
Do you trust self-driving cars? If not, what would make you more likely to trust them?