Tesla Autopilot Crash Raises Questions About Autonomous Cars


It’s not easy to write about death.

Death, however, is a tragic and so far unavoidable part of automotive culture. In the United States in 2014, an average of 89 people died each day in car crashes. Worldwide, the numbers are far larger: An average of 3,287 people die every day on the roads.

Those are sobering numbers, and they’re even more powerful when you consider that every one of those deaths was someone’s mother, father, son, or daughter.

As common as deaths on the world’s roadways are, one tragic accident has made headline news for being the first known fatality in a car driving under autonomous control.

Have drivers already become too trusting of autonomous technology?

Joshua Brown died in a crash in Florida while relying on the Autopilot system of his Tesla Model S.

The Autopilot system, which uses sensors and cameras to detect potential obstacles on the road ahead, failed to detect a semi truck crossing in front of Brown’s car and never applied the brakes, according to Tesla’s blog post on the incident. The Model S drove at full speed under the truck’s trailer, and the bottom of the trailer struck the car’s windshield. Tesla notes that the circumstances were extremely rare and that, had the Model S collided with the back of the truck, the driver would likely have survived.

This naturally raises more than a couple of questions about self-driving cars. Two of the biggest are:

  1. Who shoulders the responsibility when an autonomous car gets into an accident?
  2. Can we trust autonomous cars?

Determining responsibility for the accident is difficult. Drivers must acknowledge a litany of warnings before Autopilot can be activated, and they’re reminded numerous times to keep their hands on the wheel and be ready to take control at any moment. A Forbes story, though, presents the arguments for why Tesla may be liable for the crash. One goes like this:

Further, most people don’t read the terms and conditions of their software agreements, which often take a law degree to understand. Even when they do read them, they don’t really understand what they’re getting into.

Does that mean we shouldn’t trust autonomous cars? A CNN story says,

Autonomous-driving technology has already been credited with saving lives. Most notably, safety regulators and major automakers have agreed to make automatic braking — which detects when a vehicle ahead stops and automatically applies the brakes if the driver doesn’t — standard equipment on all new cars.

Statistics show that this technology can prevent dangerous crashes that, in 2012, caused 1,700 deaths and half a million injuries in the U.S. Of course, no one claims it can prevent every crash.

Tesla takes that argument even further and says,

This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the U.S., there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles.
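
To put Tesla’s figures on a common footing, here’s a rough back-of-the-envelope conversion to fatalities per 100 million miles driven, using only the numbers quoted above:

  Autopilot: 1 ÷ 130 million miles ≈ 0.77 fatalities per 100 million miles
  U.S. average: 1 ÷ 94 million miles ≈ 1.06 fatalities per 100 million miles
  Worldwide: 1 ÷ 60 million miles ≈ 1.67 fatalities per 100 million miles

By that yardstick, Autopilot’s record comes in below the U.S. average, though a single fatality is far too small a sample to support firm statistical conclusions.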

There’s no question that autonomous cars, or cars equipped with automatic braking, greatly reduce the odds of a fatal accident. As of right now, though, there is no “death-proof” car. All drivers, whether they’re fully in control of their vehicles or temporarily handing driving duties over to the car, need to stay focused on the road, because even one death per day is too many.

Will you trust autonomous cars to keep you safe?

-tgriffith

Find Certified Pre-Owned Cars and Used Cars in your area at CarGurus.

