Driverless Technology Faces Its Ultimate Problem: The Drivers
With driverless technology making inroads and cars becoming smarter, driving would seem to be getting easier and safer. But a recent crash involving a Tesla Model S has heightened the tension between self-driving technology and the drivers who use it.
A Tesla Model S cruising at highway speed rammed into the back of a fire truck parked on the highway. The driver claimed the car was running on Tesla's Autopilot system. But the question remains: shouldn't he still have seen, and stopped for, the fire engine up ahead?
When cars start doing the work, drivers overestimate the capabilities of their vehicles and disengage entirely. Companies selling automated cars increasingly need to educate drivers about what those cars can and cannot do.
A car's autonomy is rated on a six-level scale, from Level 0, a conventional car with no automation, to Level 5, where drivers enter a destination and simply sit back for the ride.
Nearly all autonomous features available on showroom floors today are still at Level 2. Cars can park themselves and monitor lane lines, but watching the road should still be a human task. Still, some drivers assume Level 2 vehicles can “take control,” which could cause problems on the road.
“We are at the beginning of a brave new world, at the advent of innovative technologies in autonomous vehicles,” said Joan Claybrook, a former administrator of the National Highway Traffic Safety Administration.
Claybrook said driverless technology is at a dangerous moment, one where drivers don't quite understand what their car's systems can actually do.
“The interplay between humans and technology, technology being in control of driving, puts us at risk. Driver distraction could have been a deadly factor in the Tesla crash [with the fire truck],” Claybrook said.
New “Enhanced Autopilot” technology is a $5,000 option for Model X and Model S Tesla vehicles, which each run customers well over $60,000. These vehicles are all equipped with the capability for full autonomy, though that won’t be available for years.
Tesla's website touts the power of its Autopilot technology: the car can change lanes, transition between freeways and even be summoned from a garage. But the Model S manual outlines circumstances in which the system might fail, such as when the car is approaching a stationary object.
Lawmakers could require automakers to educate consumers about what their autonomous cars can and cannot do. But Jason Levine, an advocate with the Center for Auto Safety, said it is hard to imagine an education program covering the “endless variations” of autonomous technology.
“Education has not been effectively addressed by those that are working to get the cars to the marketplace without thinking through the ramifications,” Levine said.
“Consumers need to know what they can and cannot do, and how they can interact with vehicles, or we’ll have consumer pushback,” said Sen. Gary Peters, a Michigan Democrat who co-authored the bill.
Waymo, Alphabet's self-driving car company, partnered with organizations such as Mothers Against Drunk Driving and the National Safety Council to form the “Let's Talk Self-Driving” campaign. The campaign promotes an auto recall database and a website called MyCarDoesWhat.org to educate drivers on new safety technologies that can prevent crashes.
What do you think? Is it necessary for people to be educated on what their cars can and cannot do? Leave a comment below!