The Rise of Self-Driving Cars and When They Will Take Over Conventional Driving
A few years ago, the much-touted driverless-car revolution seemed to be in full swing. Every major carmaker had plans to bring an autonomous vehicle to market. Now, in 2018, the revolution is well behind schedule. There have been plenty of rumblings in the automotive market, but most driverless cars are still in testing, or have not been deemed roadworthy enough to carry passengers safely.
So what’s up with self-driving cars, and is it only a matter of time before they’re upon us? Today, we explore this revolution and give you everything you need to know.
Almost all carmakers set 2020 as the year self-driving cars would take to the roads, poised to dominate them by 2030. But 2018 is ending, and no breakthrough vehicle is on the horizon to meet that expectation. Instead, self-driving cars are pumping the brakes, as their technology has been unable to keep up with unpredictable real-life roads.
Prototypes by Ford, Tesla, and the Google affiliate Waymo are barely smart enough to pass for a learner’s license, much less be allowed on the roads. There are other issues too, like cars reacting to vehicles parked by the roadside as though they were moving, or difficulty maneuvering through tricky oncoming traffic. And there are far more serious problems: a self-driving Uber was involved in a fatal accident with a 49-year-old pedestrian crossing the street.
General Motors President Dan Ammann calls developing AVs that can navigate through urban traffic "the engineering challenge of our generation." Klaus Fröhlich, BMW's head of research and development, puts it more candidly: "Everyone in the industry is becoming more and more nervous that they will waste billions of dollars."
Most self-driving cars are equipped with a technology called lidar, or “light detection and ranging.” Lidar sensors use rapid light pulses to map a vehicle’s surroundings in 360 degrees. These cars are also equipped with cameras, radar, and GPS antennas that feed them the information they need to drive. But even the most advanced technology has performed inconsistently on the road. Lidar can fail in heavy rain or snow, and sometimes misses the painted lines that divide lanes.
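The core idea behind lidar ranging is simple time-of-flight math: fire a light pulse, time the echo, and convert that into a distance. As a rough illustration (the function name and the 200-nanosecond figure are our own examples, not from any specific sensor’s documentation):

```python
# Minimal sketch of the time-of-flight math behind one lidar range
# measurement: the sensor fires a light pulse, times the reflection,
# and converts the round trip into a distance. Repeating this for
# thousands of pulses per second, swept around the car, builds the
# 360-degree map described above.

C = 299_792_458  # speed of light in metres per second

def range_from_echo(round_trip_seconds: float) -> float:
    """Distance to the object that reflected the pulse, in metres."""
    # The pulse travels out and back, so halve the round-trip distance.
    return C * round_trip_seconds / 2

# A pulse that returns after about 200 nanoseconds hit something
# roughly 30 metres away.
print(round(range_from_echo(200e-9), 1))  # -> 30.0
```

The catch, as the article notes, is not the math but the physics: raindrops and snowflakes reflect pulses too, polluting the map with false echoes.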
"We're not even remotely close to being able to be truly autonomous in diverse conditions," said Austin Russell, CEO of lidar manufacturer Luminar, which can be pretty telling of an issue. Drivers rely mostly on human cues that technology has not been able to detect as of yet; the waving of an arm from another car, eye contact with pedestrians, and even some unspoken contact with other drivers on the road that help to predict their behaviors of driving. These are all cues that lidar or other sensing technology used in self driving cars cannot yet predict. However, there are engineers who believe that the technology can use their experience on the roads to “learn” more about these cues to help it function better on the roads.
So far, tests of self-driving cars have gone reasonably well, but only on roads that are not too congested. Since testing began, there have been more than 100 accidents involving self-driving cars, most of them minor and caused by other drivers or pedestrians. Waymo, which operates some of the most advanced self-driving cars on the market, has tested about 600 cars in 25 US cities and run computer simulations covering 7 billion miles of driving.
Carmakers insist that self-driving cars are safer than human-operated ones. The argument is that computers don’t drive drunk, get distracted by phones, or fly into road rage. Researchers say 94% of car accidents are caused by human error, and yet surveys show that only a quarter of people would feel safe in a car without a driver. That mistrust clearly won’t go away any time soon.
But beyond the technical challenges, there are ethical problems to consider as well. How does a computer have ethics, you might ask. Well, consider this: should a car whose brakes fail crash into five people in a crosswalk, or veer off the road and hit one person on the sidewalk instead? Should the car stop short for an animal in the road, even if doing so endangers the humans in that vehicle and in other cars? A human driver makes these judgment calls by instinct in the moment; a computer’s answer has to be programmed in long before the crash.
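To see why “programmed in advance” is so uncomfortable, imagine the crudest possible rule a programmer could write. This is a hypothetical toy, not any carmaker’s real logic; the function and its inputs are invented for illustration:

```python
# Hypothetical toy illustration of a purely utilitarian crash rule:
# given estimated casualties for each option, pick whichever path
# harms fewer people. Writing even this one line commits the car to
# a contested moral position, which is exactly why such choices are
# argued to need a public conversation rather than a quiet default.

def choose_path(casualties_if_straight: int, casualties_if_swerve: int) -> str:
    """Return 'swerve' only if swerving is expected to harm fewer people."""
    if casualties_if_swerve < casualties_if_straight:
        return "swerve"
    return "stay"

# The crosswalk dilemma from above: five ahead, one on the sidewalk.
print(choose_path(5, 1))  # -> swerve
```

Note what the rule quietly ignores: whether the one person is a child, whether the five walked against the light, whether the occupants count. Every refinement is another ethical decision baked into code.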
“Before we allow our cars to make ethical decisions," MIT researchers recently concluded, "we need to have a global conversation to express our preferences to the companies that will design moral algorithms."
So when will self-driving cars become a reality? There are still countless issues to address. Traffic laws, for one, would have to be revamped to accommodate them. And in an age of cybercrime, computerizing transportation invites all sorts of cybersecurity dangers: a car’s controls could be hacked and its movements monitored. The biggest hurdle, though, is trust. People need to be able to trust a robot to drive instead of a human. Andrew Moore, head of computer science at Carnegie Mellon University, says the self-driving car will not become popular "until there's proof that it's much safer, like a factor of 100 safer, than having a human drive."
Despite the setbacks and delays, carmakers remain confident that they will be able to mass-produce self-driving cars within the next few years. They predict that road-ready self-driving cars will transform the market as much as the first automobiles did. And that will be a sight to behold.
What do you think? Leave a comment below!