Recently, Uber put self-driving cars back on the road, nine months after a woman was struck and killed by one in Arizona. The return is a pared-down version of the company’s self-driving car initiative and includes many added safety precautions.
Still, the decision is a controversial one for Uber, and it raises broader concerns about self-driving cars in general. Namely, are they safe or not? As of now, the answer is complicated.
Where Does Blame Fall When a Self-Driving Car Is in an Accident?
The debate over the safety of self-driving cars is complicated because perceptions can be skewed. For example, when a news headline states that someone was killed in a self-driving car crash, the assumption is that the technology caused the accident. That’s not always true. In fact, the overwhelming majority of accidents involving self-driving cars are caused by human actions.
Before discussing whether self-driving cars are safe, it’s important to understand exactly what a self-driving car is. When discussing specific incidents with specific vehicles, it’s even more important to know what kind of technology was actually deployed, as well as any contributing factors. Without that knowledge, people tend to assume that the new and unfamiliar technology is to blame.
A Quick Primer on Self-Driving Cars
As of now, there are no fully driverless cars on public roads, other than a few highly controlled exceptions. Instead, self-driving cars are simply vehicles in which technology automates some driving functions. At one end of the spectrum are cars with features such as lane assist and adaptive cruise control. Further along are cars considered ‘level 3’: these drive themselves, but a driver must be present to take over if the situation warrants. Finally, there are level 4 cars, which are fully self-driving and require no human intervention at all. These currently exist only as prototypes.
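The spectrum described above can be sketched as a simple classification. This is an illustrative model only, using the article’s own simplified tiers rather than the full official level definitions; the names and numbers here are assumptions for the sake of the example:

```python
from enum import Enum

class AutomationLevel(Enum):
    """Simplified automation tiers as described in the article.

    Loosely inspired by industry level numbering; names are illustrative.
    """
    DRIVER_ASSIST = 1   # e.g. lane assist, adaptive cruise control
    CONDITIONAL = 3     # self-driving, but a driver must be ready to take over
    FULL = 4            # no human intervention required (prototype-only today)

def requires_human_fallback(level: AutomationLevel) -> bool:
    # Per the article's framing, only fully autonomous (level 4)
    # vehicles need no standby human driver.
    return level is not AutomationLevel.FULL

print(requires_human_fallback(AutomationLevel.CONDITIONAL))  # True
print(requires_human_fallback(AutomationLevel.FULL))         # False
```

The key distinction the article draws is captured in that one function: everything short of level 4 still depends on a human as the fallback.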
What Makes Self-Driving Cars Safe?
Other than cost savings, one of the main motivators behind automated car technology is safety. The fact is that 94% of accidents in manually driven cars are caused by human error. Indeed, many accidents involving self-driving cars consist of autonomous vehicles being struck by cars driven by humans.
Every manufacturer involved in the design and production of automated cars cites safety as a major factor. The bottom line is that an automated vehicle, driven by sensor technology and AI, is capable of making safer driving decisions in more situations than a person can.
In addition, car manufacturers are continually working to improve this technology, so self-driving cars will only become safer as time goes by. They aren’t affected by lapses in judgment or by emotions. Once they incorporate knowledge, they don’t forget it.
Compare a self-driving car with an inexperienced teen driver, an exhausted driver, or an angry driver. The self-driving car is not going to be influenced by anything other than its understanding of the rules of the road and the input it receives from its sensors.
What Makes Self-Driving Cars Unsafe?
Of course, it’s silly to debate this issue without acknowledging that there absolutely have been technology failures in self-driving cars, and that those have led to tragedy in some cases. It’s imperative to understand where potential dangers lie, and come up with solutions to avoid them.
One of the first things to discuss here is cybersecurity. There have been cases where hackers have taken control of automated vehicles, causing accidents and other hazardous situations. Until better protections are put in place, this will continue to be an area of concern.
There’s also the failure of car owners to properly maintain their automated vehicles. Gabriel Levin, Partner/Co-founder at The Levin Firm, says, “When an individual or business invests in technology like this, they have an obligation to behave responsibly and take action to reduce the potential of harm to others. If someone is hurt in an accident because a car owner failed to update their vehicle’s firmware or didn’t follow operating guidelines, the car owner could be liable. Commercial businesses and fleet owners have a further obligation to ensure that their contractors and employees operate these vehicles safely as well.”
That last point illustrates another area of danger in self-driving cars: the potential for driver error. Since all but a few experimental vehicles on the road still require driver involvement, human error remains a risk. For one thing, without proper education, drivers may overestimate the capabilities of driverless vehicles. There have been fatalities in which a driver was too inattentive to realize they needed to take back control of their vehicle.
Ironically, safety issues can also occur when drivers attempt to take over the controls when they shouldn’t. This often boils down to the perception that people will make better decisions than a computer, which simply isn’t true. In all likelihood, the safest automated vehicle may be a ‘level 4’ vehicle that allows for no human intervention.
Testing and Training Are Key
It’s pointless to deny that self-driving cars are inevitable. There may be debate about when it will happen, but eventually we will share the road with these vehicles. The best way to ensure everyone’s safety is rigorous, thorough testing and effective driver training.
Unfortunately, serious issues are emerging with many of the testing protocols currently in place. For one thing, many current pedestrian-detection systems haven’t been effectively tested on people of color, and vehicles may be unable to reliably detect people with darker skin.
There’s also not enough communication or common standard-setting among manufacturers. Because so much of the development process is competitive and proprietary, manufacturers don’t share information with one another, so none of them benefit from the testing processes and discoveries the others make during safety testing.
Another issue to consider is training. As of now, the largest focus is on training fleet drivers and other professionals to safely operate self-driving vehicles. Again, there are no universal standards for this. There are no operator’s licenses or commercial driver endorsements for operating autonomous vehicles, no driving tests, and no classes anyone must complete before driving one.
In the future, as this technology reaches the general public, this could cause real problems. Is someone qualified to drive a self-driving car simply because they have a license to operate a regular vehicle? If not, how do we go about testing and licensing drivers to ensure they have the skills to operate these cars safely?
The Environmental Impact
One of the benefits of self-driving cars is that they remove barriers for people who currently cannot drive, including people with disabilities. That’s undeniably a good thing, but it will also put more vehicles on the road, which could have an environmental impact. Even if that doesn’t increase accidents, the resulting air pollution and traffic congestion can cause illness and injury.
Driverless vehicles are potentially safe. In fact, they have the potential to be significantly safer than manually driven vehicles. However, for that potential to be realized in real-life situations, there must be effective driver training, adequate testing, technological improvements, and a better understanding of this technology.