Self-Driving Cars Facing Ethical Dilemmas
Self-driving cars have been a controversial subject for some time now. The biggest concern currently is that the cars can’t always make decisions that would prevent accidents.
The MIT Technology Review says that Christopher Hart, chairman of the National Transportation Safety Board, is one of the people who have expressed these concerns. He says that the National Highway Traffic Safety Administration will have to get involved and help create the rules and regulations for self-driving cars, such as having “designers of self-driving cars build in fail-safes for critical components of their vehicles” and addressing how the car will deal with potentially fatal situations. The example Hart gave was “a decision between a potentially fatal collision with an out-of-control truck or heading up on the sidewalk and hitting pedestrians.”
Alphabet and Uber are two companies that are testing self-driving cars, and as a result, California has established a set of regulations for the cars. These regulations include having “a safety driver always ready to take over” and requiring that companies keep records of the incidents in which the driver had to take over.
Ryan Calo, an expert on robotics law at the University of Washington, believes self-driving cars would not be able to make the same decisions a human would make in certain situations. Patrick Lin, a philosophy professor at Cal Poly State University, San Luis Obispo, says that we shouldn’t be so quick to dismiss the judgments of the self-driving car. He believes that with technologies such as sensors, artificial intelligence and facial recognition, cars will be able to make critical decisions.
Crash optimization algorithms are how self-driving cars will attempt to prevent accidents, or at least minimize their harm. A Slate article gives an example of a potential algorithm in which the car decides whether to hit a truck, a motorcyclist with a helmet or a motorcyclist without a helmet. While the article does not state definitively what the car would do in that particular situation, it goes on to say that people “would probably be reluctant to purchase self-driving cars that are programmed to sacrifice owners in some situations.” In addition, the car’s ethics may not necessarily agree with those of the driver.
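To make the idea concrete, a crash-optimization rule can be pictured as scoring each avoidance option by estimated harm and picking the lowest. The following is a minimal, entirely hypothetical sketch: the option names, the `estimated_harm` scores and the `choose_maneuver` function are illustrative assumptions, not taken from any real vehicle or from the Slate article.

```python
# Hypothetical sketch of a crash-optimization rule: each avoidance option
# carries an estimated-harm score, and the car picks the option with the
# lowest score. All names and numbers here are made up for illustration.

from dataclasses import dataclass

@dataclass
class CollisionOption:
    description: str
    estimated_harm: float  # higher = worse expected outcome (assumed scale)

def choose_maneuver(options):
    """Return the option with the lowest estimated harm."""
    return min(options, key=lambda o: o.estimated_harm)

options = [
    CollisionOption("hit the truck", 0.9),
    CollisionOption("hit the helmeted motorcyclist", 0.6),
    CollisionOption("hit the unhelmeted motorcyclist", 0.8),
]

best = choose_maneuver(options)
print(best.description)
```

Note how even this toy version exposes the ethical tension the article raises: if a helmet lowers a rider's harm score, the algorithm systematically targets the rider who took the safety precaution.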
Germany has just created a basic set of laws regarding self-driving cars and the AI behind them, according to an Inverse article. These basic laws include “property damage takes always precedence of personal injury” and putting the responsibility on the manufacturer should any incidents occur, and they will be expanded upon by an ethics commission created by Alexander Dobrindt, Germany’s transport minister.
According to the National Conference of State Legislatures, many states are forming their own laws regarding self-driving vehicles. At the moment, only California, Nevada, Utah, Arizona, North Dakota, Michigan, Tennessee, Louisiana, Florida and Washington, D.C. have laws about self-driving vehicles. In 2016, Alabama, Georgia, Hawaii, Maryland, Minnesota, Virginia and Washington tried to pass their own self-driving vehicle laws but failed.