Self-driving cars may have to make moral decisions about who lives and who dies in a crash, according to a report.
“As you approach a rise in the road, heading south, a school bus appears, driving north, one driven by a human, and it veers sharply toward you. There is no time to stop safely, and no time for you to take control of the car,” USA Today explained. “Does the car: A. Swerve sharply into the trees, possibly killing you but possibly saving the bus and its occupants? B. Perform a sharp evasive maneuver around the bus and into the oncoming lane, possibly saving you, but sending the bus and its driver swerving into the trees, killing her and some of the children on board? C. Hit the bus, possibly killing you as well as the driver and kids on the bus?”
This moral dilemma has been widely discussed as self-driving vehicles advance.
According to USA Today, “Azim Shariff, an assistant professor of psychology and social behavior at the University of California, Irvine, co-authored a study last year that found that while respondents generally agreed that a car should, in the case of an inevitable crash, kill the fewest number of people possible regardless of whether they were passengers or people outside of the car, they were less likely to buy any car ‘in which they and their family member would be sacrificed for the greater good.'”
In its report, USA Today went on to claim that while self-driving cars “could save tens of thousands of lives each year,” consumer fears, particularly surrounding the crash dilemma, “could slow down acceptance, leaving traditional cars and their human drivers on the road longer to battle it out with autonomous or semi-autonomous cars.”
“Already, the American Automobile Association says three-quarters of U.S. drivers are suspicious of self-driving vehicles,” the paper noted.
As Breitbart previously reported, MIT’s Moral Machine interactive website lets users explore the moral quandaries presented in life-and-death decisions being made by autonomous vehicles.