Driverless Cars Lack Moral Judgement During Accidents
There must be clear guidelines about whether to save a passenger or pedestrians in case of an accident.

Adoption of autonomous vehicles offers social benefits such as reducing air pollution and eliminating up to 90 percent of traffic accidents. But even though such cars may reduce the probability of fatal crashes, they still face a dilemma: whom should they save, the pedestrians or the passengers? Autonomous driving systems will require programmers to develop algorithms that make critical decisions based more on ethics than on technology, according to a study published in the journal Science.

"Not all crashes will be avoided, though, and some crashes will require AVs to make difficult ethical decisions in cases that involve unavoidable harm. They are keen to see adoption of self-driving technology because of major social benefits," the researchers said in the study. "A lot of people will protest that they love driving, but us having to drive our own cars is responsible for a tremendous amount of misery in the world," Shariff told a conference call.

The programming decisions must take mixed public attitudes into account. In a survey conducted by the researchers, 76 percent of participants said it would be more ethical for a self-driving car to sacrifice one passenger rather than kill 10 pedestrians. Only 23 percent said it would be preferable to sacrifice the passenger when just one pedestrian could be saved, and only 19 percent said they would buy a self-driving car if it meant a family member might be sacrificed for the greater good.

The responses show that people want to live in a world in which everybody owns driverless cars that minimise casualties, but they want their own car to protect them at all costs.

There must be clear guidelines for when a vehicle should prioritise the life of its passenger over the lives of others, but it is not clear whether the public will accept them. A focus on protecting the public good may discourage people from buying autonomous cars and could further delay their arrival on the market. The problem, it seems, is more philosophical than technical: before we can put our values into machines, we have to figure out how to make those values clear and consistent.
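To make the programming dilemma concrete, here is a purely hypothetical sketch in Python. It is not drawn from the study or from any real vehicle software; the scenario numbers simply echo the survey's one-passenger-versus-ten-pedestrians example, and it shows that a developer must commit to one explicit rule or another.

```python
# Hypothetical sketch only: this is not how any real AV system is programmed.
# It illustrates why the "whom to protect" rule must be made explicit in code.

from dataclasses import dataclass


@dataclass
class CrashScenario:
    passengers_at_risk: int   # occupants harmed if the car swerves
    pedestrians_at_risk: int  # bystanders harmed if the car stays on course


def utilitarian_policy(s: CrashScenario) -> str:
    """Minimise total casualties, even at the passengers' expense."""
    return "swerve" if s.passengers_at_risk < s.pedestrians_at_risk else "stay"


def self_protective_policy(s: CrashScenario) -> str:
    """Always protect the occupants, regardless of how many pedestrians are at risk."""
    return "stay"


scenario = CrashScenario(passengers_at_risk=1, pedestrians_at_risk=10)
print(utilitarian_policy(scenario))      # "swerve" - sacrifices the passenger
print(self_protective_policy(scenario))  # "stay"   - protects the passenger
```

The survey suggests most people endorse the first rule for cars in general but prefer the second for the car they would actually buy, which is exactly the tension regulators and manufacturers would have to resolve.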

For 21st century moral philosophers, this may be where the rubber meets the road.
