
Should self-driving cars have ethics? : NPR



New research explores how people think autonomous vehicles should handle moral dilemmas. Here, people walk in front of an autonomous taxi demonstrated in Frankfurt last year.

Andreas Arnold/Bloomberg via Getty Images



In the not-too-distant future, fully autonomous vehicles will be driving on our roads. These cars will have to make split-second decisions to avoid endangering human lives, both inside and outside the vehicle.

To gauge attitudes toward these decisions, a group of researchers created a variation on the classic philosophical thought experiment known as the "trolley problem." They posed a series of moral dilemmas involving a self-driving car whose brakes suddenly fail: Should the car swerve to avoid a group of pedestrians, killing the driver? Or should it kill the people on foot but spare the driver? Does it matter whether the pedestrians are men or women? Children or the elderly? Doctors or bank robbers?

To put these questions to a large number of people, the researchers built a website called Moral Machine, where anyone could click through the scenarios and say what the car should do. "Help us learn how to make machines moral," a video on the site pleads.

The grim game went viral, more than once.

" Really beyond our wildest expectations, " says Iyad Rahwan, an Associate Professor of Media Arts and Sciences at the MIT Media Lab, one of the researchers. "At some point we got 300 decisions per second."

What the researchers found was a set of near-universal preferences, regardless of where someone took the quiz. On the whole, people everywhere believed the moral thing for the car to do was to spare the young over the old, spare humans over animals, and spare the lives of the many over the few. The findings, led by Edmond Awad of MIT, were published Wednesday in the journal Nature.

Using geolocation, the researchers found that the 130 countries with more than 100 respondents could be grouped into three clusters with similar moral preferences. Here, they did find some differences.

For example, the preference for sparing younger people over older ones was much stronger in the southern cluster (which includes Latin America as well as France, Hungary and the Czech Republic) than in the eastern cluster (which includes many Asian and Middle Eastern nations). And the preference for sparing humans over pets was weaker in the southern cluster than in the eastern or western clusters (the latter includes, for example, the U.S., Canada, Kenya and much of Europe).

They also found that these variations seemed to correlate with other cultural differences. Respondents from collectivist cultures, which "emphasize respect for older members of the community," showed a weaker preference for sparing younger people.

Rahwan emphasized that the study's results should be used with extreme caution and should not be considered the last word on societal preferences, especially since the respondents were not a representative sample. (The researchers did make statistical corrections for demographic distortions, re-weighting the responses to match each country's demographics.)

What does that mean? The study's authors argue that if we are going to let these vehicles on our streets, their operating systems should take moral preferences into account. "Before we allow our cars to make ethical decisions, we need to have a global conversation to express our preferences to the companies that will design moral algorithms, and to the policymakers who will regulate them," they write.

But let's say, just for a moment, that a society does have broadly shared moral preferences for these scenarios. Should automakers or regulators actually take them into account?

Last year, Germany's Ethics Commission on Automated Driving created initial guidelines for automated vehicles. One of its key dictates? A ban on such decision-making by a car's operating system.

"In the case of unavoidable accident situations, any distinction between individuals can be based on personal characteristics (age, gender, physical or physical) mental constitution) is strictly prohibited, "the report said. "General programming to reduce the number of personal injuries can be justified, and those involved in generating mobility risks should not sacrifice uninvolved parties." But Daniel Sperling, founding director of the Institute of Transportation Studies at the University of California – Davis and author of a book on autonomous and shared vehicles, these moral dilemmas are far from the most pressing questions about these cars.

"The biggest problem is just making sure," he says NPR. "They will be much safer than human drivers: they do not drink, they do not smoke, they do not sleep, they are not distracted." So the question is, how sure must they be before we let them onto our streets?

