The question of the infamous trolley problem for self-driving cars has finally been answered.
Imagine a robo-ride is about to crash into either a kid or a bunch of elderly people. It cannot brake in time, nor swerve out of the way. Where should it go? Who should it hit, or rather, who should it spare?
Now, imagine the same scenario but this time the choice is between humans or animals, jaywalkers or law-abiding citizens, males or females, fitties or fatties... you get the idea. How should computers deal with these split-second moral decisions?
Millions of participants from more than 200 countries answered these hypothetical questions for an experiment dubbed the Moral Machine. It was set up by researchers from MIT and Harvard University in the US, University of British Columbia in Canada, and the Université Toulouse Capitole in France.
Graphic showing which subgroups of people participants preferred to spare, according to the Moral Machine results. Image credit: Awad et al. and Nature.
Users faced 13 possible scenarios, each with two outcomes, and were asked to click on the option they preferred. The results were published in Nature on Wednesday.
It’s not too surprising that most participants favored the young over the old, groups of people over lone pedestrians, and humans over animals. They were also more...