Should a robo-car run over a kid or a grandad? Healthy or ill person? Let's get millions of folks to decide for AI...

www.theregister.co.uk | 10/25/2018 | Staff
Photo: https://regmedia.co.uk/2018/10/24/car_crash.jpg

The infamous trolley problem for self-driving cars has finally been put to the public at massive scale.

Imagine a robo-ride is about to crash into either a kid or a bunch of elderly people. It cannot brake in time, nor swerve out of the way. Where should it go? Who should it hit, or rather, who should it spare?


Now, imagine the same scenario but this time the choice is between humans or animals, jaywalkers or law-abiding citizens, males or females, fitties or fatties... you get the idea. How should computers deal with these split-second moral decisions?

Millions of participants from more than 200 countries answered these hypothetical questions for an experiment dubbed the Moral Machine. It was set up by researchers from MIT and Harvard University in the US, University of British Columbia in Canada, and the Université Toulouse Capitole in France.


[Graphic: results for the different subgroups people preferred to spare, according to the Moral Machine. Image credit: Awad et al. and Nature.]

Users faced 13 scenarios, each with two possible outcomes, and were asked to click on the option they preferred. The results were published in Nature on Wednesday.


It’s not too surprising that most participants favored the young over the old, groups of people over lone pedestrians, and humans over animals. They were also more...
(Excerpt) Read more at: www.theregister.co.uk