Autonomous cars will be required to make value judgments that must be pre-programmed. What should we tell them to do? To understand the decisions human drivers would make before determining which ethical choices cars should make, researchers crowdsourced the question by launching Moral Machine, a game that presents players with variations on the trolley problem. The results suggest that if billions of driverless cars are one day all programmed to make the same judgment call, crossing the street may become far more dangerous for some people than for others.
The New Yorker, 24 January 2019