
To understand the trolley problem, first consider this scenario: You are standing on a bridge. Underneath you, a railroad track divides into a main route and an alternative. On the main route, 50 people are tied to the rails. A trolley rushes under the bridge on the main route, hurtling towards the captives. Fortunately, there’s a lever on the bridge that, when pulled, will divert the trolley onto the alternative route. Unfortunately, the alternative route is not clear of captives, either — but only one person is tied to it, rather than 50. Do you pull the lever?

Thanks to the arrival of autonomous vehicles, the trolley problem will be answered—that much is unavoidable. More importantly, though, that answer will profoundly reshape the way law is administered in America.

Source: Wired Magazine

Date: September 21, 2017

Link: https://www.wired.com/story/self-driving-cars-will-kill-people-who-decides-who-dies/

Discussion

1) “Like any computer, a driverless car will not do anything unless instructed. A programmer can’t simply give it instructions for most scenarios and avoid thinking about edge cases.” In what ways could we help the programmers handle these edge cases?
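To make the quote concrete, here is a hypothetical toy sketch of what "instructing" a car for one edge case might look like. Nothing here reflects any real vehicle's software; the function name and the minimise-casualties policy are assumptions invented for illustration. The point is that the car's behaviour in the trolley scenario exists only because a programmer (or a policy someone encoded) chose it in advance.

```python
# Hypothetical illustration only -- not any real vehicle's decision logic.

def choose_route(casualties_main: int, casualties_alt: int) -> str:
    """Pick the route with fewer expected casualties.

    This hard-codes one possible policy (minimise deaths). A different
    team could just as defensibly encode "never actively divert",
    and the car would behave completely differently in the same scenario.
    """
    if casualties_alt < casualties_main:
        return "divert"
    return "stay"

# The scenario from the article: 50 people on the main route, 1 on the alternative.
print(choose_route(50, 1))  # -> divert
```

Even this trivial sketch forces a choice: the tie-breaking rule (stay when the counts are equal) is itself an ethical decision someone had to make.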

2) Is this an ethical issue, or simply a numbers issue — a matter of minimising the count of deaths?

