
Would a Self-Driving Car Choose to Kill 2 of Its Passengers or 5 Pedestrians? Depends On Who Programs It
Digital ethics has never been more important.
A recent panel of experts at the WISE@NY Learning Revolution conference discussed the importance of the humanities as our world becomes increasingly shaped by technology. EdSurge captured this highlight well:
Take the example of self-driving cars, said Keren Wong, director of development at RoboTerra, a robotics education company. She called attention to the “Moral Machine,” an ethical quandary posed by MIT professor Iyad Rahwan. The dilemma goes as follows: an autonomous vehicle in an unavoidable crash must make one of two choices: kill its two passengers, or kill five pedestrians.
Both options are tragic, but they speak to a reality in which technologists must program machines to make decisions with serious implications. “If we are leaving these choices in the hands of machine intelligence, then who are the people who will be programming these decisions? Who are the ones that are going to be setting up the frameworks for these machines?” asked Wong.
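To see what “setting up the framework” means in practice, here is a deliberately oversimplified sketch in Python. Every name in it (Outcome, utilitarian_policy, choose, and so on) is hypothetical and purely illustrative; no real vehicle works this way. The point is that the machine's “choice” is nothing more than whichever ranking rule a human wrote down:

```python
from dataclasses import dataclass
from typing import Callable, List, Tuple

# Purely hypothetical sketch: Outcome, the policy functions, and choose()
# are illustrative names, not part of any real autonomous-vehicle system.

@dataclass
class Outcome:
    description: str
    passengers_harmed: int
    pedestrians_harmed: int

def utilitarian_policy(o: Outcome) -> Tuple[int, ...]:
    # Rank outcomes by total lives lost, regardless of who is harmed.
    return (o.passengers_harmed + o.pedestrians_harmed,)

def passenger_protective_policy(o: Outcome) -> Tuple[int, ...]:
    # Rank outcomes by harm to the vehicle's own passengers first.
    return (o.passengers_harmed, o.pedestrians_harmed)

def choose(outcomes: List[Outcome],
           policy: Callable[[Outcome], Tuple[int, ...]]) -> Outcome:
    # The machine's "decision" is just a minimum under a human-chosen key.
    return min(outcomes, key=policy)

dilemma = [
    Outcome("swerve: kill the two passengers", 2, 0),
    Outcome("stay the course: kill the five pedestrians", 0, 5),
]

print(choose(dilemma, utilitarian_policy).description)           # swerve
print(choose(dilemma, passenger_protective_policy).description)  # stay the course
```

Swapping one policy function for the other flips the car's decision entirely, which is exactly Wong's point: the ethics live in the framework the programmers choose, not in the machine.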
It is for precisely such reasons that we need to teach all children and teachers about computer science. Not because industry needs more coders, but first and foremost because what it means to be human is becoming increasingly computational.