With self-driving cars in our near future, I’ve seen more and more articles about the moral dilemma of what the car should do when faced with an impossible decision, for example, to either kill a grandmother or drive into a group of children. In my mind, the pundits are getting it all wrong; the underlying assumption that humans can abdicate responsibility to machines and that the car’s behavior must be predictable is plain wrong.
Here is how one pundit explains the problem:
Imagine that in the not-too-distant future, you own a self-driving car. One day, while you are driving along, an unfortunate set of events causes the car to head toward a crowd of 10 people crossing the road. It cannot stop in time but it can avoid killing 10 people by steering into a wall. However, this collision would kill you, the owner and occupant. What should it do?
Asking the question is important. However, giving a universally binding answer is not. The above article goes on to discuss the situation from an economic perspective (would you buy a car that would kill you rather than someone else?). It thereby falls into the same trap as all the other articles I’ve seen: asking for a universal decision, a societal consensus, to be programmed into the cars, so that it becomes predictable whom the car will kill when faced with an impossible situation.
I suggest simply leaving it open. I don’t see how a society’s moral values can make such a highly personal decision for the driver. I certainly don’t see how a car manufacturer can cast that decision in software. It therefore should not be predictable how a self-driving car whose driver is asleep will behave in these situations.
I haven’t thought much about how to implement it, but maybe asking the new owner of a car for his or her preference and then adding some randomness to the car’s behavior might be the right way to go. However the desired behavior is implemented, I think the underlying responsibility cannot be taken away from the individual driver.
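To make that idea a bit more concrete, here is a minimal sketch in Python. Everything in it is hypothetical (there is no real vehicle API called `EthicsSetting` or `choose_maneuver`); it only illustrates the combination of a one-time owner preference with randomness, so that the outcome of an impossible situation is biased by the owner but never fully predictable.

```python
import random

class EthicsSetting:
    """Hypothetical one-time preference recorded by the car's new owner."""
    def __init__(self, prefer_self_preservation: bool, weight: float = 0.7):
        # weight: how strongly the owner's stated preference biases the
        # choice; 1.0 would make the car fully predictable, 0.5 a coin flip.
        self.prefer_self_preservation = prefer_self_preservation
        self.weight = weight

def choose_maneuver(setting: EthicsSetting) -> str:
    """Pick between two hypothetical maneuvers when no safe option remains."""
    # Follow the owner's preference with probability `weight`,
    # otherwise do the opposite, keeping the behavior non-deterministic.
    follow_preference = random.random() < setting.weight
    protect_occupant = setting.prefer_self_preservation == follow_preference
    return "stay_on_course" if protect_occupant else "steer_into_wall"

# Example: the owner opted for self-preservation when setting up the car.
setting = EthicsSetting(prefer_self_preservation=True)
print(choose_maneuver(setting))
```

The point of the sketch is not the particular numbers but the design choice: the decision is seeded by the individual owner, not by a manufacturer or a societal consensus, and the randomness keeps anyone from knowing in advance whom the car will kill.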