
The Argument For a Moral Machine in Autonomous Driving

I have a strong aversion to letting people shirk responsibility for their actions. I feel particularly strongly about this when work is delegated to machines, which are not able to act according to an appropriate moral value system. Starting a car and letting an autonomous driving unit take over is one such example: when faced with an impossible situation (run over an old lady, run over three children, or sacrifice the driver), it still has to be the driver's decision and not the machine's.

To that end, I proposed a research project that lets a driver encode their value system in what I called a moral machine, to which they can then delegate decisions, knowing the machine will faithfully reproduce its owner's behavior. Sadly, the companies I pitched this project to rejected the proposal, arguing that consumers will never buy a car that, in one form or another, asks them whether they value their own life more highly than a grandmother's life or the lives of three children.

When I discussed this problem with a friend, he pointed out that it won't be the driver's problem for much longer, because in the future nobody will own cars anyway. We will just hop into the next free autonomous taxi, which will take us where we want to go, and that's that. The moral problem of whom to kill then becomes a problem for the taxi service provider, not the user of the service.

This change in responsibility may change everything and give my proposal new momentum. A mobility service provider, when faced with the options of killing the old lady, killing the three children, or killing the passenger, will have to consider the financial consequences of its actions. It will have no contract in place with the old lady or the three children, but it will have a contract with the passenger and will likely use that contract to limit its liability in case of passenger injury or death. The financial risk of killing the passenger then becomes much more manageable than that of killing the old lady or the three children.

Passengers would obviously not accept being put at such a disadvantage; they would demand that the decision be theirs. For this to be possible, we need a solution in which we can encode our moral values in some device and make them available to the taxi as we initiate the ride. It should be as easy as tapping our mobile device on a reader. Society may disagree, but that's a different story.
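To make this concrete, here is a minimal sketch of what such an encoded value profile and its hand-off to the vehicle could look like. Everything in it is my own illustrative assumption: the names (MoralProfile, choose_action), the weighting scheme, and the outcome descriptions are hypothetical, and a real profile would have to be far richer.

    from dataclasses import dataclass

    @dataclass
    class MoralProfile:
        """A passenger's encoded value weights (hypothetical schema)."""
        self_preservation: float  # weight on harm to the passenger
        bystander_weight: float   # weight on harm to each adult bystander
        child_multiplier: float   # extra weight applied to children

    def choose_action(profile, options):
        """Pick the option with the lowest weighted harm under the profile.

        `options` maps an action name to the parties harmed, e.g.
        {"swerve_left": {"passenger": 1, "bystanders": 0, "children": 0}}.
        """
        def harm(outcome):
            return (outcome.get("passenger", 0) * profile.self_preservation
                    + outcome.get("bystanders", 0) * profile.bystander_weight
                    + outcome.get("children", 0)
                      * profile.bystander_weight * profile.child_multiplier)
        return min(options, key=lambda name: harm(options[name]))

    # Handed over at ride start, e.g. by tapping the phone on a reader:
    profile = MoralProfile(self_preservation=1.0, bystander_weight=1.2,
                           child_multiplier=2.0)
    options = {
        "brake_straight": {"bystanders": 1, "children": 0, "passenger": 0},
        "swerve_right":   {"bystanders": 0, "children": 3, "passenger": 0},
        "swerve_left":    {"bystanders": 0, "children": 0, "passenger": 1},
    }
    print(choose_action(profile, options))  # prints "swerve_left" with these weights

The specific formula does not matter; what matters is that the profile travels with the passenger and the vehicle merely evaluates it.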

Certainly a fun project. I am not sure it can be done, but I now feel confident that more parties will pull in the direction of not abdicating responsibility to a machine.

Go back to: The argument against a moral machine in autonomous driving.


Comments


  1. […] position does not discourage automation, only unreflected one. As I argue in the case for a moral machine in autonomous driving, there is nothing wrong if a machine makes decisions on behalf of a human, as long as it adequately […]

  2. […] Next up: The argument for a moral machine in autonomous driving. […]

  3. Dirk Riehle

    Is your suggestion an approach to gathering user input from which to derive the autonomous ("moral") behavior? (I'm not sure I understand.) So, no questionnaire in an armchair, but rather observing the driver in action and deriving the desired behavior for a problem situation from that?

  4. Martin Stein

    I think this is a very good proposal. I would change the labeling a bit: instead of supplying a moral system, I would let people push a gas pedal that makes the car drive more aggressively. Later this could become a slider, and at some point a configuration setting in my passenger profile.
