Helping autonomous cars to make moral choices

To handle these relative preferences, we could equip people with beacons on their cellphones to signal nearby cars that they are a certain type of person (child, elderly, pedestrian, cyclist). Then programmers could instruct their autonomous systems to make decisions based on priorities from surveys or experiments like the Moral Machine.
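A minimal sketch of what such a priority scheme might look like in code, assuming hypothetical beacon-reported categories and illustrative weights (the category names and numbers are assumptions for illustration, not actual Moral Machine data):

```python
# Illustrative sketch only: choosing the candidate path with the lowest
# total "harm" according to hypothetical, survey-derived weights.
# All category names and weight values are assumptions.

# Higher weight = stronger societal preference to spare this category.
PRIORITY_WEIGHTS = {
    "child": 1.0,
    "pedestrian": 0.8,
    "cyclist": 0.7,
    "elderly": 0.6,
}

def harm_score(people_in_path):
    """Total cost of striking everyone detected in a candidate path."""
    return sum(PRIORITY_WEIGHTS.get(person, 0.5) for person in people_in_path)

def choose_path(paths):
    """Pick the path whose beacon-reported occupants carry the
    lowest total harm score."""
    return min(paths, key=lambda name: harm_score(paths[name]))

# Example dilemma: stay in lane (a child ahead) or swerve onto the
# sidewalk (an elderly person there)?
paths = {
    "stay_in_lane": ["child"],
    "swerve_to_sidewalk": ["elderly"],
}
print(choose_path(paths))  # → swerve_to_sidewalk
```

Note that even this toy version makes the problem vivid: because the child outweighs the elderly pedestrian, the car swerves onto the sidewalk, which is exactly the grandmother scenario raised below.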


But that raises serious problems. For example, would an autonomous car that noticed a child running in the middle of traffic decide to run over your grandmother on the sidewalk instead?

And what about groups of people? The Moral Machine’s creators and other researchers found that society as a whole has a strong preference for saving more people. What if a negligent group of runners forced a car to swerve into your path while you walked alone?

The same study also showed that people would be less willing to purchase a vehicle programmed to sacrifice the driver (themselves) in some situations. If society as a whole is to benefit from the advantages of autonomous vehicles, we need people to buy the cars – so we need to make them attractive to buyers. That might mean requiring cars to prioritize saving their drivers, as Mercedes has already decided to do.
