Opinion

Driven by self-interest?

Written by Frances Chassler

The brakes have failed; there’s a cat in the passenger seat, a dog in the left lane, and an elderly woman to the right. What do you do? Well, it’s too late; you’ve probably crashed and conveniently avoided the moral issue in front of you. But with self-driving cars quickly becoming a reality, the age-old “trolley problem” is being dragged back into the spotlight.

The trolley problem is an ethical thought experiment about action and inaction. The basic setup: you are driving a trolley and the brakes have failed. Five people are stuck on the track ahead, and if the trolley continues on its course they will die, but you have the option to divert it onto a different track where only one person will be killed. In this original version, most people choose to turn the trolley and kill one person rather than five. The question can be altered in many ways to probe different intuitions: the one person can be a child and the five can be elderly; the five can be criminals and the one an upstanding citizen; or perhaps the driver can crash and kill themselves to save all six.

Everyone’s approach to these choices is different, and there is no clear-cut right answer to the problem. Yet we are going to have to program something into self-driving cars for the situations where they must make a moral decision involving human life.
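To make that concrete, here is a deliberately toy sketch of what “programming something in” could amount to: score each possible manoeuvre and pick the least bad one. Every name, number and weight here is invented for illustration; no real car works this way, and that’s rather the point, because someone has to choose those numbers.

```python
# A toy trolley-style decision policy. All names and weights are
# hypothetical illustrations, not any manufacturer's actual logic.
from __future__ import annotations
from dataclasses import dataclass


@dataclass
class Outcome:
    manoeuvre: str           # e.g. "stay in lane", "swerve into wall"
    passengers_at_risk: int  # people inside the car
    pedestrians_at_risk: int # people outside the car


def harm_score(outcome: Outcome, passenger_weight: float = 1.0) -> float:
    """Lower is better. A passenger_weight above 1 favours the people
    in the car over the people outside it."""
    return outcome.pedestrians_at_risk + passenger_weight * outcome.passengers_at_risk


def choose(outcomes: list[Outcome]) -> Outcome:
    # Pick the manoeuvre with the smallest harm score.
    return min(outcomes, key=harm_score)


if __name__ == "__main__":
    options = [
        Outcome("stay in lane", passengers_at_risk=0, pedestrians_at_risk=5),
        Outcome("swerve into wall", passengers_at_risk=1, pedestrians_at_risk=0),
    ]
    # With equal weights this prints "swerve into wall"; raise
    # passenger_weight above 5 and the car stays in its lane instead.
    print(choose(options).manoeuvre)
```

Notice that a single made-up parameter, passenger_weight, flips the decision entirely. Choosing its value is exactly the moral question nobody can agree on.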

MIT Media Lab has launched a website that surveys people using a game called “Moral Machine”. Participants answer a number of variations of the trolley problem, but so far the results are inconclusive. Unsurprisingly, people have trouble agreeing on the answers to unanswerable questions.

Unfortunately, capitalism has made it so that the most important thing about self-driving cars is that they get sold. The idea that the car you’re in could make an executive decision to crash into a brick wall and kill you is off-putting to some customers. In fact, according to the Moral Machine website, in about half the scenarios people choose to sacrifice the passenger to save pedestrians, but when asked whether they would buy a self-driving car programmed to sacrifice the driver, the majority said they wouldn’t.

Self-driving cars need to become mainstream sooner rather than later. At the end of the day, you are far less likely to die in a self-driving car, even one programmed to sacrifice you. UK Government statistics show that between January and March 2016, 430 people were killed in road accidents. Machines are simply better drivers than us: they never get drunk, tired or distracted, and they would probably honk less.

The fact is, the situations in the trolley problem are hypothetical; they are fun to discuss, but they shouldn’t be allowed to delay the introduction of self-driving cars. Real people are dying on the roads while we debate whether, hypothetically, you should save a family of five or a handful of pygmy hippos.

