
Would you ride in a self-driving car if it were programmed to kill you in an unavoidable collision rather than a group of pedestrians?

According to a survey published in Science in 2016, most people said "no."

At the same time, three-quarters of those respondents also said autonomous vehicles should take a utilitarian approach when it comes to human lives – they should seek to minimize deaths in a crash even if it means killing their passengers.

It's a seemingly intractable dilemma in which the average person believes self-driving cars should act logically rather than selfishly, except when they themselves are in the vehicle.

A group of researchers in Italy believe they have a solution to this problem: an "ethical knob" in vehicles that would allow passengers to set their own levels of self-interest.

Such a knob would have three basic settings: extreme altruism, which would direct the car to preserve other lives at the expense of the passengers; utilitarian, which would weigh all lives equally and seek to minimize total harm; and extreme egotism, which would protect passengers at all costs.
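The study doesn't specify how a car would act on the setting, but as a rough illustration, a crash-planning routine might consult the knob along the lines of the sketch below. The names (KnobSetting, choose_maneuver) and the reduction of "harm" to expected death counts are assumptions made for the example, not details from the paper.

```python
from enum import Enum


class KnobSetting(Enum):
    ALTRUIST = "altruist"    # preserve others, even at the passengers' expense
    IMPARTIAL = "impartial"  # weigh all lives equally, minimize total harm
    EGOIST = "egoist"        # protect the passengers at all costs


def choose_maneuver(setting, options):
    """Return the index of the maneuver the car should take.

    `options` is a list of (passenger_deaths, pedestrian_deaths) tuples,
    one pair of expected death counts per available maneuver.
    """
    if setting is KnobSetting.ALTRUIST:
        # Spare others first; break ties by harm to the passengers.
        cost = lambda o: (o[1], o[0])
    elif setting is KnobSetting.EGOIST:
        # Spare the passengers first; break ties by harm to others.
        cost = lambda o: (o[0], o[1])
    else:
        # Impartial / utilitarian: minimize total expected deaths.
        cost = lambda o: (o[0] + o[1],)
    return min(range(len(options)), key=lambda i: cost(options[i]))


# Two maneuvers: swerve (kills 1 passenger, 0 pedestrians) or stay on course (0, 3).
print(choose_maneuver(KnobSetting.IMPARTIAL, [(1, 0), (0, 3)]))  # -> 0 (swerve)
print(choose_maneuver(KnobSetting.EGOIST, [(1, 0), (0, 3)]))     # -> 1 (stay)
```

In this toy version the knob only changes which ordering of outcomes the planner uses; everything else about the car's behaviour stays the same.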

"The [vehicle] would correspondingly be entrusted with implementing the user's choices, while the manufacturer/programmer would enable the different settings and ensure their implementation … according to the user's choice," says the study, published in the Artificial Intelligence and Law periodical last month.

"With the ethical knob, the [vehicle's] decisions in the face of moral-legal dilemmas would depend on the customization chosen by the user."

Giuseppe Contissa, the study's lead researcher at the University of Bologna, says such a knob might be necessary to achieve popular acceptance of self-driving cars. As the earlier Science study showed, people are hesitant to use technology over which they have no control.

He acknowledges, however, that it could also lead to another problem – a situation where everyone acts against the common good by prioritizing their own self-interest.

"There is the risk that all the people will be incentivized, if they had an ethical knob installed, to have it set to egotist, so you would have a sort of tragedy of the commons."

A possible counter to that, the study notes, would be to link insurance premiums to knob settings. Passengers who choose to drive altruistically could receive lower rates while those who opt for egotism would pay more.

The downside to that approach is familiar territory when it comes to new technology: only wealthy people get to take full advantage of it.

In this case, it wouldn't just mean access to more luxurious features, but potentially the difference between life and death. It's easy to envision a dystopia in which the rich drive around with egotist mode on while the poor are forced to settle for the altruistic setting.

On the other hand, ethics experts say such a situation wouldn't be much different from what currently exists.

"Wealthy people are able to buy [expensive] cars with all these gadgets on them that increase their safety. All these things start on high-end brands," says Scott Campbell, director of the University of Waterloo's Centre for Society, Technology and Values.

"It's an inevitability of the unequal distribution of technology in society rather than something specific to self-driving cars."

Campbell likes the idea of an ethical knob because it could give people a feeling of control over a technology that increasingly looks inevitable. At the same time, he's not sure people would be interested in using it.

"People don't want to think about these things," he says. "They want a system to take over and do a lot of the thinking for them and [believe], 'It'll probably work out as long as it doesn't hurt me.'"

Contissa agrees that more research is needed on whether people really want to make such decisions for themselves. To that end, he and his fellow researchers are designing simulation experiments.

"The problem is whether [an autonomous car] should act as an extension of the person using it, or should it have a pre-programmed morality decided by the market or legislators or whoever it is," he says.

Some governments are already forging ahead on the ethical issues. In August, for example, Germany became the first country in the world to adopt ethical guidelines for self-driving cars.

Among other requirements, the guidelines say autonomous vehicles must place human lives above animals and property and treat all people equally in life-and-death decisions.

