
The arrival of the self-driving car presents a challenging new dilemma: Whom should the vehicle save – and whom should it harm – when an accident is unavoidable?

The Mercedes-Benz F 015 Luxury in Motion concept uses a high-precision laser projection system to project a broad cone of light onto the road, communicating important information to pedestrians.

As self-driving technology creeps into vehicles, the debate over how those vehicles should behave in situations their programmers can't fully anticipate is sharpening.

Multiple studies estimate that autonomous cars would dramatically reduce road accidents – by up to 90 per cent, according to a 2015 report by McKinsey & Company.

No one believes accidents will be eliminated entirely, which raises an ethical dilemma: Whom should the car harm if it finds itself in one of those unavoidable situations? Does the presence of children, elderly people or other factors change the equation?

"Nobody's talking about ethics," Ford Motor Co. chairman Bill Ford said a year ago, speaking with The Globe and Mail's Greg Keenan and a small group of reporters. "If this technology is really going to serve society, then these kinds of issues have to be resolved, and resolved relatively soon."

We already have the technology.

"The greater challenge is the artificial intelligence behind the machine," Toyota Canada president Larry Hutchinson said, addressing the TalkAuto Conference in Toronto last November. "Think of the millions of situations that we process and decisions that we have to make in real traffic. … We need to program that intelligence into a vehicle, but we don't have the data yet to create a machine that can perceive and respond to the virtually endless permutations of near misses and random occurrences that happen on even a simple trip to the corner store."

Consider, he said, the ethical dilemma that an autonomously driven car would need to resolve in an instant when a child jumps suddenly into the car's path from the curb. There's no time to brake. What then? Veer left into oncoming traffic, possibly causing an accident that would injure the driver and passengers? Veer right onto the sidewalk, possibly injuring pedestrians? Continue straight, possibly colliding with the child?

Germany last year became the first country to attempt to answer such morbid questions with actual guidelines. The proposed rules state that self-driving cars should always attempt to minimize human death and shouldn't discriminate between individuals based on age, gender or any other factor. Human lives should also always be given priority over animals or property.
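
In software terms, those guidelines reduce to a simple ordering rule: estimate the human harm of each available maneuver, choose the one that minimizes it, and let harm to animals or property matter only as a tiebreaker. Here is a minimal sketch in Python of how Hutchinson's child-in-the-road dilemma might be scored under such a rule; the maneuver names, harm estimates and data structure are entirely hypothetical illustrations, not any manufacturer's actual logic:

```python
# A minimal, illustrative sketch of the German guidelines as a decision rule.
# Every value here (maneuver names, harm numbers) is a hypothetical assumption
# for illustration, not any real vehicle's decision logic.
from dataclasses import dataclass

@dataclass
class Outcome:
    maneuver: str
    expected_human_harm: float   # probability-weighted estimate of human injury
    expected_other_harm: float   # harm to animals or property

def choose_maneuver(outcomes: list[Outcome]) -> Outcome:
    # Minimize human harm first; animal/property harm only breaks ties.
    # No input describes who the humans are (age, gender, etc.), so
    # discrimination between individuals is impossible by construction.
    return min(outcomes, key=lambda o: (o.expected_human_harm, o.expected_other_harm))

# Hutchinson's scenario: braking is too late, three maneuvers remain.
options = [
    Outcome("veer left into oncoming traffic", 0.8, 1.0),
    Outcome("veer right onto the sidewalk", 0.6, 0.3),
    Outcome("continue straight toward the child", 0.9, 0.1),
]
print(choose_maneuver(options).maneuver)  # -> veer right onto the sidewalk
```

Even this toy version makes the hard part visible: the ethics live in the harm estimates, numbers no sensor can supply with certainty in the fraction of a second available.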

More than three-quarters of respondents support such a utilitarian approach, according to a 2016 study published in Science. That is, unless they happen to be riding in such a car, in which case the majority would want it to protect passengers – and themselves – at all costs.

This finding suggests the issue isn't solely about ethics; it's also about control. People reason ethically in principle, but in practice they behave more selfishly, especially when they are not in command of a situation.

Azim Shariff, an associate professor of psychology and social behaviour at the University of California, Irvine, and a co-author of the study, says this control factor is fuelling a double standard in how people view themselves and machines.

Human drivers caught in accidents make split-second, largely subconscious moral choices, and they often err on the side of protecting themselves, which can result in harm to others. As a result, even in the case of horrifying accidents, we tend to cut them some slack.

"We don't judge people for [that] because we recognize the human frailties involved," he says. "We recognize that people are going to work out of self-preservation instincts and in the heat of the moment."

Machines, on the other hand, are viewed more coldly because they don't have those human frailties. So perhaps the hypocrisy is warranted.

"Because they're programmed in advance by people not immediately in the situation, we actually do have the luxury of deliberation [with cars]," Shariff says. "We have the responsibility of deliberation, so yes there's a double standard, but there's a good reason for it."

Not everyone is convinced. Some participants in the debate – especially on the side of car manufacturers – have argued that we're placing unfair expectations on self-driving vehicles, given the benefits they promise.

Challenges "like ethics, regulations, infrastructure and societal readiness" may take longer to resolve than the technology to develop, Toyota's Hutchinson said. "These are big questions, and it's going to take the combined efforts of governments, insurance companies, manufacturers and all kinds of associations to answer them."

Referring to the dilemma of choosing whom to harm in an unavoidable accident, "I don't remember when I took my driver's licence test that this was one of the questions," said Manuela Papadopol, an executive at Continental AG subsidiary Elektrobit.

Angela Schoellig, head of the Dynamic Systems Lab at the University of Toronto's Institute for Aerospace Studies, points out a further inconsistency in the double standard: the ethical concerns over what self-driving cars might do in the future don't seem to apply to existing vehicles and their current capabilities.

"The emergency brake [for example] can cause injuries that may not have been caused if the human did something," she says. "It's underestimated how much of that is there already today."

The Mercedes-Benz F 015 Luxury in Motion research car.

Melissa Cefkin, an anthropologist and principal researcher at the Human Centered Systems practice at the Nissan Research Center in Silicon Valley, believes people will eventually get comfortable with autonomous vehicles once they realize they aren't that unusual. Riding in a human-driven taxi today, after all, requires giving up control.

"People want to believe it's going to be this 100-per-cent brand-new experience they've never had," she says. "Chances are it's going to feel an awful lot like experiences they have had."

Self-driving cars will also introduce new forms of human control. Not having to own a vehicle and hiring one only when it's needed will itself be a major new form – one that will likely give many people more control over their monthly finances. Passengers are also likely to have more choice over vehicle settings, preferences and routes, not to mention what they do while they're in the car.

Cefkin likens the uncertainty over self-driving cars to the angst that accompanied cellphones when they first arrived.

"People felt like they weren't as much in control as to when they communicated with people, but we adapted," she says. "That feeling itself is socially constructed and there will be some replacement feeling of control in the future, it just won't look the way it does today."

