Driving Concerns

What happens when a car with self-driving features, such as Tesla’s Autopilot, breaks a law or gets in a crash? If I’m driving, am I responsible? Could I still be charged? What happens with my insurance? Tesla owners I know argue their cars are basically self-driving, even though they have to intervene regularly when the system stops working. – Kim, Vancouver

In every car with driver-assist features available in Canada right now, it’s up to the driver to watch the road – so if you snooze, you lose.

That could eventually change if we see more advanced self-driving technology. But we’re not there yet. Humans are still on the hook for what happens behind the wheel.

“Ultimately, when using autonomous technology, drivers are still responsible to be alert to their surroundings, be able to intervene with the controls if necessary in the event of an emergency and ensure that traffic laws are being adhered to,” said RCMP Constable Mike Moore, a spokesman for B.C. Highway Patrol.

You can’t buy an autonomous car in Canada. You can buy semi-autonomous cars – but they cannot drive themselves reliably and safely in all conditions.

If you get into an at-fault collision while using Tesla’s Autopilot, for example, you’re responsible – both legally and when it comes to your insurance. If you’re found at fault, your premiums will increase.

“You as a driver of the vehicle are in control,” said Anne Marie Thomas, director of consumer and industry relations with the Insurance Bureau of Canada. “These are safety features, but you ultimately are still responsible.”

According to SAE International, which sets voluntary engineering standards for the automotive and aerospace industries, there are six levels of driving automation, from Level 0 (no automation) to Level 5 (full automation).

For Levels 1 through 3, a driver must be behind the wheel at all times. Right now, you can only buy cars in Canada with Level 1 or Level 2 systems – which require you to keep your eyes on the road and be ready to intervene.

They’re not self-driving. Level 2 systems include GM’s Super Cruise and Tesla’s Autopilot. They use a combination of safety features already available in many new cars, including lane-keeping assist, adaptive cruise control and automatic emergency braking. Essentially, they’re fancy cruise control systems and don’t drive for you, even though some Level 2 systems can stop for red lights or change lanes on their own.

“While Tesla’s Autopilot feature allows a car to steer, accelerate and brake within its lane, it requires active driver supervision and doesn’t make the vehicle autonomous,” Lindsay Wilkins, a spokeswoman for the Insurance Corporation of British Columbia (ICBC), said in an email.

Could cars ever take responsibility?

Although Tesla sells “Full Self-Driving Capability” for $19,500, its website says that the features are Level 2 and “require active driver supervision and do not make the vehicle autonomous.”

This month, Tesla recalled more than 20,000 vehicles in Canada and nearly 363,000 in the United States after U.S. regulators warned that cars running the Full Self-Driving (FSD) Beta feature could make unsafe manoeuvres and break traffic laws, including running red lights and exceeding speed limits.

Last week in the United States, Tesla shareholders sued the company and its chief executive officer, Elon Musk, alleging false and misleading statements about Autopilot and FSD, which U.S. regulators are investigating for their roles in several fatal crashes.

Tesla did not respond to requests for comment.

But when it comes to Level 3 systems, the automaker, not the driver, may be considered responsible in a crash.

In January, Mercedes became the first automaker allowed to offer cars equipped with a Level 3 driver-assist system in the United States.

So far, it is only approved for use in Nevada, although Mercedes is seeking approval in California.

With Level 3, the car is allowed to drive itself under limited conditions – in bumper-to-bumper traffic, for instance, at speeds below 40 miles an hour (64 kilometres an hour) – without the driver needing to hold the steering wheel or watch the road. But – and here’s the catch – the driver must be ready to take back control within a few seconds whenever the system asks.

Mercedes’ Drive Pilot system – available in the United States on the 2024 S-Class and EQS Sedan – lets you check emails and even watch YouTube videos or play Angry Birds while it’s engaged. But you can’t legally leave the driver’s seat or take a nap.

If the car gets into an at-fault crash while Level 3 automation is engaged, Mercedes, not the driver, would be liable “anywhere in the world” the system is offered, the company said last week.

“When you make the jump from Level 2 to Level 3, it is the computer that is driving, so the product is then responsible and then we are ultimately responsible – full stop and that’s it,” Mercedes-Benz chief executive officer Ola Kaellenius told reporters last week at the launch of the company’s new operating system in California. “If the computer were to cause the accident, we would have to pick it up.”

The system is not available in Canada, and Mercedes did not immediately say whether it is seeking approval in any province.

The Insurance Bureau of Canada said it can’t speculate on who would be liable if a car crashed while in true self-driving mode.

“I can’t really comment too much because it has not been tried and tested yet in the industry,” Thomas said.

Have a driving question? Send it to globedrive@globeandmail.com and put ‘Driving Concerns’ in your subject line. Emails without the correct subject line may not be answered. Canada’s a big place, so let us know where you are so we can find the answer for your city and province.
