It’s called Autopilot, but even the company that makes it says its driver-assistance features “do not make the vehicle autonomous.”
Yet, drivers using Tesla’s Autopilot on Ontario’s Highway 401 took their eyes off the road for “dangerous amounts of time,” a recent study shows.
“When Autopilot was on, drivers spent less time looking at the road and more time – sometimes double or three times as much – looking at the touchscreen or out the side windows,” said study author Francesco Biondi, an associate professor of kinesiology at the University of Windsor. “They became supervisors instead of drivers.”
It’s more evidence that semi-autonomous driver-assistance systems like Autopilot – which use cameras and sensors to help the car stay in its lane, maintain speed and distance, and stop for obstacles – are “overpromising,” Biondi said. The automakers that offer them are not doing enough to make sure drivers understand the systems’ limitations and pay attention behind the wheel, he said.
In the study, which was funded by Ontario’s Ministry of Transportation, Biondi studied 30 volunteers as they drove a 2022 Tesla Model 3 with Autopilot both on and off on the highway between Windsor and Chatham, Ont.
“We measured their heart rates and tracked their eye movements – and it gave us a good idea of how much attention they were paying to the road,” Biondi said.
When the car was in manual mode for 40 minutes, drivers spent about a minute and 40 seconds in total looking at the touchscreen. When Autopilot was on for 40 minutes, drivers spent a total of three to eight minutes looking at the touchscreen.
Biondi said he expected drivers to be somewhat less focused on the road while using Autopilot, but he was surprised at how much their attention wandered.
While there were no crashes or near misses during the study, one driver started to doze off, Biondi said. “Traffic was pretty low [when we tested].”
Tesla’s website states that while using Autopilot, “it is your responsibility to stay alert, keep your hands on the steering wheel at all times and maintain control of your car.” When the car fails to sense the driver’s hands applying pressure to the wheel, Autopilot issues audible and visual warnings. Yet the length of time before the car “nags” the driver varies from 30 seconds to three minutes, depending on the software version – and in some instances, it may be longer.
In July, the U.S. National Highway Traffic Safety Administration (NHTSA) sent Tesla a letter asking how many cars had received a software update making it possible for drivers to use the car for extended periods without putting their hands on the wheel. It came after a hacker found a secret mode on a Tesla company car that turned off the hands-off warning.
Tesla didn’t respond to repeated requests for comment.
Last week, Tesla issued a recall of more than two million vehicles in the United States equipped with Autopilot to install new safeguards through over-the-air software updates. The company has not said whether there will be a recall in Canada and, so far, no Autopilot recall appears in Transport Canada’s recalls database.
When self-driving doesn’t mean self-driving
While Tesla uses the names Autopilot and Full Self-Driving for its driver-assistance systems, they are not considered autonomous by the Society of Automotive Engineers (SAE), which has defined six levels of driving automation, from Level 0 to Level 5. Only Level 5 systems are considered fully autonomous, and they don’t exist yet.
Nearly all driver-assistance systems on the road, including Tesla’s Autopilot, General Motors’ Super Cruise and Ford’s BlueCruise, are rated Level 2, where the driver must supervise constantly – eyes on the road and, with most systems, hands on the wheel.
In Consumer Reports’ tests of driver-assistance systems, Tesla’s ranked in the middle of the pack.
Level 2 systems use features including adaptive cruise control, lane-keeping assist and automatic emergency braking to help a car maintain speed, stay in its lane and stop for obstacles. While some Level 2 systems allow the car to change lanes or stop for red lights, the driver is still considered responsible for the car’s actions.
Level 3 lets the car drive under certain traffic conditions – but the driver always has to be ready to take over. Level 4 systems don’t need a driver, but they only work on specific routes. Consumers can’t buy one, but robotaxi companies, such as Alphabet-owned Waymo and GM-owned Cruise, have been running them in some U.S. cities. Last month, Cruise halted driverless services after one of its robotaxis dragged a pedestrian about six metres.
Right now, only one automaker – Mercedes-Benz – offers cars in the United States with a Level 3 system. For now, the system, called Drive Pilot, is approved for use only in Nevada and California. It hasn’t been approved by any Canadian province.
Mercedes has said that if Drive Pilot causes a crash while it’s active, the car company, and not the driver, will be responsible.
Some experts say Level 5 systems – truly driverless cars that can go anywhere – are still decades away.
Closed-course test
In June, I got a chance to drive BMW’s Level 3 system – which the company has said it will start offering in Germany next month on some 7 Series vehicles – at its Future Mobility Development Centre in Sokolov, Czech Republic, also known as Czechia.
Similar to the Mercedes system, BMW’s Level 3 system activates only if you’re travelling less than 60 kilometres an hour and are following another vehicle.
Unlike Autopilot or other Level 2 systems, BMW’s is designed to let you take your hands off the wheel and your eyes off the road – a BMW engineer encouraged me to watch a show on the infotainment screen – but only while the Level 3 system is active.
When we came up on a wooden pallet abandoned in our lane, the car came to a complete stop just in front of it – it didn’t drive around it – and warned me that it would be handing me back control in a few seconds.
This was okay on the closed test road – the other drivers all worked for BMW – but I wondered how quickly I’d have reacted and taken control if I were surrounded by traffic and I’d been focused on the screen.
Biondi is skeptical that such features will make roads safer. These systems, he said, will lead drivers to overestimate what their cars can actually do.
“This whole Level 1 to 5 system is confusing to people who work in this field, let alone everyday drivers,” he said. “Until we have full automation, where there’s no driver and cars are like trains and take you from here to there on a set route, all these Level 2 and Level 3 systems where humans are still responsible aren’t going to work.”
Certainly, drivers do not always understand the nuances. Many believe Tesla’s systems are self-driving – as their names suggest – despite the company’s disclaimers, Biondi said. That hasn’t been helped by years of promises from chief executive officer Elon Musk.
“There’s a lot of mixed messaging,” Biondi said.
For instance, in 2016, Musk said, “I really consider autonomous driving a solved problem. … I think we are probably less than two years away.”
But now, there are at least nine active U.S. lawsuits against Tesla over fatal crashes involving Autopilot. According to The Washington Post, there have been more than 700 crashes and 19 deaths involving Teslas using Autopilot since it was introduced.
In October, Tesla won a U.S. civil lawsuit over allegations that Autopilot was responsible for a fatal 2019 crash. The company argued that drivers are ultimately responsible.