
The autonomous driving features in Tesla vehicles have sometimes mistaken overpasses and bridges for objects to avoid, causing the vehicles to brake heavily in the middle of the highway. David Paul Morris/Bloomberg

Since Tesla Inc. took a step back from radar as part of its advanced driver-assistance system (ADAS), all eyes are on how well its cameras will be able to “see” when the automaker’s vehicles drive themselves.

The U.S. National Highway Traffic Safety Administration opened an investigation in August, 2021 to determine why drivers using Tesla’s Autopilot driver assistance system have collided with stopped emergency vehicles at least 11 times since 2018.

Before the investigation got under way, Tesla announced it was removing radar from all Model 3 and Model Y vehicles in the North American market, starting in May, 2021. Instead, the electric vehicle maker said Autopilot and Traffic-Aware Cruise Control would run on what the company calls “Tesla Vision,” a system that relies solely on cameras. The more luxurious Model S and Model X retain their existing radar systems.

It was a surprising move that raised questions about whether a purely camera-based system could offer the same level of safety and performance. While there is no consensus among experts, some vehicle camera system vendors believe they have the cutting-edge technology needed to close the gap and avoid collisions, particularly with stationary vehicles or objects.

Nodar is a Massachusetts-based startup working on Hammerhead 3D, its proprietary camera system named for the hammerhead shark, an animal considered to have exceptional depth perception because of how far apart its eyes sit. A 3D camera system that sees farther and more clearly could help autonomous driving systems better assess the environment ahead, says Brad Rosen, Nodar’s chief operating officer.

“We’re looking for the longest distance between cameras, and that farthest baseline on a car would be the sideview mirrors,” Rosen says. “We talk a lot about that with the trucking industry because it’s about three metres between mirrors, allowing us to resolve distance with great accuracy up to about 1,000 metres ahead.”

Tesla places the three cameras in its tri-focal system close together just behind the windshield, in a setup similar to how manufacturers equip smartphones with telephoto, wide-angle and ultra-wide-angle lenses. While Rosen has not examined the Tesla system himself, he says cameras that sit too close together have a harder time gauging longer distances accurately. Apart from aligning cameras with the sideview mirrors, he says placing them near the headlights can also work well, particularly for off-road vehicles.

Stereo vision cameras spread farther apart present their own challenges, though, he says. They can fall out of alignment owing to road vibrations and temperature shifts in the chassis, a problem when the cameras need to stay aligned to within one-hundredth of a degree. Subaru’s EyeSight and the Drive Pilot system in the Mercedes-Benz EQS, among others, use stereo vision systems in tighter formations to sidestep that issue, though they also work in tandem with radar.
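To illustrate the trade-off Rosen describes, here is a minimal sketch of the standard stereo-vision depth relationship, in which range uncertainty grows with the square of distance and shrinks as the baseline between the cameras widens. The focal length, baseline values and pixel-level disparity error below are assumptions chosen for illustration, not specifications from Nodar, Tesla or any other vendor.

```python
# Illustrative only: standard stereo-camera geometry with assumed numbers,
# not Nodar's, Tesla's or any other vendor's actual parameters.
def depth_uncertainty(range_m: float, baseline_m: float,
                      focal_px: float = 2000.0,
                      disparity_err_px: float = 0.1) -> float:
    """Approximate range error for a stereo pair: dZ ~ Z^2 * dd / (f * B)."""
    return (range_m ** 2) * disparity_err_px / (focal_px * baseline_m)

# Compare a narrow, behind-the-windshield baseline with a mirror-to-mirror one.
for baseline_m in (0.2, 3.0):
    for range_m in (100.0, 300.0, 1000.0):
        err = depth_uncertainty(range_m, baseline_m)
        print(f"baseline {baseline_m} m, target {range_m:.0f} m -> ~{err:.1f} m error")
```

Under these assumed numbers, the wide baseline keeps the error at 1,000 metres in the tens of metres, while the narrow baseline’s error balloons past 200 metres, which is the intuition behind mounting cameras as far apart as the sideview mirrors.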

Some industry pundits speculated in 2020 that Tesla chief executive officer Elon Musk was about to strike a deal with Tel Aviv-based Arbe Robotics Ltd. to use its 4D imaging “Phoenix” radar system, which provides higher-resolution detection at longer range, scanning 50 times per second. That deal didn’t materialize, but the Israeli startup is still working on making its system see much farther than the radar Tesla has been using. Tesla vehicles use a radar setup from Continental Automotive that gauges distance up to 160 metres, whereas Arbe claims its Phoenix system can reach 300 metres, nearly double the range.

“It can do that, regardless of weather conditions and availability of light,” says Kobi Marenko, Arbe’s CEO. “We have a strong layer of artificial intelligence (AI), so we’re able to classify the objects based on the radar signature to know if it’s a dog, human being, bicycle, motorcycle or car.”

Marenko doesn’t see Arbe’s technology supplanting cameras, suggesting radar systems work best in conjunction with camera systems. “If you take the free space mapping based just on radar, and then you take the cameras – both functioning independently – and use a deep learning application that makes decisions based on data from those two sensors, I think that can solve the problem,” he says.
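As a rough illustration of the arrangement Marenko describes, the sketch below has each sensor produce detections independently, with a simple downstream rule (standing in for the deep-learning layer he mentions) deciding what counts as a real obstacle. The class, function and thresholds are hypothetical, not Arbe’s or anyone else’s actual software.

```python
# Hypothetical late-fusion sketch: radar and camera each detect objects
# independently, and a downstream stage decides what the planner should
# treat as a real obstacle.
from dataclasses import dataclass

@dataclass
class Detection:
    sensor: str        # "radar" or "camera"
    label: str         # e.g. "car", "bicycle", "pedestrian"
    range_m: float
    confidence: float  # 0.0 to 1.0

def fuse(radar: list[Detection], camera: list[Detection],
         match_tol_m: float = 5.0) -> list[Detection]:
    """Keep radar detections the camera corroborates (or that radar is very
    sure about), plus high-confidence camera detections radar missed."""
    fused = []
    for r in radar:
        corroborated = any(c.label == r.label and
                           abs(c.range_m - r.range_m) <= match_tol_m
                           for c in camera)
        if corroborated or r.confidence > 0.9:
            fused.append(r)
    fused.extend(c for c in camera
                 if c.confidence > 0.9 and
                 not any(r.label == c.label and
                         abs(r.range_m - c.range_m) <= match_tol_m
                         for r in radar))
    return fused
```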

Identifying objects is key to making that happen, he adds. The autonomous driving features in Tesla vehicles have sometimes mistaken overpasses and bridges for objects to avoid, causing the vehicles to brake heavily in the middle of the highway. Radar and cameras would need to “see” what’s coming from greater distances so the system can alert the driver and act accordingly.

Sam Abuelsamid, principal analyst at market research firm Guidehouse Insights, has been following the electric mobility space and believes Tesla may be planning a long-term strategy that relies on camera technology alone.

“To do a robust automated driving system, you really need to have multiple types of sensors to get full coverage, and also give you redundancy and diversity for safety because none of them are sufficient by themselves,” Abuelsamid says.

He says it would probably have been wiser for Tesla to abandon its “cheap radar” and upgrade to better radar sensors, similar to what General Motors has done with its Super Cruise driver-assistance system and what Ford is doing with its BlueCruise system. Without radar, Tesla will have to change how its cameras are configured to offset the loss of radar’s capabilities, he adds.

“From a safety perspective, there’s a fundamental difference that happens when you go from driver assist systems like we have today to an autonomous system,” he says. “Part of the reason why we need multiple different types of sensors is to provide that fail operational capability. You ideally want a minimum of three types of sensors (camera, radar, LiDAR), so that if there’s a disagreement with one of them, or all three disagree, then you know that there’s a problem somewhere. If you go with a camera-only solution, you lose that operational capability.”
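A toy example of the kind of cross-check Abuelsamid is describing: with three independent range estimates for the same object, one sensor’s bad reading can be outvoted and flagged, whereas a single-sensor system has nothing to compare against. The function, tolerance and numbers are made up for illustration.

```python
# Illustrative only: a toy 2-of-3 cross-check over independent range
# estimates (in metres) from camera, radar and lidar for the same object.
def cross_check(estimates: dict[str, float], tolerance_m: float = 3.0):
    """Return the consensus estimate plus any sensors that disagree with it.
    With only one sensor there is nothing to vote against, so a faulty
    reading passes through unnoticed."""
    sensors = list(estimates)
    agrees_with = {
        s: [t for t in sensors
            if abs(estimates[s] - estimates[t]) <= tolerance_m]
        for s in sensors
    }
    # The sensor that agrees with the most peers defines the consensus.
    best = max(sensors, key=lambda s: len(agrees_with[s]))
    outliers = [s for s in sensors if s not in agrees_with[best]]
    return estimates[best], outliers

consensus, suspect = cross_check({"camera": 62.0, "radar": 60.5, "lidar": 118.0})
print(consensus, suspect)  # 62.0 ['lidar'] -> the lidar reading is flagged
```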

Tesla did not respond to requests for comment. The company notoriously did away with most of its public relations department last year and has refrained from responding to journalists’ enquiries since.
