Opinion

Self-driving cars require a lot of real-world testing, much of which needs to happen on public roads. (David Paul Morris/Bloomberg)

It might not be too ludicrous to describe current users of Tesla’s controversial Full Self-Driving (FSD) system as lab rats.

The company began rolling out a beta version of FSD to a larger group of customers early last week. The version currently being deployed appears rough and unfinished, and despite not actually being “fully self-driving,” more vehicles equipped with the feature will soon be on city streets and busy highways, around unsuspecting cyclists, pedestrians and other road users. So, in a sense, maybe we’re all a bit like lab rats in this test.

In an email to owners, Tesla said cars with the new FSD Beta 10.2 software “may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road.” So far, only owners who request to join the beta-testing program and then achieve a perfect safety score, as determined by Tesla, can use the experimental new feature. It doesn’t appear to be available in Canada yet, and the company did not respond when asked if and when it plans to roll out the latest FSD beta north of the border.

If or when that happens, we should remember that the roads are ours, and if Tesla wants to use them as a testing ground for its dubious new technology, the company should be required to show us – and regulators – that it’s reasonably safe.

Videos posted online show Tesla’s self-driving system is at times competent, occasionally clever, but also shaky, confused and sometimes even dangerous. In one instance, the car swerves toward an oncoming pedestrian for no apparent reason. Drivers frequently have to step in to save the car from itself. Earlier FSD beta participants had to sign a non-disclosure agreement and were encouraged to share fewer videos, as Vice News reported.

None of this takes away from the fact that Tesla has, for most of the past decade, been making the best electric vehicles on the road. The company almost singlehandedly made EVs desirable. It’d be a shame to squander that reputation trying to make FSD happen.

Self-driving cars require a lot of real-world testing, much of which needs to happen on public roads. Other companies working on highly automated vehicles – like Waymo, Aurora and General Motors subsidiary Cruise – typically hire teams of trained, paid drivers whose job is to test and monitor experimental cars. And even those drivers can make fatal mistakes, as was the case in 2018 when an Uber test vehicle struck and killed a pedestrian.


The problem is well documented; it’s difficult to pay attention to the road in a car that appears to be driving itself. Drivers using Tesla’s Autopilot looked at the road less, according to a study conducted by researchers at the Massachusetts Institute of Technology and published earlier this year. So, using regular customers to beta test your even-more-ambitious self-driving system – one meant for use in crowded cities – is a strange choice. In fact, it sounds like a recipe for disaster.

Customers are understandably excited to take FSD for a spin, since some paid $10,000 or more for the much-hyped optional feature and have been waiting a long time to use it. “All you will need to do is get in and tell your car where to go,” the company claims on its website, beneath a whole host of caveats. Some owners have been so eager to try out the beta that they’ve been coasting through stop signs and accelerating through yellow lights just to avoid hard braking and earn a perfect safety score from Tesla, Consumer Reports noted, which is a whole other problem.

It’s no surprise, then, that top U.S. safety officials are concerned about the FSD rollout. California’s Department of Motor Vehicles is reviewing whether the company’s use of the term “full self-driving capability” violates state regulations. The National Highway Traffic Safety Administration ordered Tesla to provide information about its FSD software. And Jennifer Homendy, head of the National Transportation Safety Board, recently told The Wall Street Journal that Tesla should address existing safety deficiencies with the company’s technology before rolling out a city-driving feature, which is included in the FSD beta. She also “expressed concern about how Tesla software is tested on public roadways.”

As pointed out in an academic article published by legal-news site Jurist, the problem is that Tesla wants to have it both ways. When convenient, the authors note, Tesla promotes the view that the FSD beta is merely a partially automated driving system, meaning that a human driver is still primarily in control. (Such systems are classified as SAE level 2 under the industry’s agreed-upon taxonomy for automated-vehicle technology.) If FSD is ultimately intended as such, the feature shouldn’t be called “Full Self-Driving,” and the company should probably offer refunds to customers who’ve been misled by the marketing or Elon Musk’s comments. Or, if FSD is intended as a more advanced conditionally or highly automated driving system – SAE level 3 or 4 – it should be subject to much more regulatory scrutiny, given what the technology is promising to do.

Whatever FSD is supposed to be, there’s no way Tesla should be testing a half-baked version by giving it to customers to play with on public roads, where the stakes are life and death. It’s crazy this even has to be spelled out. Meanwhile, Elon Musk has tweeted that more customers could get access to the beta this week.
