Tesla has issued a recall of more than 362,000 vehicles to update its Full Self-Driving (FSD) Beta software, an advanced driver-assistance system. The company made the decision after federal safety regulators warned on Thursday, February 16, 2023, that the system could allow the vehicles to act unsafely around intersections, which could lead to crashes.
According to a statement from the National Highway Traffic Safety Administration (NHTSA), the software allows vehicles to “exceed speed limits or travel through intersections in an unlawful or unpredictable manner, increasing the risk of a crash.”
Tesla will address the problem by releasing a free over-the-air (OTA) software update. The EV maker said it has no record of deaths or injuries related to the recall issue, though it has recorded 18 warranty claims.
As standard, Tesla vehicles come with a driver-assistance system called Autopilot. Customers can install FSD for an additional $15,000, a feature CEO Elon Musk has promised will one day offer fully autonomous driving capabilities.
Vehicles affected by the recall include 2016-2023 Model S and Model X, 2017-2023 Model 3, and 2020-2023 Model Y vehicles equipped with FSD Beta software or pending its installation. Thousands of Tesla customers use FSD Beta.
Although Tesla complied with the NHTSA recall recommendation, the automaker said it did not agree with the agency’s analysis. The move is a rare federal intervention in a real-world testing program that Tesla views as vital to the development of self-driving cars.
Tesla noted that in “certain rare circumstances… the feature could potentially infringe on local traffic laws or customs while executing certain driving maneuvers.”
The recall comes as a blow ahead of Tesla’s March 1 investor day. At the event, Musk is expected to promote the company’s artificial intelligence capabilities and highlight plans to expand its vehicle lineup.
NHTSA is investigating close to a million vehicles
While Tesla’s FSD is not fully autonomous, it offers several features: Navigate on Autopilot, an active guidance system that steers the car from a highway on-ramp to off-ramp; the Summon parking feature; and automatic lane changes. The system is also designed to recognize stop signs and traffic lights and react to them.
In 2021, NHTSA opened an investigation into 830,000 Tesla vehicles equipped with the Autopilot driver-assistance system following a string of crashes with parked emergency vehicles. The agency is examining whether the EVs have mechanisms that ensure drivers are paying attention.
Despite the recall, NHTSA said its “investigations into Tesla’s Autopilot and associated vehicle systems remains open and active because the recall doesn’t address the full scope of the NHTSA’s EA22-002 investigation into Tesla’s Autopilot and associated vehicle systems.”
According to NHTSA, the potential risks observed with FSD include traveling straight through an intersection while in a turn-only lane, entering a stop sign-controlled intersection without coming to a complete halt, and proceeding through an intersection with a steady yellow traffic signal without proper caution.
“The system may respond insufficiently to changes in posted speed limits or not adequately account for the driver’s adjustment of vehicle’s speed to exceed posted speed limits,” NHTSA said.
NHTSA discovered the problem during engineering analysis and testing of the system. The agency found that Tesla’s Autosteer on City Streets feature posed an unreasonable risk to vehicle safety due to its selective adherence to traffic safety laws.
In 2022, Tesla issued a similar recall of 54,000 US vehicles with FSD Beta software because it allowed some models to perform “rolling stops” instead of coming to a complete halt at certain intersections, which NHTSA said was a potential safety risk.
Experts warn against trusting self-driving cars
Experts have long warned about the potential dangers of self-driving cars and continue to urge users to stay alert behind the wheel. U.S. Transportation Secretary Pete Buttigieg echoed this message following the latest Tesla recall.
“There is enormous safety potential in the future of some of these automated driving technologies,” Buttigieg said on Yahoo Finance Live. “But any technology that is on the market today is something that is designed to supplement—not replace—your attention as a safe driver. I am concerned about any scenario where any driver thinks otherwise.”
Tesla made its full self-driving software widely available to drivers in late 2022, although the software had been in beta testing well before that. Tesla is not the only automaker pushing toward autonomous driving technology; General Motors and Mercedes-Benz are also working aggressively toward it.
“We are a long way off from automated vehicles, where you can sit back, take a nap, or read the paper while it takes you from point A to point B,” said Buttigieg.
The same sentiments were shared by Missy Cummings, an engineering professor at George Mason University and former NHTSA advisor, in a piece published by The New York Times a day before Tesla issued the recall.
Cummings worried that people could become too comfortable in self-driving vehicles and start “over-trusting the technology.” For years, she has advocated stricter regulation of autonomous vehicles.
Her views on self-driving vehicles have led to several clashes with Musk on Twitter. The Tesla CEO once said she was “extremely biased against Tesla.” Cummings, however, insists she is simply against the irresponsible deployment of new technologies.
“[People] are letting the cars speed, and they are getting into accidents that are seriously injuring them or killing them,” Cummings said. “The technology is being abused by humans. We need to put in regulations that deal with this.”
From July 2021 to May 2022, the United States recorded close to 400 crashes involving vehicles equipped with self-driving technology, according to NHTSA, and 273 of them involved Tesla vehicles. At least six deaths and five serious injuries were recorded in those crashes. Federal regulators continue to monitor the safety of the technology.