Tesla recalls 2 million cars with ‘insufficient’ Autopilot safety controls

Tesla is recalling more than 2 million vehicles to fix Autopilot systems that U.S. safety regulators determined did not have enough controls to prevent misuse, the largest recall of Tesla’s driver-assistance software to date.

The National Highway Traffic Safety Administration said Tesla’s method of ensuring drivers are still paying attention while the driver-assistance system is activated is “insufficient.”

“There may be an increased risk of a crash,” the agency wrote, in some situations when the system is engaged “and the driver does not maintain responsibility for vehicle operation and is unprepared to intervene as necessary or fails to recognize when Autosteer is canceled or not engaged.”

NHTSA said Tesla will send out a software update to fix the problems affecting its 2012-2023 Model S, 2016-2023 Model X, 2017-2023 Model 3, and 2020-2023 Model Y vehicles, effectively encompassing all Tesla vehicles equipped with Autopilot on U.S. roads.

Tesla did not immediately respond to requests for comment early Wednesday.

In a statement this week responding to a Washington Post report that detailed the company’s failure to limit Autopilot to locations and conditions for which it was designed, Tesla said it has a “moral obligation” to continue improving its safety systems, while adding that it’s “morally indefensible” to not make these features available to a wider set of consumers.

“Tesla looks forward to continuing our work with them toward our common goal of eliminating as many deaths and injuries as possible on our roadways,” reads the company’s post on X, the platform formerly known as Twitter.

The story reported Tesla’s acknowledgments, based on user manuals, legal documents and statements to regulators, that the key Autopilot feature called Autosteer is “intended for use on controlled-access highways” with “a center divider, clear lane markings, and no cross traffic.” Despite that, drivers managed to activate Autopilot in locations other than those intended for the software — at times with deadly consequences.

In its recall notice, NHTSA said “Autosteer is designed and intended for use on controlled-access highways when the feature is not operating in conjunction with the Autosteer on City Streets feature,” a more advanced version known as Full Self-Driving.

“In certain circumstances when Autosteer is engaged, the prominence and scope of the feature’s controls may not be sufficient to prevent driver misuse of the SAE Level 2 advanced driver-assistance feature,” the recall notice said.

Tesla typically addresses NHTSA software recalls through remote updates, meaning the vehicles would not have to be returned to service centers to meet the agency’s requirements.

The investigation will remain open “to support an evaluation of the effectiveness of the remedies deployed by Tesla,” NHTSA said.

The Washington Post report revealed at least eight fatal or serious wrecks involving Tesla Autopilot on roads where the driver assistance software could not reliably operate, according to an analysis of two federal databases, legal records and other public documents.

The recall comes after a years-long investigation into crashes while the Autopilot system was activated. According to a timeline released by NHTSA, Tesla cooperated with repeated inquiries starting in August 2021, concluding in a series of meetings in early October 2023. In those meetings, Tesla “did not concur” with the agency’s safety analysis but proposed several “over-the-air” software updates to address the issue.

When Autopilot is activated, the driver is still considered the “operator” of the vehicle. That means they remain responsible for the vehicle’s movement, must keep their hands on the steering wheel, and must stay attentive to their surroundings at all times, ready to brake.

In a related safety recall report, NHTSA said the risk of collision can increase if the driver fails to “maintain continuous and sustained responsibility for the vehicle,” or fails to recognize when Autopilot turns off.

The software update, which was to be deployed on “certain affected vehicles” starting Tuesday, Dec. 12, will add extra controls and alerts to “encourage the driver to adhere to their continuous driving responsibility,” the recall report said. The update will also include controls that prevent Autosteer from engaging outside of areas where it is supposed to work, and a feature that can suspend a driver’s Autosteer privileges if they repeatedly fail to stay engaged at the wheel.

The company’s stock fell 1.1 percent in premarket trading, even as broader stock market indexes were up slightly.


