This week, Tesla committed to rolling out a remote software update to 2 million vehicles, designed to keep drivers more attentive when Autopilot is engaged, especially on surface roads where the driver-assistance technology may fail to detect cross traffic and other hazards. But the sweeping recall, the largest in Tesla's 20-year history, drew criticism from experts and lawmakers.
Matthew Wansley, a Cardozo School of Law professor specializing in emerging automotive technologies, expressed disappointment, stating, "I haven't seen Tesla or its defenders provide a compelling argument for allowing Autopilot on roads with potential cross traffic. Many of these accidents stem from this issue." Senator Richard Blumenthal, a regular Tesla critic, echoed this sentiment, calling the update "far from sufficient."
The recall comes more than two years after the National Highway Traffic Safety Administration (NHTSA) opened an investigation into Autopilot following a series of incidents in which Teslas collided with parked emergency vehicles. NHTSA reviewed more than 900 crashes involving Autopilot and concluded that the Autosteer feature lacks adequate controls to prevent misuse outside its intended use on controlled-access highways.
Despite Tesla's disagreement with the agency's findings, the company initiated remote software updates starting on Tuesday, as confirmed by the NHTSA.
Blumenthal emphasized that regulators should have mandated more substantial software changes given the history of crashes involving Tesla's Autopilot. Just days before the recall, The Washington Post published an investigation that uncovered eight serious or fatal incidents on roads where Autopilot was not intended for use. Tesla itself has acknowledged in various documents and communications with federal regulators that Autosteer is designed solely for "controlled-access highways" with specific road conditions.
Expressing concern about relying on self-regulation, Blumenthal criticized Tesla's approach to recalls and Elon Musk's public comments, saying the company treats recalls more as entertainment than enforcement. He called for more proactive measures to prevent accidents, beyond voluntary compliance.
Some officials and lawmakers voiced apprehension that NHTSA might have been hesitant to take firmer action against Tesla because of its devoted consumer base and significant influence in the shift toward electric vehicles, a priority for the Biden administration. With NHTSA's investigations still open, Tesla critics hope the recall will not mark the end of regulatory action.
NHTSA's spokeswoman, Veronica Morales, stressed Tesla's legal responsibility to offer a free remedy that comprehensively addresses the safety issue. Tesla, however, did not respond to requests for comment.
The Post's investigation revealed that Autopilot could be engaged on various roads, including those with intersections, stoplights, and cross traffic, leading to fatal accidents. As part of the recall, Tesla committed to issuing a software update with enhanced controls and alerts for activating features outside designated highways, but the company didn't specify restricting the technology to its intended operational design domain (ODD).
Critics, like Dan O'Dowd from the advocacy group the Dawn Project, argued that the recall failed to tackle the underlying safety defects in Tesla's self-driving software, suggesting that banning the flawed software would be a more effective solution than increased monitoring.
Jennifer Homendy, chair of the National Transportation Safety Board (NTSB), expressed some satisfaction that NHTSA is taking action, albeit seven years after the first known Autopilot fatality.
"While it's positive to see action taken, it's essential to acknowledge lives lost during this time," Homendy emphasized. She raised concerns about verifying the effectiveness of the changes made during a voluntary recall and questioned the process of ensuring compliance.
NHTSA's Morales indicated that the agency plans to assess multiple Tesla vehicles in Ohio to gauge the effectiveness of the remedies implemented.
Tesla and Elon Musk have argued against labeling software-update fixes as "recalls," calling the term outdated. Past recalls, however, have effectively forced updates that might not have occurred otherwise. Despite a brief dip in Tesla's stock after the recall announcement, investors concluded that the recall would not significantly affect Tesla's business, and the stock rose more than 4 percent by week's end.
Gene Munster from Deepwater Asset Management doesn't anticipate this recall deterring Tesla from aggressively pursuing Musk's vision of full autonomy. He believes people will continue using Autopilot and doubts that NHTSA's notifications will noticeably improve road safety or impede Tesla's ambitious goals.
Representative Anna G. Eshoo, whose district includes Tesla's engineering headquarters, described the recall as "jaw-dropping." She noted that even if it mainly adds extra notifications, it serves the purpose of alerting drivers to the limited autonomy of Autopilot.
Homendy highlighted NTSB's consistent findings of problems with Tesla's approach to driver assistance, particularly in fatal crashes involving Autopilot in various locations. NTSB recommended action as far back as 2017 to prevent Autopilot from engaging outside its intended conditions, and Homendy expressed skepticism that warning systems or precautionary checks alone can address the problem. While other automakers also offer driver-assistance software, Tesla's Autopilot-related crashes have drawn continuous scrutiny from federal agencies.
"We've consistently identified problems with Tesla," Homendy concluded, highlighting the distinctive challenges surrounding Tesla's technology compared to other advanced driver-assistance systems investigated by NTSB.
This article is partly excerpted from Reuters.