Tesla is issuing a recall of the Autopilot software in more than 2 million vehicles.
Tesla's renowned technological prowess took a hit recently as the company, compelled by regulatory pressure, issued a recall affecting over two million vehicles. U.S. officials indicated Tesla had not sufficiently ensured driver attentiveness while utilizing a system capable of steering, accelerating, and braking cars automatically.
This marked the fourth recall by Tesla in less than two years, encompassing nearly all U.S.-manufactured vehicles since 2012, including their popular Model Y SUV. Despite Tesla's significant market share in electric vehicles, competition from General Motors, Hyundai, Ford, and others has eroded its dominance. Additionally, controversial public remarks by CEO Elon Musk, widely interpreted as offensive, have unsettled some customers, further denting the company's image.
According to Gary Black of the Future Fund, who typically holds a positive view of Tesla, the brand has undoubtedly suffered this year. The recall stems from an inquiry into Tesla's driver-assistance system, Autopilot, initiated by the National Highway Traffic Safety Administration following accidents, some fatal, involving the technology. Autopilot is meant to autonomously control vehicles on highways, with Tesla's manuals advising drivers to remain vigilant and ready to intervene.
Regulators' concerns center on Tesla's alleged failure to prevent misuse of the system, such as activating it on local roads or becoming distracted in the belief that the car could drive itself. Autopilot encompasses various features that allow the car's technology to assume control to different extents. The recalled feature, Autosteer, keeps a car in its lane without driver input, creating a risk of crashes if drivers stop paying attention.
To address the issue, Tesla plans to send wireless updates to its cars, introducing more conspicuous visual alerts and checks when Autosteer is engaged to remind drivers to remain attentive and keep their hands on the wheel. Elon Musk did not respond to requests for comment.
Tesla contested the agency's evaluation of its system, maintaining that Autopilot makes its vehicles safer. Pete Buttigieg, the transportation secretary who oversees the auto safety agency, stressed that manufacturers are responsible for preventing foreseeable misuse of their technology, likening the situation to a car whose airbags fail to deploy: the car must be recalled regardless of its other safety features.
Tesla defended its stance on the X platform, emphasizing the moral imperative of expanding access to these systems based on data demonstrating their life-saving and injury-preventing capabilities. However, doubts about Autopilot's safety benefits persist among some experts. Philip Koopman, an associate professor at Carnegie Mellon University specializing in self-driving software, suggested that the primary safety enhancement in vehicles comes from widespread features like automatic emergency braking, not Autopilot, which he labeled as a convenience feature rather than a safety measure.
The ongoing investigation by regulators reflects a broader struggle between government oversight and companies developing self-driving technologies. This tussle gained attention when California regulators compelled Cruise, a subsidiary of G.M., to halt its driverless taxi service following several traffic incidents, including one involving a pedestrian dragged by a Cruise car after a collision. Buttigieg highlighted concerns about companies rushing to deploy self-driving tech before ensuring its safety, partly driven by the alarming number of highway accidents and fatalities.
Traffic-related deaths declined 4.5 percent in the first nine months of the year compared with the same period a year earlier, but fatalities still exceed 2013 levels by about 6,000, with more than 30,000 lives lost.
"The culture within many of these companies seems fixated on reaching the envisioned destination quickly. The rationale often revolves around the notion that the sooner they develop and deploy this technology, the better off everyone will be, given the questionable safety record of human drivers," he commented.
The National Highway Traffic Safety Administration initiated an investigation in August 2021 into 11 incidents involving Tesla vehicles operating with Autosteer engaged. Subsequent meetings between the agency and Tesla resulted in the company voluntarily conducting a recall this month.
Throughout the investigation, the safety agency scrutinized 956 crashes involving Autopilot engagement, narrowing its focus to 322, including frontal collisions and scenarios where Autopilot might have been inadvertently activated.
Tesla began sending wireless software updates to certain vehicles this week, with plans to update the remaining ones later. Leveraging cellular networks, Tesla routinely updates car software, often overnight while the cars are parked. Depending on a vehicle's hardware, the updates will introduce more conspicuous visual alerts and additional checks during Autosteer use. Notably, the feature may be suspended for drivers who repeatedly fail to use it responsibly.
Letters notifying Tesla owners of the update are slated for mailing in February.
Tesla's recent recall adds to the growing public scrutiny of the automaker. In October, a California jury ruled that the company's driver-assistance software wasn't responsible for a fatal crash, but similar cases are undergoing litigation nationwide. One such case involves a 2019 Florida crash where a Tesla operating on Autopilot failed to stop at a stop sign, resulting in a fatality and severe injuries.
According to attorney Todd Poses, representing the family involved in the Florida crash, the recall underscores Tesla's awareness of Autopilot's usage on unsafe roads and its failure to restrict its activation.
Survivor Dillon Angulo, who suffered severe injuries, condemned the technology's safety and urged its removal from roads.
Tesla has encountered prior recalls: China ordered a recall of 1.1 million vehicles due to acceleration and braking issues, while U.S. regulators flagged Tesla's Full Self-Driving system for increasing accident risks. Concerns arose regarding the system allowing vehicles to exceed speed limits and navigate intersections in unpredictable ways.
Earlier, Tesla recalled 54,000 cars to disable a feature that allowed slow rolling through intersections. Though sold separately, Full Self-Driving and Autopilot share foundational technologies. The latest recall aims to alert drivers when they use Autopilot outside its intended operating areas, but whether the system will still allow activation in those contexts remains unclear.
Legal expert Matthew Wansley acknowledges the NHTSA's focus on critical issues but emphasizes that the effectiveness of these actions hinges on specifics and implementation.
This article is partly excerpted from NPR.