
Tesla Under Scrutiny Over Lack of Recall After Autopilot Update


Business | U.S. regulator questions Tesla on the lack of a recall after an update to Autopilot.

https://www.nytimes.com/2021/10/13/business/tesla-autopilot-recall-safety.html

Teslas operating with Autopilot, a driver-assistance system, have sometimes failed to stop for emergency vehicles that had their lights flashing.
Credit: Roger Kisby for The New York Times

Oct. 13, 2021, updated 4:55 p.m. ET

The top federal auto safety regulator sent two letters to Tesla this week raising questions about the company’s driver-assistance software systems and instructing the carmaker to provide fuller information.

The regulator, the National Highway Traffic Safety Administration, is looking into why Tesla did not issue a recall last month when it updated software called Autopilot to improve its ability to spot stopped emergency vehicles such as police cars and fire trucks.

The agency also ordered Tesla to provide data about the software that the company calls Full Self-Driving and expressed concern that Tesla may be preventing customers from sharing safety information with the agency.

The moves suggest that NHTSA is taking a closer look at Tesla’s driver-assistance features and the gap between their names and their abilities.

“I appreciate now that NHTSA is taking some steps forward, but it should have happened before,” Jennifer Homendy, chair of another federal agency, the National Transportation Safety Board, said in a recent interview. “It needs to happen more quickly, because otherwise you risk people’s lives.”

The safety board investigates the causes of automobile, train, airplane and other transportation accidents but has no regulatory power over manufacturers, as NHTSA does.

Concern about Autopilot — a system of cameras and other sensors that can steer, brake and accelerate with little input from a driver — has been growing because the technology sometimes fails to detect objects or other vehicles. Despite its name, Autopilot does not enable autonomous driving, and Ms. Homendy’s agency has said the technology lacks safeguards to ensure that drivers remain alert and in control.

Full Self-Driving is a more advanced system that Tesla has allowed a small set of owners to test on public roads. But it, too, is not able to pilot a car without active engagement by a human driver.

In August, NHTSA opened a formal investigation into 12 crashes in which Tesla cars operating in Autopilot mode failed to detect stopped emergency vehicles that had their lights flashing in low light. One accident killed a passenger. Other Autopilot crashes have accounted for 10 deaths since 2016, according to data compiled by NHTSA.

Tesla and its chief executive, Elon Musk, have said Autopilot is not flawed, insisting that it makes cars much safer than others on the road, and they have dismissed criticism of the company’s design process. But NHTSA is now questioning whether Tesla’s software refinements sidestep regulatory scrutiny.

Normally, automakers issue recalls and owners take their cars to dealers for repairs or updates. But Tesla can modify its cars by sending them software updates over the internet.

In a letter on Tuesday, NHTSA reminded Tesla that federal law requires automakers to initiate formal recalls if they find defects that pose a safety risk, so that both owners and NHTSA are informed of the fixes.

“Any manufacturer issuing an over-the-air update that mitigates a defect that poses an unreasonable risk to motor vehicle safety is required to timely file an accompanying recall notice to NHTSA,” the agency said in one letter to Tesla.

NHTSA told the company to provide detailed information on a software update, sent in late September, that modified Autopilot and enhanced its ability to detect emergency lights.

The letter told Tesla to state whether it intends to issue a recall related to the update and, if not, any legal or technical reasons that it declines to do so.

That letter was sent by Gregory Magno, the chief of NHTSA’s vehicle defects division in its office of defects investigation, to Eddie Gates, Tesla’s director of field quality.

In a separate letter to a senior Tesla legal officer, NHTSA ordered the company to disclose the number of owners who have been given Full Self-Driving software as part of a beta test, to provide copies of any nondisclosure agreements it has had those testers sign and to explain whether the terms would prevent owners from reporting any safety concerns to NHTSA.

Because consumers are an important source of information to the agency, “any agreement that may prevent or dissuade participants in the early-access beta release program from reporting safety concerns to NHTSA is unacceptable,” the agency said. “Moreover, even limitations on sharing certain information publicly adversely impacts NHTSA’s ability to obtain information relevant to safety.”

Tesla did not respond to emails requesting comment for this article.
