U.S. opens probe into Tesla’s Autopilot over emergency vehicle crashes By Reuters

© Reuters. FILE PHOTO: The Tesla logo is seen in Taipei, Taiwan, Aug. 11, 2017. REUTERS/Tyrone Siu

By David Shepardson and Hyunjoo Jin

WASHINGTON (Reuters) – U.S. auto safety officials said Monday they have opened a formal safety investigation into Tesla Inc’s (NASDAQ:TSLA) Autopilot driver assistance system following a series of crashes involving emergency vehicles.

The National Highway Traffic Safety Administration (NHTSA) said it had identified 11 crashes since January 2018 in which Tesla models “encountered first responder scenes and subsequently struck one or more vehicles involved with those scenes.”

After an investigation, the NHTSA could decide to take no action or could demand a recall, which might effectively limit how, when and where Autopilot operates. Any restrictions could narrow the competitive gap between Tesla’s system and similar advanced driver assistance systems offered by established automakers.

The auto safety agency said it had reports of 17 injuries and one fatality in those crashes.

Tesla shares fell 3.6% on news of the investigation.

The company did not immediately respond to a request for comment. Chief Executive Elon Musk has repeatedly defended Autopilot, tweeting in April that a “Tesla with Autopilot engaged” is now ten times less likely to have an accident than the average vehicle.

NHTSA said four of the 11 crashes took place this year, most recently last month in San Diego, and it opened a preliminary evaluation of Autopilot in 2014-2021 Tesla Models Y, X, S and 3.

“The involved vehicles were all confirmed to have been engaged in either Autopilot or Traffic Aware Cruise Control during the approach to the crashes,” the NHTSA said in a document opening the investigation.

The investigation covers an estimated 765,000 Tesla vehicles in the United States, the agency said.

AFTER DARK

The NHTSA has sent dedicated crash investigation teams to a number of Tesla crashes in recent years.

Most of the 11 crashes occurred after dark, and the crash scenes encountered included control measures such as emergency vehicle lights, flares or road cones.

The NHTSA said its investigation will “assess the technologies and methods used to monitor, assist, and enforce the driver’s engagement with the dynamic driving task during Autopilot operation.”

Before the NHTSA can demand a recall, it must first decide to upgrade a preliminary evaluation to an engineering analysis. The two-step investigation process often takes a year or more.

Autopilot, which handles some driving tasks and allows drivers to keep their hands off the steering wheel for extended periods, was in use in at least three Tesla vehicles involved in fatal U.S. crashes since 2016, according to the National Transportation Safety Board (NTSB).

The NTSB has criticized Tesla’s lack of system safeguards for Autopilot, and the NHTSA’s failure to ensure the system’s safe use.

In February 2020, Tesla’s director of autonomous driving technology, Andrej Karpathy, identified a challenge for its Autopilot system: how to recognize when the flashing emergency lights of a parked police car are turned on.

“This is an example of a new task that we would like to know about,” Karpathy said at a conference.

In one of the crashes, a doctor was watching a movie on a phone when his vehicle rammed a state trooper’s vehicle in North Carolina.

MAIN CONCERNS

Bryant Walker Smith, a law professor at the University of South Carolina, said the crashes into parked emergency vehicles “really vividly and even tragically illustrate” some of the key concerns about the Tesla system. He said it invites driver complacency and fails in some atypical circumstances.

NHTSA, he suggested, has been “far too deferential and timid, particularly toward Tesla.”

One of the 11 crashes cited by the NHTSA was a January 2018 collision with a parked fire truck in Culver City, California. The NTSB said the system’s design “permitted the driver to disengage from the driving task” in that crash.

The NHTSA said Monday that since 2016 it has sent teams to review 31 Tesla crashes involving 10 deaths in which advanced driver assistance systems were suspected of being in use. It has ruled out use of the systems in three of those crashes.

In a statement, the NHTSA reminded drivers that “no commercially available motor vehicles today are capable of driving themselves … advanced driving assistance features on motor vehicles must be used correctly and responsibly by drivers.”

Tesla and CEO Musk have tangled with U.S. authorities over a variety of safety issues over the years.

In February, after U.S. auto safety officials requested the action, Tesla agreed to recall 134,951 Model S and Model X vehicles with touchscreen displays that could fail and raise the risk of a crash.

NHTSA made the rare formal recall request to Tesla in January, saying other automakers had conducted numerous recalls for similar safety issues tied to touchscreen failures.

Musk said on Twitter last month that the automaker would hold Tesla AI Day on Thursday to “discuss progress with Tesla AI software and hardware, both training and inference. Purpose is recruiting.”

In January 2017, the NHTSA closed a preliminary evaluation of Autopilot covering 43,000 vehicles without taking any action, after a nearly seven-month investigation.

The NHTSA said at the time it had “identified no design or performance defects” in Autopilot, “nor any incidents in which the systems did not perform as designed.”

The NHTSA has not had a Senate-confirmed administrator since January 2017, and President Joe Biden has not nominated anyone for the post in his nearly seven months in office.
