Tesla Autopilot under investigation by US traffic agency after crashes with emergency vehicles

A US government agency is investigating Tesla after a series of accidents involving its so-called "Autopilot" self-driving technology
By Euronews and AP

US authorities have opened a formal investigation into Tesla's Autopilot self-driving tech after crashes involving emergency vehicles.


The US government has opened a formal investigation into Tesla's Autopilot partially automated driving system after a series of collisions with parked emergency vehicles.

The investigation covers 765,000 vehicles, almost everything that Tesla has sold in the US since the start of the 2014 model year. Of the crashes identified by the National Highway Traffic Safety Administration (NHTSA) as part of the investigation, 17 people were injured and one was killed.

NHTSA said it had identified 11 crashes since 2018 in which Teslas on Autopilot or Traffic Aware Cruise Control hit vehicles at scenes where first responders were using flashing lights, flares, an illuminated arrow board, or cones warning of hazards. 

The agency announced the action on Monday in a posting on its website.

The investigation covers Tesla's entire current model lineup, the Models Y, X, S and 3 from the 2014 through to 2021 model years.

The United States' National Transportation Safety Board (NTSB), which also has investigated some of the Tesla crashes, has recommended that NHTSA and Tesla limit Autopilot’s use to areas where it can safely operate.

The NTSB also recommended that NHTSA require Tesla to have a better system to make sure drivers are paying attention. NHTSA has not taken action on any of the recommendations. The NTSB has no enforcement powers and can only make recommendations to other federal agencies such as NHTSA.

Still frame from video provided by KCBS-TV shows a Tesla Model S electric car that crashed into a fire engine in California in 2018 (AP Photo)

Driver misuse

Autopilot has frequently been misused by Tesla drivers, who have been caught driving drunk or even riding in the back seat of their cars.

The agency has sent investigative teams to 31 crashes involving partially automated driver-assist systems since June 2016. 

Such systems can keep a vehicle centred in its lane and a safe distance from vehicles in front of it. Of those crashes, 25 involved Tesla Autopilot in which 10 deaths were reported, according to data released by the agency.

Tesla and other manufacturers warn that drivers using the systems must be ready to intervene at all times. Teslas using the system have crashed into semis crossing in front of them, stopped emergency vehicles and a roadway barrier.

AP contacted Tesla, which has disbanded its media relations office, for comment.

The crashes into emergency vehicles cited by NHTSA began on January 22, 2018 in Culver City, near Los Angeles, California, when a Tesla using Autopilot struck a fire engine that was parked partially in the travel lanes with its lights flashing. The crew were handling another crash at the time.

Since then, the agency said, multiple similar crashes have occurred across the United States.

The driver of this Tesla, which hit a fire truck in Utah in 2018, later sued the company (AP Photo)

"The investigation will assess the technologies and methods used to monitor, assist and enforce the driver's engagement with the dynamic driving task during Autopilot operation," NHTSA said in its investigation documents.

Reluctant to regulate

In addition, the probe will cover object and event detection by the system, as well as where it is allowed to operate. NHTSA says it will examine “contributing circumstances” to the crashes, as well as similar crashes.

An investigation could lead to a recall or other enforcement action by NHTSA.

"NHTSA reminds the public that no commercially available motor vehicles today are capable of driving themselves," the agency said in a statement. "Every available vehicle requires a human driver to be in control at all times, and all state laws hold human drivers responsible for operation of their vehicles".


The agency said it has "robust enforcement tools" to protect the public and investigate potential safety issues, and it will act when it finds evidence "of non-compliance or an unreasonable risk to safety".

In June, NHTSA ordered all car manufacturers to report any crashes involving fully autonomous vehicles or partially automated driver assist systems.

The measures show the agency has started to take a tougher stance on automated vehicle safety than in the past. 

It has been reluctant to issue regulations on the new technology for fear of hampering the adoption of potentially life-saving systems.

