Tesla 'forced itself' into collision with another car, driver in self-driving trial crash claims

According to the driver, the Tesla alerted them to the other vehicle part-way through the turn into the other lane. Copyright Priscilla Dupreez / Unsplash
By Tom Bateman with AP

A Tesla driver taking part in the beta test of the company's driver assist technology claimed their car 'forced itself' into a different lane, causing an accident earlier this month.


A Tesla driver in the United States has claimed that their vehicle's "Full Self-Driving" system caused a crash earlier this month, prompting an investigation by traffic safety authorities.

The driver, who was taking part in the feature's beta test, said their Tesla Model Y strayed into the wrong lane and was hit by another vehicle, according to a complaint filed with the US National Highway Traffic Safety Administration (NHTSA).

"The car went into the wrong lane and I was hit by another driver in the lane next to my lane," the driver wrote.

According to the driver's complaint, the Tesla alerted them to the other vehicle part-way through the turn into the other lane.

However, when the driver tried to turn the wheel to avoid a collision, the Tesla "forced itself into the incorrect lane, creating an unsafe manoeuvre putting everyone involved at risk," they said.

Beta testing

An NHTSA spokesperson told the Associated Press that the agency was aware of the crash, which took place on November 3, and was communicating with Tesla to learn more about the incident.

Tesla, which has disbanded its press office, did not respond to requests for comment.

Tesla was already under investigation for a series of accidents involving the company's Autopilot and emergency services vehicles. Laguna Beach Police Department/AP

The company began rolling out a new beta version of its FSD driver assist system in September, initially offering the software to Tesla owners with a high "safety score" via a button in the company's app.

The beta test ran into issues last month when Tesla had to roll back the release of FSD 10.3 after detecting safety issues in the software.

"Regression in some left turns at traffic lights found by internal QA in 10.3. Fix in work, probably releasing tomorrow," Tesla CEO Elon Musk tweeted at the time.

Critics of the move have said offering beta versions of self-driving software to untrained drivers is dangerous, as the software could contain flaws that put them and other road users at risk.

Beta testing is a field test of software done by users before the full commercial release is ready.

Tesla is already under investigation by the NHTSA after a series of crashes involving its cars and stationary emergency services vehicles in the US.

