Autopilot Feature Scrutinized After Three Die in Recent Tesla-Related Crashes

Three crashes involving Teslas have killed three people in recent weeks, raising questions over the safety of the electric cars’ Autopilot feature, just months before the company plans to put self-driving cars on the streets.

On Sunday, a Tesla Model S exited a freeway in the Los Angeles area, ran a red light and crashed into a Honda Civic, killing two people inside the Civic, the US National Highway Traffic Safety Administration (NHTSA) confirmed. On the same day, a Tesla Model 3 crashed into a parked fire truck on an Indiana freeway, killing a passenger in the Tesla. On December 7, a Tesla Model 3 collided with a police cruiser on a Connecticut freeway, though no one was injured in that incident.

According to a report by CBS, the NHTSA will investigate the California crash but has not decided whether it will investigate the Indiana crash. In both of those incidents, it is unclear if Tesla’s Autopilot mode was activated. 

In the Connecticut collision, which will also be investigated by the NHTSA, the driver in the Tesla told police the car was in Autopilot mode when the crash occurred. 

Tesla’s Autopilot system allows cars to self-park, change lanes and “navigate autonomously in certain conditions,” according to Tesla’s website.

Tesla has repeatedly stated that its Autopilot system is designed to assist drivers, not replace them. On its website, the company, led by CEO Elon Musk, writes, “Autopilot enables your car to steer, accelerate and brake automatically within its lane. Current Autopilot features require active driver supervision and do not make the vehicle autonomous.”

According to a report by The Verge, Tesla is expected to roll out autonomous cars for the general public by mid-2020, although Musk has also said the cars aren’t expected to be fully self-driving until the end of 2020 at the earliest.

However, the recent crashes have heightened concerns about the semi-autonomous driving feature.

“At some point, the question becomes: How much evidence is needed to determine that the way this technology is being used is unsafe?” Jason Levine, executive director of the nonprofit Center for Auto Safety in Washington, is quoted as saying by the Associated Press. “In this instance, hopefully these tragedies will not be in vain and will lead to something more than an investigation by NHTSA.”

The NHTSA is currently investigating at least 13 Tesla crashes dating back to 2016 in which Autopilot is believed to have been engaged. However, the agency has not established any regulations specific to the Autopilot feature.

Despite the Autopilot concerns, Tesla on Friday revealed that it delivered a record 112,000 vehicles during the fourth quarter of 2019. In addition, CNBC reports that shares of the company have increased in price by 49% over the last 12 months.

In November, Tesla’s “Smart Summon” feature also came under scrutiny after a video went viral showing a driverless Tesla Model 3 driving down the wrong side of the road in a shopping center parking lot in Richmond, British Columbia, Canada’s westernmost province.

The car’s Smart Summon feature was activated during the incident. The feature, launched in September, allows a Tesla owner to “summon” their car using the Tesla app on their phone; the car then drives itself to its owner from a maximum distance of 200 feet.
