NHTSA Releases Final Findings on Fatal Tesla Collision

January 25, 2017  | Greg Rogers

On January 19, the National Highway Traffic Safety Administration (NHTSA) Office of Defects Investigation (ODI) released the full findings of its investigation into the fatal Tesla collision in May of last year.

ODI launched the investigation in order to determine if there was a defect in the design and performance of Tesla’s Autopilot system at the time of the collision.

The highly publicized collision occurred when a Tesla Model S struck a tractor-trailer crossing its path at an uncontrolled intersection. A preliminary crash report released by the National Transportation Safety Board (NTSB) last July confirmed that Tesla’s signature Autopilot mode was engaged at the time of collision.

Autopilot is a suite of safety features that enable semi-autonomous driving, but the system requires its human operator to constantly monitor the road and be prepared to take over control of the vehicle.

Among the standard safety features of Autopilot is its automatic emergency braking (AEB) system, which is designed to prevent rear-end collisions – but not necessarily collisions with vehicles crossing its path.

Data obtained from the car showed that Autopilot was engaged at the time of the collision and that the AEB system neither warned the driver nor engaged the brakes before impact. Further, the driver did not brake, steer, or otherwise attempt to avoid the collision himself – suggesting he was either inattentive or unable to intervene.

For the purposes of this investigation, ODI analyzed four subjects:

  1. Automatic emergency braking system design and performance in the 2015 Tesla Model S and a peer vehicle (in this case a 2015 Mercedes C300 4Matic);
  2. Human-machine interface issues pertaining to Tesla’s Autopilot;
  3. Data from crash incidents related to Tesla’s Autopilot and AEB systems;
  4. Over-the-air software updates to Tesla’s Autopilot and AEB systems during the investigation.

Automatic Emergency Braking

After the May crash, NHTSA tested the automatic braking systems of a Tesla Model S and a 2015 Mercedes C300. The testing confirmed that automatic braking in both the Tesla and the peer vehicle could avoid crashes in most rear-end scenarios.

In fact, in most crash scenarios Tesla’s crash imminent braking was not needed because adaptive cruise control (ACC, a component in Autopilot) provided sufficient braking.

The AEB system used by Tesla through model year 2016 was designed solely to prevent rear-end collisions, not collisions with a vehicle turning left across the car's path.

Since the investigation found that Tesla’s AEB system is properly designed to avoid or mitigate rear-end collisions, the case was closed on investigating a defect in this system.

Moreover, it was not specifically intended to prevent crashes like this in the first place: “braking for crossing path collisions, such as that present in the Florida fatal crash, are outside the expected performance capabilities of the system.”

To AV or not to AV

In the wake of the collision, industry leaders and consumer protection advocates have butted heads over the deployment of semi-autonomous technologies.

On the one hand, industry leaders like Elon Musk, CEO of Tesla, have argued that semi-autonomous features should be deployed as they are developed in order to realize their life-saving potential as soon as possible.

Consumer Reports, meanwhile, has called for Tesla to disable Autopilot’s hands-free operation because it exposes drivers to danger and “gives consumers a false sense of security.” The organization cited research demonstrating that drivers may take between three and 17 seconds to regain control of a semi-autonomous vehicle after being alerted that the car is reverting to human control.

In light of this, Consumer Reports claims that human drivers tend to place too much trust in Autopilot’s ability to drive autonomously, and therefore are often unprepared to take over control at a moment’s notice – potentially resulting in a collision.

In essence, many automakers are rolling out automated safety features as they become available in order to realize their life-saving potential as soon as possible. For now, however, all of these features still require drivers to monitor the road and be prepared to assume control at any time.

NHTSA assessed data from Tesla Model S and X crashes involving airbag deployments when Autopilot was actively operating during the collision or within 15 seconds of the collision.

That being said, ODI found that many of the Tesla crashes it analyzed involved driver behavior factors: high speeds, mode confusion (when the driver did not understand whether Autopilot was engaged), and distraction.

However, ODI’s analysis of mode confusion issues “did not identify a pattern of failures indicating a potential design defect,” as Tesla undertook numerous steps to ensure drivers were aware of their vehicle’s limitations. Among them are:

  • The Tesla owner’s manual reads, “never depend on Autosteer to determine appropriate driving path… always be prepared to take immediate action.”
  • When Autopilot is engaged, the driver is reminded to pay attention to the road and keep their hands on the wheel.
  • Autopilot constantly monitors driver engagement, providing escalating warnings whenever the driver’s hands leave the wheel.
  • Finally, with a September 2016 Tesla update, drivers can “strike out” if they do not respond to alerts – this prevents them from activating Autopilot again until they end their trip and start a new one.

ODI acknowledged Tesla’s efforts to educate consumers, stating “it is important that operators recognize this responsibility and understand the capabilities and limitations of the system.”

What it means

Ultimately, the investigation did not identify any defects in the design or performance of the automated functions of Autopilot or its human-machine interface.

This decision is tremendously important to the deployment of automated features in vehicles in the coming years. The investigation went well beyond the scope of purely mechanical flaws, examining driver behavior and interactions with automated technology – directly addressing concerns surrounding driver attention.

Among the most important findings: deploying semi-autonomous technologies can present particular hazards when drivers are not paying attention, but overall the deployment of Tesla’s automated features has reduced Tesla vehicle crash rates by almost 40%.

Furthermore, as Danielle Muoio of Business Insider notes, this may mark the beginning of structural changes in NHTSA’s recall structure. In the course of the investigation, Tesla issued an over-the-air software update that alleviated some of the agency’s concerns about Autopilot.

While it would require a thorough review and revamp by USDOT and NHTSA under Secretary-designate Elaine Chao, it is possible that automakers may one day be able to address defects after recalls are issued by circulating over-the-air updates.

Until then, it remains to be seen whether the recall of the future will involve driving a car into a dealership – or even the car driving itself there.
