NTSB: Fatal Crash Involving Tesla Autopilot Resulted from Driver Errors, Overreliance on Automation
On May 7, 2016, Joshua Brown was driving his Tesla Model S, equipped with the company’s Autopilot driver-assistance system, when the car struck and passed under a tractor-trailer crossing his lane of traffic near Williston, Fla. Brown had set Autopilot’s speed at 74 mph, and neither the automated system nor Brown attempted to brake before the crash.
According to reports, Brown had not attempted to control the car for at least two minutes before the crash and had his hands on the wheel for only 25 seconds of the 37 minutes the car was on Autopilot.
In its report, the National Transportation Safety Board (NTSB) determined that the operational design of the Tesla’s vehicle automation permitted Brown’s overreliance on the automation, noting the design “allowed prolonged disengagement from the driving task and enabled the driver to use it in ways inconsistent with manufacturer guidance and warnings.”
As a result of its investigation, the NTSB issued seven new safety recommendations and reiterated two previously issued recommendations.
“While automation in highway transportation has the potential to save tens of thousands of lives, until that potential is fully realized, people still need to safely drive their vehicles,” said NTSB Chairman Robert L. Sumwalt III. “Smart people around the world are hard at work to automate driving, but systems available to consumers today, like Tesla’s ‘Autopilot’ system, are designed to assist drivers with specific tasks in limited environments.”
These systems, he added, require the driver “to pay attention all the time and to be able to take over immediately when something goes wrong. System safeguards, that should have prevented the Tesla’s driver from using the car’s automation system on certain roadways, were lacking and the combined effects of human error and the lack of sufficient system safeguards resulted in a fatal collision that should not have happened.”
Findings in the NTSB’s Report
The NTSB determined the Tesla’s automated vehicle control system was not designed to, and could not, identify the truck crossing the Tesla’s path or recognize the impending crash. Therefore, the system did not slow the car, the forward collision warning system did not provide an alert and the automatic emergency braking did not activate.
Brown’s pattern of use of the Autopilot system indicated an overreliance on the automation and a lack of understanding of the system limitations. “If automated vehicle control systems do not automatically restrict their own operation to conditions for which they were designed and are appropriate, the risk of driver misuse remains,” the report noted.
The way in which the Tesla “Autopilot” system monitored and responded to the driver’s interaction with the steering wheel was not an effective method of ensuring driver engagement, the agency added. Brown’s Tesla warned him seven times to place his hands on the wheel before the fatal crash, warnings that Brown allegedly ignored.
Tesla made design changes to its Autopilot system following the crash. The changes reduced the length of time a driver’s hands may be off the steering wheel before the “Autopilot” system issues an alert and added a preferred road constraint to the alert timing sequence.
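A rough sketch of the idea behind that timing change follows. The thresholds, names and road classification below are illustrative assumptions made for this article, not Tesla’s actual parameters or logic.

```python
# Illustrative sketch only -- not Tesla's implementation. The thresholds and
# the road classification below are assumptions invented for this example.

PREFERRED_ROAD_LIMIT_S = 15.0    # allowed hands-off time on preferred roads
OTHER_ROAD_LIMIT_S = 5.0         # tighter limit on non-preferred roads


def should_alert(hands_off_seconds: float, on_preferred_road: bool) -> bool:
    """Return True when a hands-on-wheel alert should be issued."""
    limit = PREFERRED_ROAD_LIMIT_S if on_preferred_road else OTHER_ROAD_LIMIT_S
    return hands_off_seconds >= limit


# Ten seconds with hands off the wheel triggers an alert on a non-preferred
# road, but not yet on a preferred (e.g., divided, limited-access) road.
print(should_alert(10.0, on_preferred_road=False))  # True
print(should_alert(10.0, on_preferred_road=True))   # False
```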
Fatigue, Mechanical Failure Ruled Out
Fatigue, highway design and mechanical system failures were not factors in the crash, according to the NTSB report. There was no evidence indicating the truck driver was distracted by cell phone use. While evidence revealed Brown was not attentive to the driving task, investigators could not determine the reason for his inattention.
Although the results of post-crash drug testing established that the truck driver had used marijuana before the crash, his level of impairment, if any, at the time of the crash could not be determined from the available evidence.
The NTSB issued a total of seven safety recommendations based upon its findings: one to the U.S. Department of Transportation (DOT), three to the National Highway Traffic Safety Administration (NHTSA), two to the manufacturers of vehicles equipped with Level 2 vehicle automation systems and one addressed to both the Alliance of Automobile Manufacturers and Global Automakers. The safety recommendations address the need for:
- Event data to be captured and available in standard formats on new vehicles equipped with automated vehicle control systems;
- Manufacturers to incorporate system safeguards that limit the use of automated control systems to the conditions for which they were designed, along with a method to verify those safeguards (a concept sketched in the example after this list);
- Development of applications to more effectively sense a driver’s level of engagement and alert when engagement is lacking;
- Manufacturers to report incidents, crashes and exposure numbers involving vehicles equipped with automated vehicle control systems.
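The safeguard recommendation above is essentially a call for an operational-design-domain check. A minimal sketch of that idea, assuming a simplified and entirely hypothetical set of operating conditions (no manufacturer’s real rules are represented), might look like this:

```python
# Hypothetical operational-design-domain check. The road types and conditions
# are invented for illustration and do not reflect any manufacturer's rules.
from dataclasses import dataclass


@dataclass
class DrivingConditions:
    road_type: str               # e.g. "limited_access_highway", "undivided_highway"
    cross_traffic_possible: bool


# Assumed design domain for this example: limited-access highways only.
DESIGNED_ROAD_TYPES = {"limited_access_highway"}


def may_engage(conditions: DrivingConditions) -> bool:
    """Permit engagement of the Level 2 system only inside the assumed design domain."""
    return (conditions.road_type in DESIGNED_ROAD_TYPES
            and not conditions.cross_traffic_possible)


# A road outside the assumed domain, or one where crossing traffic is
# possible, is refused under this illustrative safeguard.
print(may_engage(DrivingConditions("undivided_highway", cross_traffic_possible=True)))        # False
print(may_engage(DrivingConditions("limited_access_highway", cross_traffic_possible=False)))  # True
```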
The board reiterated two safety recommendations issued to NHTSA in 2013, dealing with minimum performance standards for connected vehicle technology for all highway vehicles and the need to require installation of the technology, once developed, on all newly manufactured highway vehicles. In January, NHTSA released a report indicating the crash was not the result of a defect in Tesla’s Autopilot system.
Automated Driving Systems (ADS): A Vision for Safety 2.0
On Sept. 12, the U.S. DOT and NHTSA released Automated Driving Systems (ADS): A Vision for Safety 2.0, new federal guidance on automated driving systems for industry and the states.
“The new guidance supports further development of this important new technology, which has the potential to change the way we travel and how we deliver goods and services,” said U.S. Transportation Secretary Elaine L. Chao. “The safe deployment of automated vehicle technologies means we can look forward to a future with fewer traffic fatalities and increased mobility for all Americans.”
A Vision for Safety 2.0 is voluntary guidance that “encourages best practices and prioritizes safety,” according to DOT. It calls on industry, state and local governments, safety and mobility advocates and the public to lay the path for the deployment of automated vehicles and technologies.
“In addition to safety, ADS technology offers important social benefits by improving access to transportation, independence and quality of life for those who cannot drive because of illness, advanced age or disability,” continued Chao.