A fatal Tesla Model S crash in May was not the result of any defect in the car’s Autopilot semi-autonomous driving system, the National Highway Traffic Safety Administration concluded Thursday.
Joshua Brown died in May when his Tesla Model S drove under a tractor-trailer truck that was turning left in front of him. The six-month investigation found that Brown had set the Autopilot system to a speed of 74 miles per hour before the crash and that neither the Autopilot system nor Brown made any attempt to brake before the deadly impact.
The Autopilot system, which automatically keeps the car in its lane and can brake on its own, was designed for use on limited-access highways with on- and off-ramps, according to Tesla’s owner’s manual. It was not designed for roads, such as the one on which Brown was driving, where cars or trucks can cross in front of the vehicle.
Elon Musk, Tesla’s chief executive, called the report “very positive.”
In its report, the agency noted the effectiveness of Autosteer, the component of Autopilot that steers the vehicle to keep it in its lane. It can also change lanes on its own when the driver signals the desire to do so by activating the turn signal on the highway.
In cars with Autosteer enabled, crash rates dropped by about 40%, according to NHTSA’s report, from 1.3 crashes per million miles driven to 0.8.
The federal agency has “learned a lot” from this investigation about how these sorts of technologies work, NHTSA spokesman Bryan Thomas said. Other automakers besides Tesla offer similar “driver assistance” systems. The federal auto safety agency will be looking more closely at how these systems are named and marketed, Thomas said, to make sure drivers aren’t being misled about the capabilities of the technology.
Tesla has faced criticism for calling its system Autopilot, with some safety groups saying the name implies that the car can drive itself. In fact, Thomas pointed out, systems like this are designed to work only in certain situations and with a fully attentive driver at the wheel.
Tesla has said the name Autopilot is not intended to imply full self-driving capability.
The agency will also be looking at how these systems monitor drivers to make sure they’re alert and how they warn drivers of the need to take control of the vehicle when it becomes necessary, he said.
In the months following Brown’s crash, Tesla updated its system to, among other things, warn drivers more aggressively if sensors indicate they have taken their hands off the steering wheel for too long. With the updated software, Autosteer becomes temporarily inoperable if the driver has to be warned three times to keep hands on the wheel.