- Tesla must provide the National Highway Traffic Safety Administration with extensive data about its driver assistance system, marketed as Autopilot, by October 22, 2021.
- NHTSA is trying to determine whether Tesla's Autopilot has a safety defect that caused Tesla vehicles to hit first-responder vehicles.
- The agency has the authority to mandate a recall if it determines a car, or any part or system within it, has a safety defect.
A 2019 Tesla Model 3 hit a Florida Highway Patrol car in Orlando on the morning of August 28, 2021. No injuries were reported. (Courtesy: Florida Highway Patrol)
The National Highway Traffic Safety Administration has added a 12th crash to the scope of its investigation into Tesla's Autopilot system, and is demanding that the company provide an exhaustive amount of data about its driver assistance systems by Oct. 22.
Autopilot is Tesla's driver assistance system that comes standard with all of its newer models. Tesla also sells a more advanced version under the brand name "Full Self-Driving," for $10,000 upfront or $199 a month by subscription in the U.S. Neither Autopilot nor FSD makes Tesla vehicles safe to operate without a driver at the wheel — the systems can control some aspects of the car, but "active driver supervision" is required, according to Tesla's website.
As CNBC previously reported, NHTSA's office of defects investigation kicked off a safety probe in August after the agency determined that Autopilot was in use before collisions between Tesla electric cars and first responder vehicles. Those prior crashes were responsible for 17 injuries and one fatality.
A more recent crash in Orlando, Florida, involving a Tesla Model 3 and a police car, is now part of the investigation. The Tesla in that incident narrowly missed hitting a trooper, and the driver told officers she was using the car's Autopilot feature at the time of the collision.