The National Highway Traffic Safety Administration (NHTSA) has initiated an investigation into Tesla, examining 2.4 million vehicles equipped with Full Self-Driving (FSD) software following a series of crashes in low visibility conditions, including one fatal incident.
NHTSA's Investigation
NHTSA is focusing on the following Tesla models: 2016-2024 Model S and Model X, 2017-2024 Model 3, 2020-2024 Model Y, and 2023-2024 Cybertruck. The investigation will assess how well the FSD software detects and responds to reduced-visibility conditions such as sun glare, fog, and airborne dust. If significant safety risks are confirmed, a vehicle recall could be required.
Tesla's FSD Limitations
Questions about FSD's safety center on its reliance on cameras to visually interpret road conditions. Jeff Schuster, vice president at GlobalData, noted that weather conditions can degrade camera performance. Tesla has previously stated that FSD requires active driver supervision and does not make its vehicles fully autonomous.
Tesla's Past Issues with NHTSA
Tesla has faced scrutiny over its autonomous technology before. In April 2024, a Tesla operating with FSD struck and killed a motorcyclist near Seattle, raising further concerns about the technology's safety. Earlier, in December 2023, Tesla voluntarily recalled over 2 million vehicles to enhance the safeguards in its driver assistance system.
These investigations and past incidents raise significant questions about the safety and reliability of autonomous driving technology. The outcome of the current investigation may shape the future development and regulation of autonomous vehicles.