Tesla is in hot water with the U.S. highway safety regulator for telling the public its cars can drive themselves, even though its owner's manuals and its private communications with the agency make clear that the vehicles need human supervision. The National Highway Traffic Safety Administration (NHTSA) has asked Tesla to rethink its public messaging so that it lines up with what the user manuals actually say.
The request came in an email from Gregory Magno, a division chief at NHTSA, as part of an investigation into Tesla’s “Full Self-Driving” (FSD) feature. According to the Associated Press, the probe was opened after a series of crashes in poor-visibility conditions such as fog, airborne dust, and sun glare, and it raises questions about the safety of Tesla’s partially automated driving technology.
Elon Musk has previously promised fully autonomous Teslas, even claiming that by next year the company will have driverless Model Y and Model 3 vehicles on the road. Tesla is also aiming to roll out robotaxis without steering wheels in California and Texas by 2026, but this latest investigation casts doubt on whether the company is actually ready to deliver full self-driving technology.
Critics, including U.S. Transportation Secretary Pete Buttigieg, have long argued that names like “Full Self-Driving” and “Autopilot” mislead users into thinking these systems are fully autonomous when they’re actually not.
A series of Tesla posts on X adds fuel to the fire. According to Magno, Tesla’s account has reposted content that shows drivers using FSD hands-free, which could give the impression it’s a self-driving system when, in reality, human oversight is required. In one example, Tesla reposted a story about a man using FSD to get to the ER during a heart attack, which could imply it’s safe to rely on FSD without much intervention.
Tesla’s website acknowledges that FSD and Autopilot cannot be used without human supervision until they achieve greater reliability and receive regulatory approval. Even so, one reposted video shows a man driving with his hands off the wheel and saying he is there only for legal reasons, which contradicts Tesla’s own instruction that drivers stay alert.
NHTSA’s investigation will examine whether FSD gives drivers enough real-time feedback to handle low-visibility conditions safely. Tesla has until December 18 to respond to the agency’s request for details on what alerts the system provides in such scenarios.
The investigation probably won’t be finished before Trump takes office in January, and with Elon Musk having spent over $100 million to support Trump’s campaign, safety advocates are concerned. They worry that if Musk gains any influence over the agency, ongoing probes into Tesla’s self-driving tech could be sidelined.