Tesla hits Musk’s threshold for ‘safe unsupervised’ driving
- by The Verge
- May 04, 2026
By Andrew J. Hawkins, transportation editor with 10+ years of experience covering EVs, public transportation, and aviation. His work has appeared in The New York Daily News and City & State.
We’ve crossed yet another one of Elon Musk’s self-driving thresholds. Tesla’s fleet of vehicles using the company’s Full Self-Driving (Supervised) system has driven over 10 billion miles, according to the company’s updated safety page. That means the company has crossed the line Musk set earlier this year for “safe unsupervised” driving.
But Tesla owners did not suddenly wake up today to find their FSD (Supervised) vehicles transformed into FSD (Unsupervised) ones. FSD is still just a Level 2 system that requires a fully attentive human driver behind the wheel to monitor the road and be prepared to take over at any moment.
In January, Musk said on X that “roughly 10 billion miles of training data is needed to achieve safe unsupervised self-driving” — the implication being that once Tesla reached that milestone, the company would flip the switch and all its customers would suddenly have access to an unsupervised driving system.
Of course, that would have been an enormously risky move by Tesla, especially when there are still so many questions about the company’s willingness to accept legal responsibility for over a million vehicles with FSD. When a Waymo vehicle is responsible for a crash, Waymo assumes liability because it owns the tech and the fleet. But Tesla’s terms of service put the liability on the owner, based mostly on its characterization of FSD as a Level 2 supervised system. What happens when FSD goes unsupervised? Who assumes responsibility for a crash then?
It’s not clear that Tesla has figured that out yet. Over the years, there have been hundreds of crashes involving Tesla’s partially autonomous features and dozens of fatalities. But the company has been able to avoid liability, either by settling with victims or convincing courts to dismiss the lawsuits. On its website, Tesla maintains that FSD (Supervised) “requires active driver supervision and does not make the vehicle autonomous.”