Friday, May 15, 2026

Autonomy & EV News

Independent coverage of autonomy and EVs across the US, Europe, and Asia

Cover story · Autonomy

FSD v14 is the quiet rewrite Tesla wasn't loud about

The interventions are down. The naming is the same. Underneath, almost nothing is.


Yair Knijn

Founder & editor-in-chief

May 14, 2026 · 4 min read
  • Tesla
  • FSD
  • neural networks
  • AI4

The headline number Tesla cares about is miles per critical intervention. The headline number drivers care about is whether their car still phantom-brakes on an underpass shadow. With FSD v14, both are moving in the same direction for the first time in two years, and the reason has very little to do with what Tesla advertised on stage.
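For readers who want the headline metric made concrete, here is a minimal sketch of how a fleet-level miles-per-critical-intervention figure is computed. The data and field names are purely illustrative assumptions, not Tesla's telemetry schema:

```python
# Hypothetical drive-log records: (miles_driven, critical_interventions).
# Values and structure are illustrative only.
drive_logs = [
    (120.5, 1),   # one critical intervention over 120.5 miles
    (340.0, 0),
    (89.2, 2),
    (510.3, 1),
]

def miles_per_critical_intervention(logs):
    """Fleet-level metric: total miles divided by total critical interventions."""
    total_miles = sum(miles for miles, _ in logs)
    total_interventions = sum(ci for _, ci in logs)
    if total_interventions == 0:
        return float("inf")  # no interventions observed in this sample
    return total_miles / total_interventions

print(round(miles_per_critical_intervention(drive_logs), 1))  # 265.0
```

Note that this is a fleet aggregate: a single improvement in a rare failure mode can move it substantially, which is why the metric and a driver's day-to-day experience can diverge.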

FSD v13 was the version Tesla talked about. It was the end-to-end neural network demo, the on-stage robotaxi reveal, the moment “Full Self-Driving” finally meant a single model from camera pixels to steering commands. v14 is the version that ships the homework. The control stack has been retrained against the long tail of last year’s intervention data. The planner has been swapped for a transformer that runs at higher temporal resolution. The inference pipeline now fits on Hardware 4 with headroom to spare, which means Hardware 5 cars are getting capabilities the older fleet structurally cannot.

That last point is the one Tesla has been reluctant to spell out, and it is the one that matters most.

The HW3 cliff is no longer theoretical

When the first end-to-end model shipped, Tesla framed Hardware 3 cars as getting “a version” of FSD that would degrade gracefully. The framing worked for about eighteen months. v14’s release notes quietly separate the Hardware 4 and Hardware 5 feature sets for the first time. Lane-change aggressiveness, urban-parking behavior, and the new construction-zone routing are HW4+ only. HW3 owners are left with what is effectively v13 with bug fixes.

This is not a betrayal. HW3’s compute envelope has been a known constraint since 2021. What is new is that Tesla has stopped pretending the gap will be closed. The implicit promise of the original FSD purchase, that the car you bought in 2019 would eventually drive itself, has given way to an explicit two-tier product line.

What v14 actually does well

Drive it for a week and three behaviors stand out.

First, the car has learned to wait. v13 was a confident driver in the worst sense: it would commit to lane changes and unprotected lefts on what felt like a coin flip. v14 hesitates the way a competent human driver hesitates: long enough to confirm a gap, short enough not to annoy the car behind it. This is the kind of behavior that doesn’t show up in disengagement metrics but does show up in passenger comfort scores, and Tesla is finally optimizing for both.

Second, construction zones. The bane of every L2+ system for a decade, and the place where v13 still required a hand on the wheel that meant it. v14 reads cones, temporary striping, and human flaggers with a confidence that suggests a meaningful amount of training data was specifically about construction. It is not solved. It is, for the first time, not embarrassing.

Third, the steering wheel nag. v14 finally trusts cabin camera attention monitoring enough to let competent drivers go long stretches hands-free on the right roads. This is a regulatory call as much as a technical one, and the fact that Tesla shipped it means the legal team is now comfortable with the evidence base.

What it still does poorly

Highway merges into stopped traffic. School zones. Anything involving a cyclist behaving unpredictably. The model has clearly seen a lot of San Francisco and a lot of Texas, and it shows the moment you take it somewhere else.

And the supervision framing has not changed. v14 is still SAE Level 2. The driver is still legally responsible for every action the car takes. Tesla will keep saying this until the day it doesn’t, and that day is not v14.

The robotaxi question

Tesla has staked the company on the proposition that the same neural network that drives a customer’s Model Y will, with sufficient training data, drive a robotaxi without a steering wheel. v14 is the first version where that proposition stops looking absurd. It is also the first version where the gap between “drives well enough that you stop intervening” and “drives well enough that there is no driver” becomes legible as a number rather than a philosophical position.

That number is still very large. But for the first time, it is a number.

