Tesla Faces a 90-Day Ultimatum in California — "Full Self-Driving" Ruled Misleading
4:15 PM on Wednesday, December 17
By Philip Uwaoma | Guessing Headlights
A California administrative law judge has concluded that Tesla Inc. engaged in deceptive marketing practices by misrepresenting the capabilities of its Autopilot and Full Self‑Driving (FSD) driver‑assist systems. The judge’s conclusions may soon have major consequences for the electric‑vehicle maker’s operations in the state.
The decision stems from a complaint filed in 2022 by the California Department of Motor Vehicles (DMV), which argued that Tesla’s promotional language — particularly the use of the terms “Autopilot” and “Full Self‑Driving Capability” — created an inflated impression of autonomy that was not matched by the technology’s actual performance. According to the DMV and the judge’s findings, this misrepresentation could lead consumers to over‑rely on the systems in ways that compromise safety because the vehicles still require continuous human supervision. The judge’s proposed order notes:
“A reasonable consumer likely would believe that a vehicle with Full Self-Driving Capability can travel safely without a human driver’s constant, undivided attention. This belief is wrong — both as a technological matter and as a legal matter — which makes the name Full Self-Driving Capability misleading [and] a violation of both civil and vehicle codes in California.”
The Case Begins
Administrative Judge Juliet Cox’s recommendation — now adopted by the California DMV — declares that Tesla’s marketing language was misleading under state law because it implied levels of autonomous capability that the cars do not possess. In practical terms, regulators believe many reasonable consumers could mistakenly infer that a Tesla equipped with Autopilot or FSD could safely operate without human oversight, a conclusion the judge said has “dangerous” implications.
Regulators initially proposed 30‑day suspensions of Tesla’s dealer and manufacturing licenses in California. Such a drastic penalty would halt production and sales in the company’s largest U.S. market. However, both those enforcement actions have been stayed (paused) to allow Tesla time to comply:
- A 90‑day window (reported as 60 days in some versions of the DMV announcement) has been provided for Tesla to amend or eliminate the deceptive language in its marketing.
- If Tesla fails to make the changes within that period, the DMV could enforce a 30‑day suspension of vehicle sales in the state.
- The manufacturing suspension has so far been indefinitely stayed, meaning it won’t take effect unless regulators later decide to enforce it.
At the heart of the dispute is a tension that has dogged Tesla for years: the gap between what its nameplates suggest and what its technology actually does. Although Autopilot and FSD provide advanced driver assistance at what is known technically as SAE Level 2, meaning the driver must remain alert and ready to intervene at all times, the labels and related promotional language often convey something closer to full automation, a level of capability several steps beyond current reality.
The DMV complaint cited past marketing that included assertions such as the system being “designed to be able to conduct short and long‑distance trips with no action required by the person in the driver’s seat.” Regulators said this phrasing could reasonably be interpreted as suggesting true autonomous driving. Tesla has responded that it has always required driver supervision and that disclaimers in manuals and on‑screen alerts made this clear to customers.
This ruling doesn’t emerge in a vacuum. Tesla has faced numerous legal challenges, federal investigations, and civil lawsuits tied to Autopilot and FSD marketing, safety claims, and crashes in which the systems were engaged. Past federal class actions have alleged that Tesla’s marketing exaggerates the systems’ capabilities, and in one 2025 federal trial a jury found Tesla partly responsible for a fatal crash involving Autopilot.
Safety advocates and regulators have long warned that naming conventions like Autopilot can create a false sense of security and encourage drivers to take their attention off the road. Independent studies have found that aspirational branding significantly increases driver over‑reliance compared with more plainly labeled systems from other automakers, a key reason regulators pressed the case.
Tesla’s Reaction
“This was a ‘consumer protection’ order about the use of the term ‘Autopilot’ in a case where not one single customer came forward to say there’s a problem. Sales in California will continue uninterrupted.” — Tesla North America (@tesla_na) December 17, 2025
Tesla has publicly characterized the DMV order as a consumer‑protection decision focused on terminology and has insisted that sales in California will continue uninterrupted while it complies or pursues an appeal. The company argues that no individual consumer has formally complained that the language caused harm.
Regulators, however, maintain that preventative action is needed to protect road users before incidents occur, not merely to respond after the fact. Elon Musk’s company now faces a narrow period in which to either rebrand its Autopilot/FSD terminology, adjust its marketing, or potentially contest the decision in court.
Even the prospect of a temporary Tesla sales ban in California underscores mounting friction between innovative technology firms and regulators prioritizing clarity and safety. The case could influence how companies across the autonomous and assisted‑driving industry market their technologies, shaping consumer expectations and legal risk well beyond this specific ruling.