Tesla's Self-Driving Cars Deemed 'Legally Blind' in California

Founder of The Dawn Project says Tesla's Full Self-Driving system is not safe enough to be on the roads

Apr. 20, 2026 at 10:34 a.m.

[Image: illustration of a Tesla autonomous vehicle's sensor array] Concerns over the safety and capabilities of Tesla's self-driving technology raise questions about the pace of autonomous vehicle development. (Austin Today)

Dan O'Dowd, the founder of The Dawn Project, has raised concerns about the safety of Tesla's Full Self-Driving (FSD) system, stating that the company's HW3-equipped cars are 'legally blind' by the California DMV's vision standards. O'Dowd claims the system struggles to understand basic road signs and traffic laws, and says it has been involved in more crashes and fatalities, per mile driven, than competing autonomous driving companies such as Waymo.

Why it matters

Tesla's FSD system is a key selling point for the company, but O'Dowd's claims raise serious questions about its safety and readiness for public roads. They could erode consumer trust in Tesla's autonomous driving capabilities and invite increased scrutiny from regulators such as the National Highway Traffic Safety Administration (NHTSA).

The details

According to O'Dowd, Tesla's HW3 system has only 20/60 vision, meaning it can see clearly at 20 feet what a person with normal eyesight can see at 60 feet. That falls short of the California DMV's 20/40 requirement to legally drive. O'Dowd also said the system failed a standard driving test, struggling to recognize road signs and follow traffic laws. He cited 59 deaths and more than 3,000 crashes involving Tesla's FSD system, a toll he says exceeds Waymo's over the same number of miles driven.

  • In April 2026, Dan O'Dowd shared his concerns about Tesla's Full Self-Driving system.

The players

Dan O'Dowd

The founder of The Dawn Project, an organization that has raised concerns about the safety of Tesla's autonomous driving technology.

Elon Musk

The CEO of Tesla, who claimed the company would achieve unsupervised autonomy and deploy robotaxis across the United States by the end of 2025.

Waymo

An autonomous driving company that O'Dowd says has proven self-driving cars can work safely, with one critical disengagement every 17,000 miles.


What they’re saying

“The law says you have to have 20/40 vision to drive a car. So it's legally blind by the California Department of Motor Vehicles definition.”

— Dan O'Dowd, Founder, The Dawn Project

“It doesn't know what a road closed sign means. It doesn't understand. It can't read. It literally cannot read.”

— Dan O'Dowd, Founder, The Dawn Project

“There have been 59 deaths that we know about in the U.S., over 3,000 crashes. They are way more than Waymo for the same number of miles.”

— Dan O'Dowd, Founder, The Dawn Project

What’s next

The NHTSA is investigating Tesla's self-driving technology, and O'Dowd argues the agency should require Tesla to disable the software until the system can pass rigorous tests covering long distances and unusual driving conditions.

The takeaway

The dispute highlights ongoing safety concerns around Tesla's autonomous driving technology, which O'Dowd argues falls short of industry standards and California's legal vision requirements for drivers. It raises questions about the company's claims and underscores the need for stricter regulation and testing to ensure self-driving cars are truly safe for public roads.