Tesla's "Pure Vision" self-driving system, which removes radar and relies solely on cameras, has long been considered an outlier in the industry. Former Waymo CEO John Krafcik fiercely criticized Tesla's design as akin to "severe myopia," while OpenAI CEO Sam Altman, in an earlier online feud with Elon Musk, stated outright that Tesla's self-driving system, which has been linked to multiple deaths, is far more dangerous than OpenAI's ChatGPT.
Former Waymo CEO: Tesla's eyesight wouldn't even pass a DMV test.
Compared to the "sensor fusion" strategy adopted by most car manufacturers, which involves stacking sensing elements such as LiDAR, millimeter-wave radar and ultrasonic sensors, Tesla has chosen a more extreme approach: seeing the world only with a camera.
In response, former Waymo CEO John Krafcik offered a scathing technical commentary, arguing that Tesla's pure vision-based solution has serious hardware flaws. Even though Tesla claims to have improved camera resolution, Krafcik's analysis holds that Tesla's vehicles carry only seven 5-megapixel cameras with a limited range of focal lengths (mostly wide-angle), resulting in an equivalent visual sharpness of only 20/60 or 20/80.
What does this mean? Normal vision is 20/20, so Tesla's onboard "eyes" are equivalent to severe myopia in humans. Krafcik bluntly stated, "This visual ability wouldn't even pass a standard DMV vision test, let alone meet the safety requirements for autonomous driving." He believes that in bad weather or complex lighting, a Tesla without radar assistance is effectively driving blindfolded; he likened the design to "handcuffing AI," artificially limiting the system's ability to perceive the world.
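Krafcik's acuity figure can be sanity-checked with simple angular-resolution arithmetic: 20/20 human vision resolves roughly one arcminute, so a camera's Snellen-style equivalent follows from how many arcminutes each pixel spans. The sketch below uses assumed values (a 5 MP sensor roughly 2592 pixels wide and a 120° wide-angle lens; neither figure is a published Tesla spec), not the article's own data.

```python
# Back-of-the-envelope check of the "20/60 vision" claim.
# 20/20 human acuity resolves ~1 arcminute per feature, so a camera
# whose pixels each cover N arcminutes rates roughly 20/(20*N).

def snellen_equivalent(h_pixels: int, h_fov_deg: float) -> float:
    """Return the denominator x of a 20/x Snellen rating for a camera."""
    arcmin_per_pixel = (h_fov_deg * 60) / h_pixels  # angular span of one pixel
    return 20 * arcmin_per_pixel  # 1 arcmin/pixel would be 20/20

# Assumed, illustrative specs: 5 MP sensor ~2592 px wide, 120° lens.
rating = snellen_equivalent(h_pixels=2592, h_fov_deg=120)
print(f"Equivalent acuity: 20/{rating:.0f}")
```

Under these assumptions the result lands near 20/56, in the same range as Krafcik's 20/60 estimate; a narrower (longer-focal-length) lens over the same sensor would score proportionally better.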
Sam Altman adds: FSD has caused more than 50 deaths.
Beyond technical concerns, safety issues have also become a target of attack. When Elon Musk criticized ChatGPT on the X platform for inducing users to commit suicide, OpenAI CEO Sam Altman immediately retaliated, pointing the finger at Tesla's Full Self-Driving (FSD) function, which he claimed had caused over 50 deaths. He sarcastically remarked, "I've only ridden in a car equipped with this feature (FSD) once, and my first reaction was that it's far from safe." These remarks undoubtedly deepened concerns about the reliability of purely vision-based solutions in extreme situations.
Don't let your loved ones use ChatGPT https://t.co/730gz9XTJ2
- Elon Musk (@elonmusk) January 20, 2026
Sometimes you complain about ChatGPT being too restrictive, and then in cases like this you claim it's too relaxed. Almost a billion people use it and some of them may be in very fragile mental states. We will continue to do our best to get this right and we feel huge… https://t.co/U6r03nsHzg
- Sam Altman (@sama) January 20, 2026
Why does Tesla insist on removing the radar?
Faced with overwhelming skepticism, Elon Musk insisted on "first principles": humans can drive with just their eyes, so AI should be able to too. He believes that conflicting data from multiple sensors can cause system indecisiveness, increasing risk. Tesla is betting on its possession of the world's largest pool of driving data, attempting to compensate for the shortcomings of hardware perception through powerful neural network algorithms.
However, market observers believe that Tesla's insistence on a full vision solution may be aimed at significantly reducing the added costs of sensing components such as LiDAR, thus keeping Tesla vehicle prices within an acceptable range in the market.
Analysis of viewpoints
Tesla's vision-only approach is a high-stakes gamble. From a software-defined vehicle perspective, reducing hardware, lowering costs, and relying on powerful AI algorithms is indeed the commercially optimal solution. However, the "physical limits" problem raised by John Krafcik cannot be ignored.
While competitors like Waymo and Chinese automakers are adopting higher-resolution LiDAR and high-pixel lenses for redundancy, Tesla is attempting to use hardware that's essentially "nearsighted" combined with a "super brain" to handle all road conditions. By 2026, advancements in AI models might compensate for some perception deficiencies, but in scenarios where "visual blindness" occurs, such as heavy rain, dense fog, or direct sunlight, the physical properties of radar waves will still hold irreplaceable advantages.
If Tesla cannot overcome the physical limitations of its 5-megapixel cameras in long-distance recognition, then "pure vision" may forever remain at the level of L2+ assisted driving, unable to cross the chasm to truly unsupervised L4/L5 autonomy. After all, no matter how intelligent AI is, it cannot see things that are "physically" invisible.