Tesla Beta Testers Say Full Self-Driving Mode Is Dangerous
The autonomous car: a science-fiction technology many drivers are eagerly awaiting. Recent headlines make it seem the future has finally arrived; Elon Musk has made Tesla's Full Self-Driving (FSD) suite of driver aids available to vehicle owners. There is just one catch: Tesla owners involved in the FSD beta test say the software is not ready.
Tesla Autopilot vs. Full Self-Driving (FSD)
Many motorists are confused by the difference between Tesla Full Self-Driving and Tesla Autopilot. Autopilot is an older system Tesla classifies as "an advanced driver assistance system." It includes an advanced lane-keep assist called "Autosteer" and adaptive cruise control capable of matching the speed of surrounding traffic or the posted speed limit. Autopilot can also park the car. However, the system only works on highways, and it requires a human driver to keep their hands on the wheel.
Tesla Full Self-Driving (FSD) mode expands Autopilot's capabilities to include city streets with traffic, intersections, and pedestrians. An important distinction: because FSD requires a human driver to be present and ready to intervene, industry and safety regulators do not categorize it as true "self-driving" software.
On October 7th, 2021, Elon Musk revealed how to get a Full Self-Driving Tesla. First, buyers must pay an extra $10,000 for the FSD option on a Tesla outfitted with the required cameras and sensors. Then they must request Full Self-Driving Beta, which prompts the car to rate their driving safety for seven days. After that, the drivers with the best safety ratings are granted access to FSD Beta.
Tesla owners must demonstrate safe driving before accessing Full Self-Driving Beta because the system is still learning to drive. Even when FSD is not active, it gathers data on how the human operator drives, and Musk does not want his software learning from unsafe drivers. But Tesla owners with early access to FSD Beta say the technology is unsafe and currently unsuited for a wide rollout.
Owners say the Full Self-Driving Tesla is unsafe
Since the winter of 2021, a select few Tesla owners have participated in the Full Self-Driving system's beta tests. Many have found that Full Self-Driving Beta is surprisingly sketchy.
In the video above, a Tesla owner named Stephen Pallotta tests Full Self-Driving in his car. During the September 23rd drive, he lets the vehicle navigate a right-hand turn on its own and is shocked when it crosses a double yellow line into oncoming traffic. Luckily, Pallotta was able to take back control of the self-driving Tesla and swerve out of the way of the oncoming traffic. He concluded that "wide release is not the move, not right now."
In the second video, a Tesla owner lets Full Self-Driving navigate a city street. During the September 15th drive, the car attempts to turn right through a crosswalk despite multiple pedestrians in its path. Again, the human driver regains control and apologizes to the pedestrians the car nearly struck.
Has Tesla software caused any crashes?
Teslas on Autopilot have already struck about a dozen emergency vehicles parked in highway breakdown lanes, including a Florida Highway Patrol car hit by a Tesla Model 3 on Autopilot. The problem is so severe that the NHTSA has opened an investigation.
Consumer Reports points out that while owners in the beta program knowingly test FSD in the real world for Tesla, nearby motorists, pedestrians, and cyclists are unknowing participants in a potentially dangerous beta test. The senior director of CR's Auto Test Center warned, "A minor failure can become catastrophic."
What does Tesla's CEO think of the complaints about Full Self-Driving mode? Shockingly, Elon Musk himself admitted, "FSD Beta 9.2 is actually not great imo."