Some Tesla Drivers Are Abusing Autopilot
Driver-assistance software is massively convenient. It’s an exciting new technology that will certainly change the face of driving as we know it. Convenience is king in marketing, and assisted-driving technology like Autopilot, from the EV maker Tesla, capitalizes on that.
Disappointingly, some Tesla owners have chosen to abuse this feature, causing accidents and sometimes even fatalities. This unfortunate trend is further sullying the reputation of Tesla, a company that consistently advertises the safety of its Autopilot system.
No loss is an acceptable loss
Like any new technology, Tesla’s Autopilot needs testing. Much of that testing, however, appears to be happening in the presence of the general, and likely unwilling, public. Simply put, the system is too easy to trick into thinking someone is at the wheel when no one is.
According to this report from KHOU 11, two men were killed in a Tesla crash while no one was driving. The story made national headlines and drew criticism from many that the Palo Alto-based company’s Autopilot can be tricked too easily into thinking someone is at the wheel. It’s a fair criticism, too, as this is not the first time someone using Autopilot has crashed a Tesla without touching the wheel.
Why would someone use Autopilot like that?
In fact, several similar incidents have cropped up over the years. So why does this keep happening? Perhaps the answer doesn’t lie entirely with Tesla, though the company certainly shares some of the blame.
There are aftermarket products designed specifically to defeat the systems that detect whether a driver is present while Autopilot is in use. Some things in life will inevitably be ruined by other people, and Autopilot is one of them, much like the drunk guy at the bar in your local sports team’s jersey ruining the game for everyone else.
What can be done to stop this?
The easiest fix to this problem is on Tesla’s end: without stronger countermeasures in place, these crashes will continue to happen. Tesla already builds an interior camera that faces the occupants into its cars. One solution is to require that camera to confirm someone is in the driver’s seat before Autopilot will engage. Another would be to periodically require the driver to press a button acknowledging that they are present and attentive. A rough sketch of how those two checks could work together follows below.
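For the curious, here is a minimal sketch of how those two countermeasures might combine into a single watchdog loop. Everything in it is hypothetical: `camera_sees_driver()` and `ack_button_pressed()` are invented stand-ins for real sensor inputs, the thresholds are made up for illustration, and none of it reflects Tesla’s actual software.

```python
import time

# Illustrative only: these names and thresholds are hypothetical,
# not Tesla's actual software. The two sensor stubs stand in for
# real hardware inputs.

ACK_INTERVAL_S = 30.0  # how often the driver must confirm attentiveness
GRACE_PERIOD_S = 5.0   # warning window before assistance disengages


def camera_sees_driver() -> bool:
    """Stub for an interior-camera check that the driver's seat is occupied."""
    return True


def ack_button_pressed() -> bool:
    """Stub for a physical 'I am paying attention' button press."""
    return False


def autopilot_watchdog() -> None:
    """Keep assistance engaged only while both presence checks keep passing."""
    last_ack = time.monotonic()
    while True:
        if not camera_sees_driver():
            print("No driver detected: disengaging and slowing the car.")
            return
        if ack_button_pressed():
            last_ack = time.monotonic()
        elif time.monotonic() - last_ack > ACK_INTERVAL_S:
            print("No acknowledgment received: warning driver, then disengaging.")
            time.sleep(GRACE_PERIOD_S)
            return
        time.sleep(1.0)  # poll the sensors once per second


if __name__ == "__main__":
    autopilot_watchdog()
```

The design point is that the camera check is continuous and passive, while the button check forces a periodic, deliberate action, something a weighted clip on the steering wheel cannot fake as easily.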
Other fixes depend more on the human brain. People are good at staying attentive when actively engaged in a task, but that attention fades quickly when there is nothing to do. Self-driving software removes the active-monitoring component of driving, letting minds wander. Of course, this only applies if someone is actually in the driver’s seat.