
Tesla Autopilot has been a hot topic of controversy lately. There have been a bunch of crashes in which drivers misuse the poorly-named Tesla “self-driving” feature. However, the recent fatal Tesla crash in Texas that claimed two lives sparked a more heated conversation around the issue of Tesla’s liability in all this. How responsible is Tesla, really? 

2018 Tesla Model 3 interior | Tesla

The Tesla Autopilot is a poorly named feature

At this point, Tesla has been more than clear about how to use, and more importantly, how not to use, the Tesla Autopilot software. Yet drivers continue to try to force the cars to drive themselves without any human oversight. This is a mistake.

According to New Atlas, Tesla is leading the market when it comes to “autonomous” driving. Part of the downside of all that glory and pioneering is being the one to mess up, too. As we’ve come to expect from Tesla, the company has made many bold promises about “autonomous” cars and has yet to really deliver.

How successful is the Tesla Autopilot software?

Admittedly, Tesla has pushed the boundaries and made some incredible strides, but the speed at which this tech is rolling out is what has many safety experts concerned. The Texas crash was a strong and tragic reminder that these new technologies come with certain risks and must be treated with respect. 

As New Atlas keenly points out, this fatal crash came on the heels of Elon Musk tweeting about Tesla’s Q1 safety report. Teslas record tons of information from their drivers, and the company has been using this data to compare Autopilot with actual human drivers.

The data suggested that human drivers were 10 times more likely to have a crash than Autopilot was. Soon after Musk tweeted those findings, the Texas crash occurred. Data isn’t always the clearest way to find the truth.

Much more information is needed to know exactly how useful that statistic is. However accurate that data may be, this crash resonated with people and reminded us of another major Tesla Autopilot crash from 2016.

How safe is Tesla’s Autopilot?

As the Senate announced it would be looking into the safety of these systems and possible corrective measures against Tesla, Consumer Reports showed how easy it is to “fool” the Autopilot system. The system is designed to only work if a human is ready to take control quickly. Tesla does this by using torque sensors in the steering wheel to detect a human grip. 

2019 Tesla Model S crash scene | KTRK-TV – ABC13

As Consumer Reports demonstrated on a closed track, this torque sensor can be fooled simply by hanging a small weight off the wheel. Without any other way to verify that a driver is present, the car will continue to drive even if you slide over into the passenger seat. This test pretty much slammed the book shut on the question of how easy it is to force a Tesla to “self-drive” without an attentive human presence.

So, was this crash Tesla’s fault? 

It’s hard to take agency away from the drivers in this case. Elon Musk has been firing back at reports that the Autopilot system is easily tricked. His Twitter feed is full of fans and Tesla lovers reciting the various safety measures Tesla takes to prevent these “driverless” crashes.

So, is Tesla, in part, responsible? That feels like more of a philosophical question at this point. There is a strong point to be made about the seductive nature of this emerging technology and our attraction to it. That, along with the arguably misleading name of “Autopilot,” may send some side-eyed looks Tesla’s way, but there is another, murkier side to all of this. 

It’s tough to hold a company responsible for the intentional misuse of a product. Since Tesla has been very clear that drivers shouldn’t misuse this software, it’s hard to blame Elon and company for someone’s poor judgment and personal choices.

That being said, Tesla should probably spend a little more time making the Autopilot system much harder to fool. For the future acceptance of this new tech and the safety and well-being of the public motorways, Tesla still has a lot of work to do to make this tech a bit more fool-proof.
