Drivers who misuse and exploit the Tesla Autopilot system


  • A study found that Tesla drivers are distracted while using Autopilot
  • Tesla's Autopilot is a driver assistance system, not a hands-off system
  • The study notes more stringent safeguards are needed to prevent misuse

Driver assistance systems like Tesla Autopilot are designed to reduce the frequency of crashes, but drivers are more likely to become distracted as they get used to them, according to a new study published Tuesday by the Insurance Institute for Highway Safety (IIHS).

Autopilot and Volvo's Pilot Assist system were examined in two separate studies by the IIHS and the Massachusetts Institute of Technology's AgeLab. Both studies found that drivers tended to engage in distracting behavior while still meeting the bare-minimum attention requirements of these systems, which the IIHS refers to as "partially automated systems."

In the first study, researchers analyzed how the driving behavior of 29 volunteers who were provided with a Pilot Assist-equipped 2017 Volvo S90 changed over four weeks. The researchers focused on how likely volunteers were to engage in non-driving behavior when using Pilot Assist on highways relative to unassisted highway driving.

Pilot Assist, 2017 Volvo S90

Drivers using Pilot Assist were more likely to "check their phones, eat a sandwich, or do other manual activities" than when driving unaided, the study found. That tendency increased over time as drivers became accustomed to the system, although both studies found that some drivers drove distracted early on.

The second study looked at the driving behavior of 14 volunteers who drove a 2020 Tesla Model 3 equipped with Autopilot over the course of a month. In this study, researchers selected people who had never used Autopilot or a similar system, and focused on how often drivers triggered the system's attention alerts.

The researchers found that Autopilot novices "quickly understood the timing of the feature's attention reminders" and learned to respond just in time to prevent the warnings from escalating into more serious interventions, such as emergency slowdowns or being locked out of the system.

2024 Tesla Model 3


"In both studies, drivers changed their behavior to engage in distracting activities," IIHS President David Harkey said in a statement. "This shows why partial automation systems need more robust safeguards to prevent misuse."

The IIHS said earlier this year, citing a separate data set, that assisted driving systems do not improve safety on their own, and recommended in-vehicle safeguards to keep them from degrading it. In March 2024, it completed testing of 14 driver-assistance systems across nine models and rated the majority poorly. Autopilot in particular has been found to mislead drivers into thinking it is far more capable than it really is.

Autopilot's flaws have also drawn the attention of US safety regulators. In a 2023 recall, Tesla limited the behavior of its Full Self-Driving Beta software, which regulators said posed an "unreasonable risk to motor vehicle safety." Tesla continues to use the misleading Full Self-Driving label despite the system having no such capability.


