Tesla's Autopilot Found Most Likely to Confuse Drivers About Safety

Car assistance tech like Autopilot is misleading, says study

The current iteration of Autopilot reaches only Level 2 on the driving automation scale and still requires drivers to keep their hands on the wheel. "However, unless drivers have a certain amount of knowledge and comprehension, these new features also have the potential to create new risks," the researchers warn. The IIHS ran two studies: one tested how well people understand the messages communicated by autonomous system displays, while the other measured the impact of the names brands use on driver comprehension.

While nearly every participant understood when adaptive cruise control had adjusted the vehicle's speed or detected another vehicle ahead, many struggled to grasp what was happening when the system failed to detect a vehicle ahead because it was beyond its detection range.

Cadillac's Super Cruise, for instance, promises "hands-free driver assistance". For the display study, IIHS examined the instrument cluster of a 2017 Mercedes-Benz E-Class equipped with the Drive Pilot system.

Just as Euro NCAP did a while back, the IIHS singled out Tesla's Autopilot as an example of a poorly named technology.

IIHS asked 2,000 drivers to answer questions about Autopilot (Tesla), Traffic Jam Assist (Audi and Acura), Super Cruise (Cadillac), Driving Assistant Plus (BMW) and ProPilot Assist (Nissan).


Main photo: the steering wheel of a Tesla Inc. vehicle. Drivers are instructed to keep their hands on the wheel while using Autopilot, and not doing so can be very risky - in fact, some fatal crashes involving Tesla vehicles in the past happened because drivers ignored that instruction. "Six percent thought it would be OK to take a nap while using Autopilot, compared with 3 percent for the other systems". Participants also misunderstood situations in which adaptive cruise control does not detect a vehicle ahead, which happens when other vehicles on the road are beyond the range of detection.

One likely reason is that the names used for these features imply more than they are actually capable of, especially Autopilot, said IIHS President David Harkey, adding that "manufacturers should consider what message the names of their systems send to people".

"Autopilot also had substantially greater proportions of people who thought it would be safe to look at scenery, read a book, talk on a cell phone or text", IIHS noted.

The names that manufacturers have chosen aren't the only problem, IIHS found when surveying motorists. A second project showed that drivers can also be seriously confused by the information these systems present on today's complex digital instrument clusters. When adaptive cruise control stops detecting the vehicle ahead, drivers need to be ready to take full control of the vehicle again and perhaps brake. "Likewise, when lane centering does not work because of a lack of lane lines, you need to steer", says Harkey.
