
Tesla’s ‘Autopilot’ system is anything but, safety experts say

The technology has been controversial since it was first introduced. Early owners frequently posted about the system on social media, with one owner going so far as to record himself on video climbing into the back seat while his vehicle operated hands-free. Tesla CEO Elon Musk himself was photographed driving with his then-wife, hands out the window of his Model S, apparently to show off its capabilities.

But Tesla backed off from what skeptics called an often breathless promotion of Autopilot’s capabilities after a May 2016 crash in Florida that took the life of 40-year-old Joshua Brown. The former Navy SEAL’s vehicle crashed into the side of a truck that had turned in front of it. The National Highway Traffic Safety Administration initially put the blame on Brown for failing to oversee the vehicle’s operation. But the National Transportation Safety Board’s separate investigation also faulted Autopilot for failing to distinguish between the white truck and a bright Florida sky.

Tesla subsequently modified both the hardware and software used to control the semi-autonomous system while also putting more emphasis on the need for a motorist to remain vigilant and be ready to regain control at a moment’s notice.


Following the fatal Model X crash in California, the automaker took pains to note that the driver had repeatedly failed to heed alerts telling him to retake control from the Autopilot system before the vehicle hit the freeway barrier.

Tesla is now dismissing the complaint by the CAS and Consumer Watchdog; a statement from the company read, “The feedback that we get from our customers shows that they have a very clear understanding of what Autopilot is, how to properly use it, and what features it consists of.”

Image: The Tesla Model S sedan involved in a traffic collision with a Fire Department mechanic truck stopped at a red light in South Jordan, Utah, on May 11, 2018. (South Jordan Police Department via Reuters)

An e-mail from a Tesla spokesperson notes that the company’s website “makes clear that although every vehicle has the hardware necessary for Full Self-Driving, actual ‘self-driving functionality is dependent on extensive software validation and regulatory approval.’”

That said, CEO Musk has seldom hidden his enthusiasm for Autopilot, suggesting during a recent earnings conference call that full self-driving functionality could be ready for activation by next year.

That could, of course, be hampered by the results of the investigation into the March fatal crash, as well as the Utah collision and several other Tesla incidents that might be linked to Autopilot.

Meanwhile, there appears to be growing public concern about self-driving technology in general. AAA this week released a new survey of more than 1,000 randomly selected U.S. motorists, which found that 73 percent would be “afraid” to ride in a fully self-driving vehicle. And 63 percent of those surveyed said they would feel less safe walking or riding a bike knowing they must share the road with self-driving vehicles.

Wednesday’s complaint to the FTC marks just the latest salvo in a battle over the Autopilot name. In 2016, German regulators also questioned whether the system should be renamed to avoid consumer confusion. Tesla challenged that finding, however, and continues to call the system Autopilot there.
