
Tesla says Autopilot makes its cars safer. Crash victims say it kills
- by The Sydney Morning Herald
- Jul 06, 2021

‘Computers don’t check their Instagram’
Autopilot is not an autonomous driving system. Rather, it is a suite of software, cameras and sensors intended to assist drivers and prevent accidents by taking over many aspects of driving a car — even the changing of lanes. Tesla executives have claimed that handing off these functions to computers will make driving safer because human drivers are prone to mistakes and distractions, and cause most of the roughly 40,000 traffic fatalities that occur each year in the United States.
“Computers don’t check their Instagram” while driving, Tesla’s director of artificial intelligence, Andrej Karpathy, said last month in an online workshop on autonomous driving.
While Autopilot is in control, drivers can relax but are not supposed to tune out. Instead, they are supposed to keep their hands on the steering wheel and eyes on the road, ready to take over in case the system becomes confused or fails to recognise objects or dangerous traffic scenarios.
But with little to do other than look straight ahead, some drivers seem unable to resist the temptation to let their attention wander while Autopilot is on. Videos have been posted on Twitter and elsewhere showing drivers reading or sleeping while at the wheel of Teslas.
The company has often faulted drivers of its cars, blaming them in some cases for failing to keep their hands on the steering wheel and eyes on the road while using Autopilot.
But the National Transportation Safety Board, which has completed investigations into accidents involving Autopilot, has said the system lacks safeguards to prevent misuse and does not effectively monitor drivers.
Similar systems offered by General Motors, Ford Motor and other automakers use cameras to track a driver’s eyes and issue warnings when they look away from the road. After a few warnings, GM’s Super Cruise system shuts down and requires the driver to take control.
Autopilot does not track drivers’ eyes; it monitors only whether their hands are on the steering wheel. The system sometimes continues operating even if drivers have their hands on the wheel for only a few seconds at a time.
“This monitoring system is fundamentally weak because it’s easy to cheat and doesn’t monitor very consistently,” said Raj Rajkumar, a professor at Carnegie Mellon University who focuses on autonomous driving technology.
The National Highway Traffic Safety Administration has not forced Tesla to change or disable Autopilot, but in June it said it would require all automakers to report accidents involving such systems.
Several lawsuits have been filed against Tesla just this year, including one in April in Florida state court that concerns a 2019 crash in Key Largo. A Tesla Model S with Autopilot on failed to stop at a T intersection and crashed into a Chevrolet Tahoe parked on a shoulder, killing Naibel Leon, 22. Another suit was filed in California in May by Darel Kyle, 55, who suffered serious spinal injuries when a Tesla under Autopilot control rear-ended the van he was driving.
The crash that killed Jovani Maldonado is a rare case when video and data from the Tesla car have become available. The Maldonados’ lawyer, Benjamin Swanson, obtained them from Tesla and shared both with The New York Times.
Benjamin Maldonado and his wife, Adriana Garcia, filed their suit in Alameda County Superior Court. Their complaint asserts that Autopilot contains defects and failed to react to traffic conditions. The suit also names as defendants the driver of the Tesla, Romeo Lagman Yalung of Newark, California, and his wife, Vilma, who owns the car and was in the front passenger seat.
Yalung and his lawyer did not respond to requests for comment. He and his wife, who were not reported injured in the accident, have not yet addressed the Maldonado family’s complaint in court.
In court filings, Tesla has not yet responded to the allegation that Autopilot malfunctioned or is flawed. In emails to Swanson’s firm that have been filed as exhibits in court, a Tesla lawyer, Ryan McCarthy, said the driver, not Tesla, bore responsibility.
A still image from dashcam video provided by Benjamin Swanson shows a truck driven by Benjamin Maldonado on a California freeway seconds before it was struck by a Tesla Model 3 that was traveling about 60 miles per hour on Autopilot. Swanson is an attorney for the family of 15-year-old Jovani Maldonado, who was killed in the crash.