Tesla says that autopilot makes its cars safer. Accident victims say it kills
‘Computers don’t check your Instagram’
Autopilot is not an autonomous driving system. Rather, it is a suite of software, cameras, and sensors intended to assist drivers and prevent accidents by taking over many aspects of driving, including changing lanes. Tesla executives have claimed that transferring these functions to computers will make driving safer because human drivers are prone to mistakes and distractions, which cause the majority of the roughly 40,000 traffic fatalities that occur each year in the United States.
“Computers don’t check your Instagram” while driving, Tesla’s director of artificial intelligence Andrej Karpathy said last month at an online workshop on autonomous driving.
As long as Autopilot is in control, drivers can relax, but they are not supposed to tune out. Instead, they are supposed to keep their hands on the wheel and their eyes on the road, ready to take control if the system becomes confused or fails to recognize objects or dangerous traffic situations.
But with little to do but look ahead, some drivers seem unable to resist the temptation to let their attention wander while Autopilot is on. Videos have been posted on Twitter and elsewhere showing drivers reading or sleeping at the wheel of their Teslas.
The company has often criticized the drivers of its cars, blaming them in some cases for failing to keep their hands on the wheel and their eyes on the road while using Autopilot.
But the National Transportation Safety Board, which has completed investigations into accidents involving Autopilot, has said the system lacks safeguards to prevent misuse and does not effectively monitor drivers.
Similar systems offered by General Motors, Ford Motor, and other automakers use cameras to track the driver’s eyes and issue warnings when they take their eyes off the road. After a few warnings, GM’s Super Cruise system shuts down and requires the driver to take control.
Autopilot does not track drivers' eyes; it checks only whether their hands are on the wheel. The system can sometimes keep operating even when drivers touch the wheel for only a few seconds at a time.
“This monitoring system is fundamentally weak because it’s easy to fool and doesn’t monitor very consistently,” said Raj Rajkumar, a Carnegie Mellon University professor who focuses on autonomous driving technology.
The National Highway Traffic Safety Administration has not forced Tesla to change or disable Autopilot, but in June it said it would require all automakers to report accidents involving such systems.
Several lawsuits have been filed against Tesla this year alone, including one in April in Florida state court concerning a 2019 accident in Key Largo, in which a Tesla Model S with Autopilot engaged failed to stop at a T-intersection and struck a Chevrolet Tahoe parked on the shoulder, killing 22-year-old Naibel Leon. Another lawsuit was filed in California in May by Darel Kyle, 55, who suffered serious spinal injuries when a Tesla under Autopilot control rear-ended the truck he was driving.
The accident that killed Jovani Maldonado is a rare case in which video and data from the Tesla are available. The Maldonados' attorney, Benjamin Swanson, obtained them from Tesla and shared them with The New York Times.
Benjamín Maldonado and his wife, Adriana García, filed their lawsuit in Alameda County Superior Court. Their complaint states that Autopilot contains defects and did not react to traffic conditions. The lawsuit also names as defendants the Tesla driver, Romeo Lagman Yalung of Newark, California, and his wife, Vilma, who owns the car and was in the front passenger seat.
Yalung and his attorney did not respond to requests for comment. He and his wife, who were not reported injured in the accident, have yet to address the Maldonado family’s complaint in court.
In court filings, Tesla has yet to respond to the allegation that Autopilot malfunctioned or is defective. In emails to Swanson's firm that were entered as evidence, a Tesla attorney, Ryan McCarthy, said the driver, not Tesla, was responsible.
“Police blamed the Tesla driver, not the car, for his inattention and for driving at a dangerous speed,” McCarthy wrote. He did not respond to emails seeking comment.
The data and video offer a detailed view of how Autopilot was operating in the seconds before the accident. Tesla vehicles constantly record short clips from their forward-facing cameras. If a crash occurs, the video is automatically saved and uploaded to Tesla's servers, a company official said in emails included in the evidence presented by Swanson.
The video saved by the car Yalung was driving shows it passing vehicles on the right and left. Four seconds before impact, Maldonado turned on his turn signal. It flashed four times while his Explorer was still in its original lane, and a fifth time as his truck straddled the lane marking. In court documents, Maldonado said he had noticed the Tesla approaching quickly in his rearview mirror and tried to swerve back.
For most of the video, the Tesla maintained a speed of 69 mph (111 km/h), but just before impact it briefly accelerated to 70 mph and then slowed in the final second, according to data from the car.
Maldonado’s truck rolled over and crashed into a barrier, according to the police report. Its windshield was broken, its roof was crumpled, and its rear axle had come loose. The Tesla's roof was also crumpled, its front end was smashed, its bumper was partially detached, and its windshield was cracked.
Jovani Maldonado was found face down on the shoulder of Interstate 880, lying in a pool of blood.