Economy
Lawsuits test Tesla claim that drivers are solely responsible for crashes
Multiple civil cases -- and a federal investigation -- contend that Tesla's technology invites drivers to overly trust the automation.
By Trisha Thadani
April 28, 2024 at 8:00 a.m. EDT
{Snip video}
Dash-cam footage from June 2023 shows a Tesla traveling south on the northbound side of a highway in Tennessee. (Video: Obtained by The Washington Post)
SAN FRANCISCO -- As CEO Elon Musk stakes the future of Tesla on autonomous driving, lawyers from California to Florida are picking apart the company's most common driver assistance technology in painstaking detail, arguing that Autopilot is not safe for widespread use by the public. ... At least eight lawsuits headed to trial in the coming year -- including two that haven't been previously reported -- involve fatal or otherwise serious crashes that occurred while the driver was allegedly relying on Autopilot. The complaints argue that Tesla exaggerated the capabilities of the feature, which controls steering, speed and other actions typically left to the driver. As a result, the lawsuits claim, the company created a false sense of complacency that led the drivers to tragedy.
Evidence emerging in the cases -- including dash-cam video obtained by The Washington Post -- offers sometimes-shocking details: In Phoenix, a woman allegedly relying on Autopilot plows into a disabled car and is then struck and killed by another vehicle after exiting her Tesla. In Tennessee, an intoxicated man allegedly using Autopilot drives down the wrong side of the road for several minutes before barreling into an oncoming car, killing the 20-year-old inside.
Tesla maintains that it is not liable for the crashes because the driver is ultimately in control of the vehicle. But that contention is coming under increasing pressure, including from federal regulators. Late Thursday, the National Highway Traffic Safety Administration launched a new review of Autopilot, signaling concern that a December recall failed to significantly improve misuse of the technology and that drivers are misled into thinking the automation has greater capabilities than it does.
Meanwhile, in a surprising twist, Tesla this month settled a high-profile case in Northern California that claimed Autopilot played a role in the fatal crash of an Apple engineer, Walter Huang. The company's decision to settle with Huang's family -- along with a ruling from a Florida judge concluding that Tesla had knowledge that its technology was flawed under certain conditions -- is giving fresh momentum to cases once seen as long shots, legal experts said. ... "A reckoning is coming as more and more of these cases are going to see the light of a jury trial," said Brett Schreiber, a lawyer with Singleton Schreiber who is representing the family of Jovani Maldonado, 15, who was killed in Northern California when a Tesla in Autopilot rear-ended his family's pickup truck in 2019. ... Tesla did not respond to multiple requests for comment on the lawsuits.
{snip, including more videos}
By Trisha Thadani
Trisha Thadani joined The Washington Post in 2023 from the San Francisco Chronicle. She covers the technology industry. Twitter https://twitter.com/TrishaThadani
Lawsuits test Tesla claim that drivers are solely responsible for crashes (Original Post) by mahatmakanejeeves, Apr 2024
dutch777 (3,501 posts)
1. Bad enough with a driver who is supposed to be watching, but who takes responsibility for the robo-taxi with no driver?
It will be very interesting to see where these lawsuits go. Given the thousands of permutations in driving and road conditions, how is a driver using self-driving mode going to know when the car crosses the line into territory for which it is ill suited? The roads are dangerous enough with poor human drivers, and you want to couple that with poor robo drivers? Unless the technology proves near perfect in all conditions and the maker is willing to be responsible for the outcome, you let people drive and be responsible. I am kind of surprised some states have not already just said "No self-driving mode allowed." Of course, I think the government allowing cryptocurrency to compete with fiat currency is stupid, so what do I know?