
Caribbeans

(976 posts)
Wed Aug 21, 2024, 04:46 PM

I Took a Ride in a 'Self-Driving' Tesla and Never Once Felt Safe - RollingStone



I Took a Ride in a ‘Self-Driving’ Tesla and Never Once Felt Safe

The tech in Elon Musk’s electric vehicles is supposed to prevent accidents, but in several cases, it nearly caused one

RollingStone.com | Miles Klee | AUGUST 19, 2024

When the “Full Self-Driving” setting is enabled in a Tesla, according to the automaker’s own description, the car “attempts to drive to your destination by following curves in the road, stopping at and negotiating intersections, making left and right turns, navigating roundabouts, and entering/exiting highways.”

“Attempts” would be the crucial word here, as I learned during an occasionally harrowing demonstration of FSD around surface streets and freeways in Los Angeles. While it’s true that the technology manages to impress at first, it doesn’t take long for severe and dangerous shortcomings to emerge. And, contrary to claims from exaggeration-prone Tesla CEO Elon Musk, it certainly didn’t seem safer than having an average human driver at the wheel.

One morning in early August, I hop into a 2018 Tesla Model 3 owned by Dan O’Dowd, founder of the Dawn Project. Easily the most outspoken critic of Tesla’s so-called autonomous driver-assistance features, O’Dowd — a billionaire who also co-founded Green Hills Software and made his fortune developing secure, hacker-proof systems for the U.S. military and government — established the Dawn Project to campaign “to ban unsafe software from safety critical systems” spanning healthcare, communications, power grids, and transportation. For several years, Tesla has been his primary target; O’Dowd has orchestrated one safety test after another, mounted a single-issue campaign for Senate, and run expensive Super Bowl commercials to spread his warnings against the company’s FSD software.

My driver for the day’s ride-along — that is, the person who will babysit the self-driving Tesla to make sure it doesn’t kill us or anyone else — is Arthur Maltin of Maltin PR, a London-based public relations firm that represents the Dawn Project and helps to amplify their consumer safety message. As soon as I see Maltin’s bandaged right hand, I ask nervously if it’s from an earlier collision, but he laughs and assures me it was an injury sustained from a fall off his bike. We set out east on Sunset Boulevard with FSD engaged, Maltin with his hands poised right over the wheel to take manual control if necessary...more
https://www.rollingstone.com/culture/culture-features/self-driving-tesla-drive-1235079210/



Tesla has been selling the "Full Self Driving" lie for years. If any other company did this, there would be outrage followed by action from the NTSB and the NHTSA. But somehow this buffoon Musk is allowed to repeatedly lie and endanger the public. One day we will know why.

Mercedes has Level 3 autonomy on limited roads now, beating the company that has defrauded thousands of people.

6 replies
I Took a Ride in a 'Self-Driving' Tesla and Never Once Felt Safe - RollingStone (Original Post) Caribbeans Aug 2024 OP
Sounds like a nightmare Blue Owl Aug 2024 #1
I do not trust Musk or Tesla LetMyPeopleVote Aug 2024 #2
I thought one had to have their hands on the steering wheel at all times? AllaN01Bear Aug 2024 #3
Can't be done. Not with existing technology. paleotn Aug 2024 #4
Hated it. Too slow, too unpredictable, and liable to run into things. CoopersDad Aug 2024 #5
The full article is definitely worth the read - it makes mistake after mistake after mistake ... but I only tried one progree Aug 2024 #6

paleotn

(19,187 posts)
4. Can't be done. Not with existing technology.
Wed Aug 21, 2024, 05:23 PM

Probably never with silicon chips. The task is too complex, and you can only cram so many circuits on a silicon chip. Even the vastly more complex and powerful biological computer sitting on our shoulders has trouble at times.

CoopersDad

(2,876 posts)
5. Hated it. Too slow, too unpredictable, and liable to run into things.
Wed Aug 21, 2024, 08:08 PM

Worst performance was around curbs and obstacles at parking speeds.
And it had no sense of potholes, or any ability to steer through smoother stretches of pavement where needed.

I prefer just doing it myself, but I appreciate that if I'm inattentive and the car sees a need to brake, it will brake. That capability doesn't require FSD, though.

progree

(11,463 posts)
6. The full article is definitely worth the read - it makes mistake after mistake after mistake ... but I only tried one
Wed Aug 21, 2024, 08:12 PM

video (embedded in the article itself) and it kept jamming and I had to give up. Though I saw the early part where it entered a clearly posted "Do Not Enter" road and then went past the yellow school bus with the extended stop arm and flashing red lights, and hit the child running across in front of the bus.

I didn't know it at first, but they then explained it had all been set up in advance, and the "child" was a mannequin being pulled across the street.
