A 56-year-old Snohomish man had set his Tesla Model S on Autopilot and was looking at his cellphone on Friday when he struck and killed a motorcyclist in front of him in Monroe, court records show.
A Washington State Patrol trooper arrested the Tesla driver at the crash site on Highway 522 at Fales Road shortly before 4 p.m. on suspicion of vehicular manslaughter, according to a probable cause affidavit.
The motorcyclist, Jeffrey Nissen, 28, of Stanwood, died at the scene, records show.
The Tesla driver told a state trooper he was driving home from having lunch in Bothell and was looking at his phone when he heard a bang and felt his car lurch forward, accelerate and hit the motorcyclist, according to the affidavit.
The man told the trooper his Tesla got stuck on top of the motorcyclist and couldn’t be moved in time to save him, the affidavit states.
The trooper cited the driver’s “inattention to driving, while on autopilot mode, and the distraction of the cell phone while moving forward,” and trusting “the machine to drive for him” as probable cause for a charge of vehicular manslaughter, according to the affidavit.
The man was booked into the Snohomish County Jail and was released Sunday after posting bond on his $100,000 bail, jail records show.
This guy should get everything coming to him. FSD/Autopilot is not good enough to take your hands off the wheel and not pay attention to what the fuck’s going on around you.
That said, Tesla absolutely should get a massive wrongful death lawsuit and get fucked by the courts.
Everyone from the driver to whoever certified the car as roadworthy to Elon Musk should be held responsible. In reality, I’d be surprised if anyone except the driver ever sees the inside of a courtroom.
Yup. Everyone else has very expensive lawyers.
Reality is, Teslas are a shit product with false advertising.
I’m honestly not sure which is worse, that Tesla made a system they call Autopilot that isn’t an autopilot or that Tesla owners still think it is.
Tesla, for blatantly false advertising.
Yeah, but it was made clear years ago, even by Tesla themselves, that “autopilot” doesn’t actually mean “autopilot.” So maybe it’s 50/50?
They still sell it as “Autopilot and Full Self Driving”. Sure, they claim “Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment”, but that’s buried in the marketing text that no one reads. It really should be called “Enhanced Lane Assist with Auto-Follow Cruise Control”, but that doesn’t sell as many cars.
Edit: Warning…Lemmy or Kbin, I’m not sure which, is currently buggy and showing my link as a video connected to RedGifs. Clicking the link actually brings you to the Tesla website. Don’t open the video…
No matter how they’re marketed and used, self-driving systems will make people less engaged (that’s the entire point: people don’t use it to avoid arm fatigue, they use it because it’s mentally relaxing!) and therefore more distracted.
“The driver should keep their full attention on the road and be prepared to take over at any point” is an impossible standard and a lame-ass loophole that shouldn’t even be allowed to be cited in a court of law. Fully engaged drivers do not ask an “autopilot” to steer for them.
When I tried it out, after your hands are off the wheel for several seconds, the screen flashes blue, then Autopilot disengages and the car comes to a stop. Not excusing what it can and can’t do, but you can’t really drive hands-free without intentionally working around very obvious restrictions. The driver can’t claim ignorance.
Naw. I think the worst part is when, in tests (because that’s the only place they can turn it on in most states), it’ll slow down for an object in the road and then decide to floor it.
You can turn it on anywhere in the U.S. I’m not sure if it’s geolocked elsewhere. You might be confusing it with GM, Ford, Mercedes, and other systems which only work on certain stretches of certain roads.
FSD/Autopilot is not good enough to take your hands off the wheel and not pay attention to what the fuck’s going on around you.
What? There’s a video on Tesla’s website right now that says the driver is at the wheel only for legal reasons. There is no other purpose to have a driver.
I’m stumped!
Yeah. How many times did they say ‘Cybertruck is coming out this year,’ only for it not to? I suspect they got it into production by mashing parts together because of consumer protection rules coming into effect. They’re full of shit, and the “legal reasons” are there to make the driver responsible and not them.
And you’re not even supposed to have a cellphone in use while driving in WA; that’s an automatic ticket… though the police have to catch you doing it first.
Same deal almost everywhere… but firsthand experience is that a significant portion of all drivers have their phone out.
Would love to see some proportional crash rates of Autopilot use vs. non-Autopilot use too. People focus on things like crash totals or death totals. 17 deaths is a tragedy, to be sure.
That being said, when the US has over 40,000 auto deaths per year… and this article is telling me only 17 deaths are in any way involved with Autopilot since 2019… I really wonder why this is somehow more outrageous than the ~240,913 other vehicle deaths in the US since 2019. Given that Tesla is about 5% of all autos in the US, I would expect Tesla deaths to be about 12,000 in that period, or 5%.
Are so few people using autopilot? Shouldn’t the autopilot death toll be something closer to the 2000 deaths per year one would expect statistically from Tesla Drivers?
Is autopilot much safer than human drivers? Is it more dangerous?
Is Autopilot + Attentive safer than just attentive?
Is the 40k deaths per year not something that should be considered simply because people stop thinking of so many deaths as a tragedy and just think of it as a statistic?
Is the outrage and focus on car self driving just an extension of human phobia of technology and articles allow for people to have anecdotal confirmation bias?
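To make the back-of-envelope math in that comment explicit, here’s a rough sketch. All the inputs are the figures quoted above (the ~5% fleet share and the death counts), not verified statistics:

```python
# Rough sketch of the expected-deaths arithmetic quoted above.
# All inputs are the commenter's figures, not verified data.
other_vehicle_deaths = 240_913   # non-Autopilot US deaths since 2019 (quoted)
autopilot_deaths = 17            # Autopilot-involved deaths since 2019 (quoted)
tesla_fleet_share = 0.05         # ~5% of all US autos (quoted assumption)

total_deaths = other_vehicle_deaths + autopilot_deaths
# If Tesla deaths tracked fleet share, we'd expect ~12,000 in that period.
expected_if_proportional = total_deaths * tesla_fleet_share

print(f"Total US deaths since 2019: {total_deaths:,}")
print(f"Expected Tesla-involved deaths at {tesla_fleet_share:.0%} share: "
      f"{expected_if_proportional:,.0f}")
print(f"Actually attributed to Autopilot: {autopilot_deaths}")
```

The gap between ~12,000 and 17 is what motivates the questions above: either very few miles are driven on Autopilot, or the comparison isn’t apples to apples.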
I wish there were some sort of safe harbor for publishing the details; whatever liability they deserve should not be affected by being honest and transparent. It would be very useful to know what the car detected or didn’t and what mode it was in, but I’m sure their lawyers will keep any details to a minimum for liability reasons.
If I ever get into a self driving car and let it do its thing, it’s going to be fucking open source.
There is zero chance of me trusting Tesla or any other giant corpo with my life and safety, and the life and safety of everyone around me.
What’s the actual point of “autopilot” if you have to pay full attention and be ready to take control at a moment’s notice?
Sounds like… driving. 🤔
It’s poorly named. A more accurate, less marketing influenced name would be “Adaptive Cruise Control with Lane Assist” for the basic “Autopilot”.
I think it’s more fair to say it is maliciously named. It’s false advertising and leading to issues exactly like the one mentioned in this post.
But Elmo would never do such a thing /s
ambitiously named.
Which my parents’ Lexus had like 7 years ago.
If you try using adaptive cruise control and lane assist functions as a method to keep your hands off the wheel you’re going to be in for a bad time.
I’m sure it’s not all cars, but all the ones I’ve been in over the past 10 years generally only jerk you back to the middle of the lane. They don’t adapt well if you’re cut off suddenly at high speed either.
I find it much smoother and more consistent than my Subaru was, but clearly NOT ready for hands free. If it’s only “autopilot”, it really is just a nicer adaptive cruise control with lane keeping.
There’s a pretty important distinction here: people use “Autopilot” and “Full Self-Driving” interchangeably when they are very different things. It’s important to know which one was in use.
- autopilot is mostly a better adaptive cruise control. You shouldn’t expect to be hands-off any more than with any other adaptive cruise control
- full self driving is much more capable and may tempt people to be hands-off. In ideal conditions it can literally do all the driving. However, it’s not yet ready for all the less-than-ideal conditions, plus it has feedback to keep your hands on the wheel. Whether or not you think this is falsely advertised, you cannot be hands-off without working around the sensors. The driver can’t claim ignorance.
Modern cruise control makes it much less taxing to drive. You can focus on only the necessities while leaving things like lane centering and maintaining a proper distance to the ECU.
Tesla FSD is really just advanced cruise control. The problem is you can’t program out the idiots, and Tesla’s FSD should be billed as advanced cruise control, not something that implies the operator doesn’t need to pay attention.
But if it’s marketed to change lanes, adjust speed, avoid obstacles, stop, signal, and everything else a driver does… then it’s being marketed as far more than “advanced cruise control”, is it not?
Quite literally their website says: “Tesla cars come standard with advanced hardware capable of providing Autopilot features, and full self-driving capabilities.”
“The system is designed to be able to conduct short and long distance trips with no action required by the person in the driver’s seat.”
“When you arrive at your destination, simply step out at the entrance and your car will enter park seek mode, automatically search for a spot and park itself. A tap on your phone summons it back to you.”
They are telling you the car will drive without someone even being in it!
Why are they even allowed to get away with this kind of marketing? Getting people killed along the way.
*May run over a Grandma just trying to walk into Trader Joe’s.
“You must agree to the TOS before driving this vehicle.”
I agree that it’s marketed as fully autonomous and it shouldn’t be. I think the state DMVs should have stepped in and not allowed the vehicle to be registered as anything but having cruise control unless they OK’d it, because there are idiots behind the wheel who are simply ignorant of the fact that they are moving multiple tons of mass at speeds faster than they can react to.
- autopilot is similar to cruise control with lane keeping
- full self driving can in theory do all the driving
- regardless of who was driving or should have been, why didn’t obstacle avoidance avoid the obstacle?
I think y’all are focusing on the wrong feature in this case. Regardless of the limitations of automated driving, or whether it was a human or computer doing the driving, obstacle avoidance is meant to prevent hitting things.
I agree. You should see the tests of these cars slamming into pedestrians. Why they are allowed to be on public roads is beyond me.
FSD? You mean the service Tesla itself named “Full Self-Driving”?
Sure seems like the company is very intentionally misleading its customers, no matter how many disclaimers they have added over the years as more and more people get killed by their cars.
Your point will have more merit when Tesla drops that dangerously misleading name. Until then, they are partially culpable.
I agree. The state should stop allowing new teslas to be registered on their roads until that moniker is corrected. They should prevent advanced cruise control systems from being misleadingly labeled.
I agree there should be a law against it and sanctions, but Tesla is also capable of making these changes.
The fact that they won’t tells you everything you need to know about how Tesla sees itself and its customers.
At the risk of giving you more ammo, there are two different levels:
- autopilot is mostly a nicer adaptive cruise control with lane keeping. I find it works much better than my previous car, but is similar functionality
- full self driving is the more interesting level. In ideal conditions it can do all the driving, door to door. However, it’s not yet ready for all the less-than-ideal conditions, and you really need to stay on top of it. It may be tempting to try hands-free, BUT DON’T
But also, there’s a more general question here. Regardless of who is in control of the car or who should be, obstacle avoidance should have helped avoid running over a motorcyclist. We don’t know the scenario, but if I’m approaching a motorcycle and the car gets worried, it sets off an alert. If I don’t fix it ASAP, the car hits the brakes. That’s what should have happened here.
What was this scenario?
- Was the driver overriding the accelerator?
- Were the vehicles perpendicular, so there was no time to respond?
- Did the car miss it?
It’s named after airplane autopilot, and airplane autopilot doesn’t do everything either; you need to be ready to take control at a moment’s notice.
What’s the point of cruise control if you still have to pay attention? What’s the point of adding adaptive cruise control and lane assist if you still have to pay attention?
They’re all things that help alleviate some of the monotonous things one has to do while driving. Self-driving also has the benefit of, in the future, completely relieving human drivers.
In fairness, cruise control is designed for maintaining a speed when you are just going straight for a long time, in a situation where other vehicles are going the same speed (i.e. on a highway). Cruise control isn’t designed to navigate around pedestrians, turn lanes, approach intersections, or do anything else that would put people in danger.
Of course, you still have to know when to stop, but that would be during situations where cruise control would NOT be appropriate.
Tesla wants people to use these features in cities, where you’ve got kids and people walking around. Totally different, and I think they should be held accountable for how they’ve marketed these features.
Self-driving also has the benefit of, in the future, completely relieving human drivers.
Yes, and no. The infrastructure would need to be designed for self-driving vehicles, or you get too many unpredictable variables that aren’t properly accounted for. As they are today, they shouldn’t be allowed on public roadways.
We had an autonomous bus one municipality over that ran off the road and hit a tree and critically injured the operator. God forbid this happened near a school. A human driver wouldn’t have done that unless they were impaired.
You claimed there was no point if you have to pay attention. I was responding to that and pointing out there are all kinds of things that currently assist in driving that still require paying attention. Self-driving just replaces more of that, just as adaptive cruise control replaced more of it than plain cruise control did.
Liability is a whole different question. Although, I have to laugh at the idea of humans not making mistakes.
We’re talking about features that are intended for two very different purposes.
Cruise control is designed so that a driver doesn’t have to keep their foot pressed on a gas pedal for hours on end (causing physical discomfort or injury) if they are going a constant speed. You are still required to drive, so cruise control was never an alternative to driving.
But these marketed self-driving features are made to replace the act of driving, while still expecting that the person in the vehicle has their full attention and control over it when the car decides to break bad.
There’s a massive difference, IMO.
Seems like “assisted-driving” might be a better term, even if it results in fewer sales. 😂
Again, “You claimed there was no point if you have to pay attention. I was responding to that and pointing out there are all kinds of things that currently assist in driving that still require paying attention.”
I agree with you that it should not be marketed as SD and that there is a massive difference between the two. But in the way I compared them, in response to the argument you made, those differences make no difference.
A bunch of years ago, I was driving south on Aurora (in Seattle), and right after the bridge I witnessed an older, smaller import drive up on a motorcyclist. I got out and helped tip the car over so we could pull the lady on the motorcycle out. The whole time the dude in the car was just being a fucking idiot. Wouldn’t help us at all. Wouldn’t open the doors or anything. So once we had enough people we just fucking tipped that car on its fucking side and waited for the cops.
That would have been impossible with the amount of low-center-of-gravity weight Teslas have.
What did the moron do after that? Just harumphed in their car until they got arrested?
Yeah, he opened the door and climbed out once the cop arrived. I think he was afraid we’d lynch him or something. Someone took his keys at one point. I remember screaming at him through the window to get out of the car and help us, but he just dead-stared forward the whole time.
Ah. Glad to know I can put a damper on my excitement for motorcycle riding season to be back…
Granted, this is possible with any car in any state. Just need to make sure I’m explicitly on the lookout for teslas driving behind me.
Be careful out there! I work in orthopedics and rehabilitation at my state’s only Level 1 trauma center, and motorcycle MVAs have always taken up spaces in the ICU. However, since COVID there has definitely been an increase.
When I first started practicing, the majority of bikers were in the ICU because of their own behavior, whether that be unsafe driving or lack of protective gear. Now it seems everyone is getting mowed over at intersections by SUVs.
I think it’s a combination of more distractions for drivers mixed with the ever-growing size of American vehicles. We’re seeing the same with pedestrian injuries as well; vehicles are just too massive nowadays.
Granted, this is possible with any car in any state.
But it’s more likely in a car where the driver may have been misled into believing the myth that the car will drive itself safely without them.
If someone’s driving a Ford, they’d have to be certifiably insane to believe it’s ever safe to take your eyes off the road and hands off the wheel for long periods of time and expect to not have an accident. Insane to the degree they’d have never gotten their license.
If they’re in a Tesla, they just need to be a stupid consumer to believe that.
Ford’s BlueCruise offers hands-free driving on some highways.
But it’s more likely in a car where the driver may have been misled into believing the myth that the car will drive itself safely without them.
I’d wager the driver knew full well that the car does not drive without them. While “Autopilot” is a very poor, marketing-influenced name, unless this was the driver’s first time using it and he had only used it for five seconds before the accident, he knew perfectly well what the feature was.
You have to try to game the car for it to allow you to take your hands off the wheel. It’s pretty sensitive to movements and if your hands are off the wheel you get visible and audible alerts before the car disengages the cruise control / lane assistance.
This seems like a case of a reckless driver who killed someone and is attempting to push blame or form some excuse for their negligence. The driver not paying attention to the road is the danger here, no matter what car they’re driving.
Tesla drivers are quickly taking the #1 spot for worst human AND “AI” drivers.
When I got my first motorcycle I was in love … until I had to ride on city streets with assholes. Got to the point I was happier hopping on the back of someone else’s bike so I could just enjoy the ride.
Highways were, and still are, the best rides tho.
No wonder Tesla reduced the cost of this program. It’s absolute shite.
It’s the equivalent of Cruise at this point. Get this shite vehicle off the road already