MidwestAmMan t1_j95ptb1 wrote

We need to know if it was using FSD

13

Raspberries-Are-Evil t1_j96hs56 wrote

Why does it matter? The Driver is responsible.

5

BasroilII t1_j96kk39 wrote

Absolutely agree that ultimately it's the driver's responsibility.

It's more to shut up the nitpickers who are making this 100% about self-driving (or just dogpiling Tesla) without knowing the actual cause.

2

vbob99 t1_j9eyxld wrote

Or those defending self-driving, assuming it was off in this crash without knowing so.

1

BasroilII t1_j9gnava wrote

Given the track record, that is the more likely. However, I'm willing to wait and see what the actual fault was either way.

1

vbob99 t1_j9i751t wrote

I'd say it's about equally likely both ways.

1

MidwestAmMan t1_j978fza wrote

It’s a sticky wicket tbh. The story of the Tesla that went over a cliff with everyone surviving was incredible. Teslas are clearly much safer on average. But sudden braking, battery fires, and “FSD” causing cars to strike emergency vehicles are serious concerns.

If humans are a greater risk than FSD, maybe FSD can be modified to require the driver to take over when approaching emergency vehicles. But we need to know if FSD was engaged here.

1

Raspberries-Are-Evil t1_j97fv7z wrote

> But sudden braking, battery fires, and “FSD” causing cars to strike emergency vehicles are serious concerns.

As a Tesla owner myself, I understand that I am in control of the car at all times. This is no different than some idiot on cruise control slamming into a stopped car in front of him.

FSD requires your hands to be on the wheel. In fact, every 30 seconds or so, it reminds you, and if it doesn't detect your hands on the wheel through a slight torque applied to the wheel, it will disengage.

So even IF the driver was using FSD, it's his fault for not slowing down when approaching a fire truck.
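The check described above works roughly like a watchdog timer. As a purely illustrative sketch (not Tesla's actual implementation; the interval, grace period, and torque threshold here are assumptions for the example), the logic might look like:

```python
# Illustrative hands-on-wheel watchdog: roughly every 30 seconds the
# system expects slight steering torque from the driver, and disengages
# assistance if none is detected in time. All constants are assumed.

NAG_INTERVAL_S = 30.0     # assumed interval between hands-on checks
GRACE_PERIOD_S = 10.0     # assumed time allowed to respond to the nag
TORQUE_THRESHOLD = 0.5    # assumed minimum steering torque (arbitrary units)

def monitor(torque_samples):
    """torque_samples: list of (timestamp_s, torque) readings in time order.
    Returns "engaged" if the driver kept responding, else "disengaged"."""
    last_confirmed = 0.0
    for t, torque in torque_samples:
        if abs(torque) >= TORQUE_THRESHOLD:
            last_confirmed = t  # slight wheel torque counts as hands-on
        elif t - last_confirmed > NAG_INTERVAL_S + GRACE_PERIOD_S:
            return "disengaged"  # driver ignored the reminder too long
    return "engaged"
```

The point of the sketch is that the system only ever sees torque on the wheel, not whether the driver is actually watching the road, which is why a driver can satisfy the check and still fail to brake for a stopped fire truck.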

3

GarbageTheClown t1_j97kt4n wrote

If FSD knew that it was approaching emergency vehicles, then it would know it needed to stop. The problem is it doesn't know it's approaching emergency vehicles.

2

WirelessBCupSupport t1_j98kopd wrote

I watched this on the news. They couldn't determine the cause, as the driver died at the scene, but the passenger was alive and airlifted to the hospital. The firefighters that were there said this isn't the first time they've been hit. And while dealing with the crash, they almost got hit again!

3

razorirr t1_j966747 wrote

From June 2021 through May 15th, 2022, there have been 2 crashes involving ADAS of any type, from any brand, into emergency vehicles.

https://www.nhtsa.gov/sites/nhtsa.gov/files/2022-06/ADAS-L2-SGO-Report-June-2022.pdf

So you can either believe the government, which is forcing the makers to publish this data, that this is a non-issue. Or you're just here talking about it because you don't like autonomous vehicles, and you're being dishonest with your comment anyway.

0

YEETMANdaMAN t1_j96baua wrote

I love how every time you post that link people stop replying to you. I thought everyone trusted the NHTSA’s collision and recall data?

−1

razorirr t1_j96cpv8 wrote

People only trust government reports when they back what they already think. The second they don't, they don't trust the government.

−2