
Thegarbagegamer97 t1_j954b19 wrote

This is the problem with AI driving. It knows the rules, and it knows how to respond to things that follow the rules; however, it struggles to compensate when something goes against what it considers to be the rules. That requires on-the-spot thinking, and we still seemingly have a ways to go before it can replace humans at that.

68

Perfect-Height-8837 t1_j957xuy wrote

This is the main reason I want to buy a Tesla. It doesn't even need to be in self-driving mode, but the media will report my death to the world just because I was driving a Tesla.
No other car manufacturer can offer this level of post-mortem notoriety.
You never read headlines such as "Some nobody dies when his Skoda crashes into a fire truck."
But put Mr. Nobody in a Tesla and he's worth reporting about.

90

writingt t1_j95n96j wrote

People crashing their cars has been happening for over a hundred years and, as horrible as it is, is nothing new. A car crashing itself and killing its driver is much more newsworthy.

30

Velocity_LP t1_j968du1 wrote

> A car crashing itself and killing its driver is much more newsworthy

Are you confusing this article with some other incident? Or did you find another source I must've missed? There's zero mention of Autopilot being used, or anything else in this article that suggests the car caused the accident.

11

Perfect-Height-8837 t1_j96zlk6 wrote

I think you may have fallen for the false assumption the title was leading you toward. They want you to assume Autopilot, only for you to learn it was just a bad driver. Therefore, Tesla is still good.

−1

LogicisGone t1_j95o6o7 wrote

I mean, my local news covers pretty much every car accident with a death, and I don't remember a single instance of anyone hitting a parked fire truck, ever, let alone also killing someone and injuring four firefighters. I would actually call this newsworthy.

12

Perfect-Height-8837 t1_j96zahq wrote

Do they mention the make of the car in the headlines?
Usually not.

Funny thing is, Murdoch's son is on the board of directors at Tesla.

1

jedi_trey t1_j95sudx wrote

Well, that's some hard-hitting data.

−2

hwangjae45 t1_j95flpf wrote

https://youtu.be/jiKzoO3tuSw

They say in the video that they do not know if Autopilot was on. I'm not a Tesla advocate or anything, but let's wait until the facts are out.

16

Neospecial t1_j95gjji wrote

Isn't it always "OFF"? As in, intentionally turning itself off seconds before a crash to avoid liability? I don't know and don't care to find out; it's just something I read or heard somewhere at some point.

I'd not trust an AI driving me regardless.

18

hwangjae45 t1_j95gyg8 wrote

From what I know, Tesla cars keep a record of when Autopilot turns on and off, and from what I've seen, it seems to record that it was on. With that said, I think Tesla had a recall due to their Autopilot, so it does seem to be a huge problem.

2

razorirr t1_j96c9l7 wrote

Nah. NHTSA requires reporting of all accidents up to 30 seconds after it turns off.

So if you think it's turning off so the crash doesn't get counted, that means you think it's not able to avoid crashing, but is somehow able to realize it's going to crash half a mile up the road, turn itself off (which it notifies you it's doing), and then the driver ignores that Minority Report-style self-shutoff, doesn't take over, and crashes.
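For illustration, here's a rough sketch of how a reporting window like that works. The names and code are made up for the example, not pulled from NHTSA's or Tesla's actual systems:

```python
from datetime import datetime, timedelta
from typing import Optional

# Rough sketch with made-up names: a crash is attributed to the driver-assistance
# system if the system was active at impact, or was active within the reporting
# window before impact, so shutting off a second before the crash does not keep
# it out of the statistics.
REPORTING_WINDOW = timedelta(seconds=30)  # the 30-second window mentioned above

def counts_as_autopilot_crash(disengaged_at: Optional[datetime], impact_at: datetime) -> bool:
    """True if the system was engaged at impact or disengaged within the window before it."""
    if disengaged_at is None or disengaged_at >= impact_at:
        return True  # still engaged when the crash happened
    return (impact_at - disengaged_at) <= REPORTING_WINDOW

# Example: the system shuts itself off one second before impact -- still counted.
impact = datetime(2023, 2, 18, 4, 0, 0)
print(counts_as_autopilot_crash(impact - timedelta(seconds=1), impact))  # True
```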

2

TenderfootGungi t1_j97bg88 wrote

They were caught turning it off a split second before most crashes and then stating something like "Autopilot was not engaged." In many cases it was engaged, though, less than a second before the crash. They have now started asking whether it was engaged within so many seconds before a crash (e.g., 10 seconds, but I cannot find the exact figure).

−1

GarbageTheClown t1_j97l6q1 wrote

Do you have a source for that? For as long as I can remember, they count anything within the last 5 seconds; it's on their website.

1

JohnPlayerSpecia1 t1_j95i94u wrote

Not to worry, Tesla black boxes will always "turn off" Autopilot just seconds before any crash to shift blame away from Tesla.

7

ryan_m t1_j95urf9 wrote

They absolutely do not, and this gets repeated constantly. Tesla counts any crash that happens within 60 seconds of AP/FSD disengaging, which is longer than NHTSA requires.

−8

code-sloth t1_j95z1ur wrote

It gets repeated constantly because it's true.

https://www.washingtonpost.com/technology/2022/06/15/tesla-autopilot-crashes/

> Tesla's vehicles have been found to shut off the advanced driver-assistance system, Autopilot, around one second before impact, according to the regulators.

4

ryan_m t1_j95zsyt wrote

Read the claim I responded to fully, and then read what you posted. The first half, that it turns off, is true, but the core of the claim (that it's done to shift blame away) is entirely bullshit, because the cutoff for reporting is 30 seconds, and Tesla counts a minute before.

It makes sense that Autopilot will shut off before a crash if you think about it for more than a couple of seconds. What behavior do you want a system like that to have when it encounters a situation it can't handle? It should alert the driver and disengage. If you're being a responsible driver, you should be paying attention the entire time anyway and be ready to take control specifically to avoid things like this.
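As a purely illustrative sketch of that alert-and-disengage behavior (the confidence threshold and action names are invented for the example, not taken from any real system):

```python
from typing import List

# Purely illustrative: the "alert the driver and disengage" behavior described
# above. The confidence threshold and action names are invented for the example.
def assist_step(scene_confidence: float, driver_attentive: bool) -> List[str]:
    """One control tick of a hypothetical driver-assistance loop."""
    if scene_confidence >= 0.8:  # assumed "situation it can handle" threshold
        return ["continue_assisted_driving"]

    # Situation the system can't handle: warn the driver and hand back control.
    actions = ["sound_takeover_alert", "disengage"]
    if not driver_attentive:
        # If the driver ignores the alert, escalate instead of carrying on blindly.
        actions.append("slow_down_with_hazards")
    return actions

# Example: low confidence and an inattentive driver.
print(assist_step(0.4, driver_attentive=False))
# ['sound_takeover_alert', 'disengage', 'slow_down_with_hazards']
```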

The anti-Musk circlejerk has gotten so insane at this point that people are no longer thinking about what they’re saying.

6

Raspberries-Are-Evil t1_j96hp1d wrote

It doesn't matter if the driver was using Autopilot. The driver IS RESPONSIBLE. Teslas are not "self-driving." Self-driving is not legal yet; the driver IS responsible.

1

smoke1966 t1_j96okqm wrote

Always the number one problem with programming: making the list of EVERY possible scenario that could happen. You have to think of anything that could possibly happen ahead of time.

2

Walui t1_j974x5s wrote

Lol that's the whole point of machine learning. You have no idea what you're talking about.

1

smoke1966 t1_j9790e7 wrote

If it is programmed to learn correctly... I've done programming, and there's always the one thing you forgot. If you don't believe that, this is just a prime example of the problems with these cars.

1

GarbageTheClown t1_j97lf96 wrote

ML and traditional programming are not really in the same bucket, and no, it doesn't work like that.

1

whyvas t1_j956v73 wrote

Where does it say the car had Full Self-Driving enabled?

−14

kuahara t1_j956wrq wrote

AI was not navigating this vehicle; a human was.

Also, we didn't need to know the make of the vehicle. The outcome would have been the same in a Chevy, Ford, BMW, etc. The reporting agency is trying real hard to provoke a specific reaction with a completely irrelevant detail.

−29

Thegarbagegamer97 t1_j957au6 wrote

Unless you have a source, all the articles I can find indicate it hasn't been stated one way or the other yet.

27

niceguybadboy t1_j955p98 wrote

But...for every dumb mistake an AI driver makes, human drivers probably make fifty.

−32

Thegarbagegamer97 t1_j95684x wrote

Likely so, but in cases of mistakes like this, if self-driving was in use, I have to go with human drivers, simply for the ability to make sense of the situation and at least attempt to navigate, slow down, or stop as needed. The AI seems to be at the beginning stage of development where it says, "Hmm, that looks like road, so full speed ahead," whereas most people, assuming they aren't distracted or under the influence of some substance, will see flashing lights and a big object blocking the road and try to navigate around the fire truck, not plow through it like they're in some action film. Some day I'm sure self-driving will get there, but we are a LONG way off.

7

ViciousNakedMoleRat t1_j958sau wrote

With automated driving, the question is simply: how much more do we value an overall reduction in crashes compared to having to live with crashes that a human driver would've easily avoided?

On a societal level, we should theoretically be in favor of self-driving cars as soon as they cause fewer crashes than human-operated cars – even if it's just a couple of percent.

However, on a personal level, it probably takes a much more significant margin to convince many individual drivers. That's because a vast majority of drivers think of themselves as above average.

The perceived stupidity of automated-driving accidents, like driving straight into objects or coming to a stop in the middle of the road, makes them particularly likely to be picked up by the media, which raises people's exposure to these issues. The hundreds of daily crashes caused by inattention or other human error just slide by without being noticed.

This creates a situation similar to fear of flying. It's much safer to fly than to drive a car, but plane crashes become huge news stories, which causes some people to develop an irrational fear of flying while having no issue with driving.

−9

Thegarbagegamer97 t1_j959416 wrote

Self-driving will be a wonderful thing one day, but as long as it has the potential to break laws by ignoring the rules of certain stretches of roadway, and to plow straight into a stalled or stopped vehicle like there's nothing there, I think I'll hold off on it just a little longer and keep my own judgment. Humans aren't perfect drivers, and I don't expect AI to be either. But I'd prefer not having to babysit the entire time simply because it can have a tendency to suicide-rush a fire truck or go straight from a turn-only lane.

12