Hacker News

This is the first of many such cases to come.

It's unreasonable to hold someone accountable for a "self-driving car" that suddenly decides, at a split-second's notice, that it can't cope with driving.

Of course this is extra bad because it's an experimental car, but in my opinion it's the same with those Teslas on the road now that do the same thing.



> It's unreasonable to hold someone accountable for a "self-driving car"

It is not unreasonable at all. She had one job to do - look at the road. She failed it because she felt that her entertainment was more important than doing her job. She picked up her phone and started streaming videos. She failed at her one job, plain and simple. She knew everything about the job and still chose to watch some videos and risk lives.


I said it was slightly worse in this case and sibling commenters have addressed this issue.

The experimental car shouldn't have been on the road at all if the only thing separating it from killing people is someone who is expected to maintain concentration for hours/days while simultaneously not actually doing anything.


How do you expect humanity to create a self-driving car then? Somehow magically develop it from 0% to 100% in a lab, and then let it out on the street? That is never going to work; it needs testing on public roads. What is your plan, then? How do you propose FSD cars should be developed?


For starters, Uber could have just not disabled the existing failsafe mechanisms that the car had.

Then, if we start with the expectation that a human in the loop is necessary for live testing, we could have two. It reduces the impact of independent failures. It also adds some social pressure to avoid negligence. It also provides overlapping coverage, as a lifeguard in this thread has pointed out.

Humans are fallible. Accept that, and design systems to be safe despite individual errors.
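Back-of-the-envelope, the two-monitor argument above can be sketched like this (the lapse probability is a made-up illustrative number, and the independence assumption is an idealization; in reality shared boredom and social dynamics correlate the failures):

```python
# Probability that a dangerous moment goes unnoticed, assuming each
# safety driver lapses independently with probability p at that moment.
p = 0.1  # assumed per-driver lapse probability (illustrative only)

one_driver_miss = p        # a single monitor misses it
two_driver_miss = p * p    # both independent monitors must lapse at once

print(one_driver_miss)          # 0.1
print(round(two_driver_miss, 4))  # 0.01 -- an order of magnitude safer
```

The point is only that redundancy helps most when failures are independent; the social-pressure effect mentioned above works by keeping each driver's individual p low in the first place.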

Driving a car is not the same as monitoring a self-driving car. There will be differences in attentiveness. Stop equating them.


Let's not act like it's a foregone conclusion that FSD cars will exist on today's public roads.

In my opinion, safe FSD cars are not possible with current technology on existing public roads - so either the technology has to improve by orders of magnitude, or the roads need to be modified significantly.

The gung-ho experimentation that is going on in public is in my opinion very dangerous and should be stopped, and this case is a perfect example of why.

Both the road infrastructure and other drivers are too crappy and unpredictable for FSD cars to be viable and safe.

Also, this business of holding someone accountable who is not actually controlling the vehicle is ridiculous, and is surely something that will eventually be settled in court.


What I find puzzling is that I keep hearing people say that it's impossible for people to stay alert while not being the driver, when there's actually a pretty popular term to describe that very thing: a backseat driver[1].

If you go on a long multi-day road trip, you might even end up relying on a passenger's feedback in a moment of tiredness.

[1] https://en.wikipedia.org/wiki/Back-seat_driver


Did you actually read that page? A backseat driver doesn't trust the actual driver.

>A backseat driver may be uncomfortable with the skills of the driver, feel out of control since they are not driving the vehicle

Are you saying that the meatbag in a "self-driving car" is equivalent to a backseat driver?

Because if you are, then "self-driving cars" will never be viable because the humans inside don't actually trust them.


Trust is beside the point. The actual point is that they are able to pay attention to the road, even for prolonged amounts of time, and react to perceived danger, despite not being the ones who are actually in control of the car.

The claim I'm disputing is the one that says people are somehow incapable of paying attention to the road for extended periods of time unless they are in control of the vehicle.

I might agree that a reaction might be jerky and panicky, but then again, it would by definition be so regardless, due to the unexpected nature of accidents.


So other road users' safety is reliant on jerky and panicky responses to perceived danger as well as real emergencies.

That sounds like a nice safe system.

For the record, I don't agree. I feel that the average person who would choose to use a self-driving car will tend to be overly relaxed and trusting, not what I'd call a backseat driver.

The kind of person who is a backseat driver wouldn't trust the car's driving and would prefer to control it themselves, and thus wouldn't be using the self-driving feature in the first place.



