Hacker News

A lot of people in here saying it is not possible to drive safely with partial self driving. I wonder, how many of those people have actually driven a car with autopilot?

I have autopilot on my car, and it definitely makes me a better and safer driver. It maintains my distance from the car in front and my speed while keeping me in my lane, so my brain no longer has to worry about those mundane things. Instead I can spend all my brainpower focused on looking for potential emergencies, instead of splitting time between lane keeping/following and looking for emergencies.

I no longer have to look at my speedometer or the lane markers, I can take a much broader view of the traffic and conditions around me.

Before you say it's impossible to be safe driving with an assistive product, I suggest trying one out.



I would argue that partial self driving is an irresponsible product not because it's impossible to drive safely with it, but because so many people will use it as an excuse to pay little to no attention to the road. If you personally are a responsible driver and even a better driver with it, that's great - but most people probably aren't going to use it the same way, especially those without much of an understanding of the technology - and especially given the way that Tesla markets it.


The technical term for this is Risk Compensation:

"Risk compensation is a theory which suggests that people typically adjust their behavior in response to perceived levels of risk, becoming more careful where they sense greater risk and less careful if they feel more protected."


Reminds me of this kind of thing:

https://usa.streetsblog.org/2017/09/13/wide-residential-stre...

I was first introduced to "wide streets in neighborhoods are more dangerous than narrow" on HN years ago. (I don't think it was the linked article, but that was the first one that came up just now after a search :P )

Since having read that, I've actually noticed how true this is, at least to me anecdotally. When I'm driving in a neighborhood with crowded streets, I can't bring myself to go over 15MPH, much less over the speed limit (typically 25 in neighborhoods in the US).

Wide streets give a sense of security. So I feel like people are less likely to pay attention going around bends, parked cars, etc, than if they didn't have that sense of security.


Also moral hazard, kinda.


I trust someone posting here to drive safely, and to be safer with it.

Forget the average, how about the bottom 10-20% of all drivers? I don't trust the bottom 10% driving with "Autopilot" at all, zero. They are going to use it to go on autopilot while driving, exactly as the marketing implies. I mean, there must even be people who think the car itself is conscious. Car has advanced AI, must be conscious.

To think otherwise is just highly underestimating how clueless some people are.


I don’t trust those people without Autopilot either. Is that the point?


Yes there have been stories about irresponsible people. Do you have any evidence that this is the common case? The aggregate evidence seems to suggest reduced accidents and reduced fatalities.


There was a study that adaptive cruise control and lane assist leads to more people speeding: https://www.iihs.org/news/detail/adaptive-cruise-control-spu...

They then use a "common formula" to show that this leads to more fatal accidents, but didn't actually study actual crash data.


Absolutely not true as a blanket statement. Maybe if the driver monitoring is so lax that you could conceivably trick the car into poorly driving itself, but the system I use, Comma [1], has incredibly strict driver monitoring.

There is absolutely no doubt I'm a safer driver with Comma than without it. I'm still in control, but Comma not only allows me to expend less effort driving (which allows me to stay alert over longer periods of time), but also be much less emotional when driving. I'm pretty convinced that a large percentage of accidents are caused by frustrated or bored drivers doing crazy things that you just don't feel the urge to do with the assistance of self-driving.

1: https://comma.ai/


I use the same system as you do, and I've noticed that if you mention that system's name, you tend to get downvotes. I haven't yet figured out why, not sure if there is a bot or just a lot of Tesla fans who downvote the mention of our system.

Edit: After one minute I got a downvote.


It sounds like you're advertising it. "The future can be yours, today. For the introductory monthly price of 79.99. Sign up here[1]"


This doesn't even make sense. Simply mentioning the name of a product I use is not advertising. Otherwise, is every person here who mentions Tesla advertising too?


> Simply mentioning the name of a product I use is not advertising.

Sorry, I actually meant to refer to Birken's comment above. Advertising might not have been the best word - astroturfing? If you read their comment but replace "comma" with "tesla" it still reads as spammy.

Yours was fine (though discussing downvotes will always get you downvotes, my comment included).

> Otherwise, is every person here who mentions Tesla advertising too?

Only the ones who needlessly sing the praises of the Tesla autopilot in barely-related threads.


Well said, that last bit especially. The regulations on medical devices are on how the manufacturer markets it. Should be the same for driving technology.


> so many people will use it as an excuse to pay little to no attention to the road

I guess we have to look at the results here to judge whether too many people are not paying attention. Hopefully the investigation will reveal whether Autopilot collisions with emergency vehicles are significantly more or less frequent than those from vehicles being driven the traditional way.


This is a question that is answerable with the right data – we can just see if it's safer or not.


Doesn't the data show that cars with assistive technologies are in fewer non-fatal and fatal accidents?


Tesla marketed Autopilot != responsibly implemented assistive safety systems.


It looks like the federal government is beginning to collect and analyze relevant data, which will be interesting.

https://www.latimes.com/business/story/2021-06-29/nhtsa-adas...

Tesla released data in the past, but that’s quite suspect as they have an obvious agenda and aren’t known for open and honest communication.

https://www.latimes.com/business/autos/la-fi-hy-tesla-safety...


I once talked to a guy who bragged about having Autopilot drive him home when he's drunk


One would think that with "autopilot" there would be a limit to speed and an increased "caution distance" the vehicle maintains with everything.

I also think there should be dedicated lanes for self driving cars..

A very good friend of mine was a sensor engineer at google working on virtual sensors that interacted with hand gestures in the air... and is now a pre-eminent sensor engineer for a large japanese company everyone has heard of...

We drove from the Bay to northern California in his Tesla and it was terrifying how much trust he put into that car. I got carsick and ended up throwing up out the window...

Knowing what I know of machines, working in tech since 1995 -- I wouldn't trust SHIT for self-driving just yet.


Something tells me the majority of people with partial self driving aren't using it as a means of staying more focussed on the road. There's a pesky little device buzzing around in everyone's pocket that is more likely the recipient of this newfound attention.


I do like using Tesla's Traffic Aware Cruise Control, but autopilot takes over enough of the driving I know I can't pay meaningful attention for more than 10 or 15 minutes. I just did a 1600 km round trip with traffic aware cruise control on the vast majority of the time, but I didn't turn on autopilot once.

Autopilot bothers me for a number of reasons. Fundamentally it's a poor driver, spending time in people's blind spots unnecessarily, braking and accelerating in rather abrupt ways, and just generally acting like a teenage driver who just got their license. It simply doesn't practice defensive driving.

I also spend a lot of time driving on undivided rural highways. These are highly dangerous roads, with closing speeds in excess of 200 km/hr at times. In those situations autopilot drives far too much according to the strict rules of the road and can't adjust to the situation. It doesn't use the lane space to leave additional room and it doesn't give a wide berth to cyclists.

It also bothers me deeply that one of the ways to override autopilot is to make a steering input, and that's also the indicator Tesla uses to determine that the driver is participating. I haven't used autopilot much for the reasons above, but the few times I did activate it, the steering input required to tell the car I'm there was also enough to move the car several feet in the lane due to very direct steering. It feels like I'm just fighting the car. That is deeply unnerving, since the force required to override autopilot feels like enough to jump nearly half a lane and cause a collision.


I don't drive a Tesla, I use a different autopilot system. The one I use isn't as aggressive as Tesla, so it doesn't exhibit these behaviors but also requires more manual takeovers. Also it uses facial recognition so you don't have to touch the wheel (but you can touch the wheel and steering manually doesn't disengage it, only the brake and gas do).


Since autopilot is a specifically Tesla term for their L2 driver assistance features, you may be better served in this thread referring specifically to the system you have. Most other L2 systems require so much more driver input and attention that many of the incredulous replies are likely assuming you're talking about Tesla specifically.

I'm guessing GM Supercruise since it's the only one I'm aware of that uses eye tracking in production (though Tesla claims to have enabled that in the US just recently. Personally I'm not sure their camera placement can really do proper eye tracking). Supercruise's disengagement rate is low though, generally much lower than Tesla's.

I do like supercruise from what I've seen of it (haven't had a chance to actually use it since GM seems determined to waste their advantage by not rushing it into every car they make).


I don't mention the one I use because every time I do I get immediate downvotes. But I use the system from Comma.ai which is based on Openpilot, which refers to itself as open source autopilot.


> It maintains my distance from the car in front and my speed while keeping me in my lane, so my brain no longer has to worry about those mundane things.

I've been driving for 25 years (cars, trucks, trailers, standard and auto transmissions), and I have never once thought to myself, "I'd be such a better driver if I didn't have to pay attention to my speed, lane keeping, or following distance." Why? Because those mundane things are already on autopilot in my brain.

Posts like yours are so absurd to me that I can't help but think shill.


I've been driving for 29 years, and I never thought those things either until I got autopilot (and I don't have a Tesla BTW, I have a different autopilot system). While those things were on autopilot in my brain, they still took brain power. It's so much more relaxing not worrying about them.

It's like people who do math by hand and then get a calculator.


Maybe you're a great driver then. Have you ever shared the road with someone who was a terrible driver?


Yes. Those are people who think they can go hands free, use their phone or watch a movie while on autopilot.


They're also likely the same type of folks who fall for marketing like "full self driving" without investigating critically, or even reading through the analog and digital shrinkwrap they have to tear through to get to their date or appointment or whatever on time.


> my brain no longer has to worry about those mundane things

I would be terrified to share the road with someone of this mindset. Your vehicle is a lethal weapon when you are driving it around (assisted or otherwise). At no point can someone claim that a tesla vehicle circa today is able to completely assume the duty of driving. You are still 100% responsible for everything that happens in and around that car. You had better have a plan for what happens if autopilot decides to shit the bed while a semi jackknifes in front of you.

The exceptions are what will kill you - and others - every single time. It's not the boring daily drives where 50 car pileups and random battery explosions occur. Maybe your risk tolerance is higher. Mine is not. Please consider this next time you are on the road.


This...is entirely the point OP is making. You get more brain power to watch out for the semi jackknifing into you, the car switching lanes without signaling, the truck about to lose a bucket or chair from its bed. This is stuff you may not catch when you're spending your brain power focusing on staying between the lanes and keeping your distance between the car in front.

When you automate away the mundane, exceptions are much easier to catch.


The problem is that it isn't automated away.

Self-driving cars will take that stuff over 99% of the time, but the 1% where it screws up is the dangerous part. There are plenty of examples where a self-driving car seemingly randomly seems to go completely haywire, without any obvious reason.

Instead of spending brain power on driving properly, you now have to spend brain power on looking at what you should be doing _and_ checking if the car actually agrees and is doing it.

Staying 100% focused without actually _doing_ anything is incredibly difficult. Many countries intentionally add curves to their highways to keep drivers alert: having a pencil-straight road for hundreds of miles really messes with the human brain.


> You get more brain power to watch out for the semi

That doesn’t help at all if you’re reading a book or playing on your phone, both of which are things I observe Tesla drivers doing pretty often when I’m in the Bay Area.


Do you get concerned about mathematical errors because the computer is doing the calculation instead of someone doing it by hand?

It's the same thing here. The computer is assisting me so that I can take care of the novel situations, the exceptions if you will. I can pay closer attention to the road and see that jackknifed trailer sooner because I wasn't looking at my speedometer to check my speed.

And I don't have a Tesla, I use a different autopilot system.


You're misunderstanding him at best and projecting at worst.

He's saying that he no longer has to worry about those things the same way cruise control lets you not worry about the speedometer needle and dedicate more of your attention budget outside the car.

Of course you can be an idiot and spend it on your cell phone but that's not really a failure mode specific to any given vehicle technology.


I drove a friend's Model 3, and within five minutes of driving on autopilot it got confused at an intersection and tried to make a pretty sudden 'lane change' to the wrong side of a divided road.

Obviously that's a single anecdote, and I don't know if it would have gone through with it because I immediately corrected, but that was my experience.


I bet it would have made that mistake.

The question is whether a system that absolutely requires that you pay attention going through intersections (which you should obviously do) is safer in aggregate than not having those features enabled at all in those situations.

E.g. are weird lane changes that people don't catch happening more frequently than people zooming through red lights because they weren't paying attention. Only the data can show that, and Tesla should share it.


I have a car that does those things as well, and I use it a lot... but it's not Tesla, and its manufacturer doesn't refer to it "autopilot" or "self driving", but rather "advanced cruise control".


Agreed. I have a rudimentary radar enhanced cruise control in my minivan and I've found it's really helpful for maintaining a safe stopping distance while driving.


That's not what they bloody sell it as, though. That's the key.


You might be giving too much credit to your ability to pay attention to your surroundings. It is possible that looking around as a passenger actually increases risk. There's no way to tell other than looking at the data.

Personally, I tend to turn off things like lane keeping because I end up having to babysit them more than I would like. It doesn't always read the lanes correctly, though I have not tried Tesla's technology yet.


As someone who has used Autopilot extensively, I can tell you: you only have the illusion of enhanced safety. In reality, parts of your brain have shut down to save energy, and you've lost some situational awareness, but you can't tell that's happened.


If you can’t tell, how do you know this is the case?


I rented a Model X with the latest FSD a few weeks ago, and even simple things like lane detection are very inconsistent and unpredictable.

I don't know if this "AI" has any sort of quality control, but how difficult is it to test whether it detects a solid white line on the side of the road in at least 6 out of 10 tries?

it also tends to suddenly disengage and pass control to the driver at most dangerous parts of the trip e.g. when passing other car in narrow lane, etc.

This "driver assistant" is a series of disasters in the making.


It is possible that you've learnt to drive like that and that it works for you.

But I feel that this depends on the type of driver and their personality. I, for one, have never felt comfortable with cruise control, even adaptive versions, let alone partial self-driving. I have always found that I am more comfortable when I am the one driving, rather than trying to make sure that the adaptive cruise control can come to a complete stop in an emergency. Perhaps I'm just a little untrusting and paranoid :).


There's not a small part of your psyche that tells you it's ok to drive while tired, or all the way home from that Las Vegas trip because the technology is so good?


The thing is, I get a lot less tired when I'm driving now, because I get to focus only on the novel stimulus (possible emergencies) and not the mundane.

But no, I don't trust it to drive itself. If I'm tired I won't drive, regardless of autopilot.


> I no longer have to look at my speedometer

I'm curious about this part. Do you manually input a limit, or trust it to read street signs?

And how often do you look at your speedometer anyhow? I think on the highway I glance at it maybe once every few minutes and otherwise match speed with the other vehicles, and in the city I look more often but mostly just drive at what feels a safe pace (which seems to match the limits, more or less.)


I don't drive a Tesla, but the one time I did drive one it read the street signs. In my car I just set the speed manually.

It's true, I don't look at the speedometer all that often when driving manually, but it's just one less thing to worry about.


I think that's pretty reckless honestly, you put a lot of faith in the system being able to detect lane markers. Other than that I could see how adaptive cruise control can be nice, but it's also not hard to engage cruise control and fine tune your speed to the conditions by tapping up or down on the buttons on the wheel.


I don't put any faith in it at all actually. That's why I pay attention while it drives, looking for novel situations, which would include self driving errors, so I can correct them.


I tried it and found I was spending brainpower fighting the system, sending noise inputs with respect to real-world conditions in order to trick it into not disengaging because it decided I was not "driving" it enough. The hacker in me loved it; the rational person in me said, turn that off before it makes you crash!


For you, definitely. For the people I see reading while the car drives them, not at all.


I agree 100% with jedberg as to my own driving experience with autopilot. Works great, and I still pay complete attention because I don't want to die in a fiery crash. If you're not going to pay attention, driving a dumber car doesn't make it safer.


Survivorship bias?


That would only be relevant if a substantial portion of people who felt poorly about Tesla Autopilot had literally perished from it.


It was said in jest, but to your comment, no need to perish - just not be vocal about the negative feelings.

