The technology is not the only problem. The lawyers representing every dead body, and there will be dead bodies, are what will sink self-driving cars. Even if everyone is 95% safer with self-driving cars, the cases of those who are killed by one (combined with a public that is easily swayed by non-objective arguments) will be hard to dismiss.
We could say that now about automobiles. If I were to get into an accident and then have lawyers for the other vehicle, plus everybody stuck in traffic behind me also sending their lawyers after me, I’d never drive.
Insurance solved that problem, both by eliminating the possibility of a catastrophic financial loss, and by creating a buffer between me and all those lawyers.
I predict that insurance will solve the lawyer problem for self-driving cars as well. At some point, it will cost me $5,000 a year to drive my own car, and $500 to let it do all the driving.
And on top of all that, if my car drives itself into an accident, the lawyers will talk to my insurance company, not to me. I see the insurance companies as the enabler of this tech. And they will want to enable it; it will put them in control of the market.
Indeed we can, which is why some cities around the world have gone car-free, and more and more opposition is mounting in the face of traffic deaths, pollution, and so on. Even in the US, perhaps the most car-dominated country in the world, there is a political revival of talk about high-speed rail and alternative forms of transportation.
The other important difference is that the step from having no cars at all to having cars was one of the largest leaps in mobility in human history. Self-driving cars are nice, but not that much of a leap, and they have much more ambiguous implications for the job market. They will face significantly larger hurdles with significantly less payoff in sight.
Pontevedra in Spain, for example; Venice and Oslo are also car-free to a large degree in the city proper. Yes, cars are banned from the city; here's an article about the consequences.
That works both ways, though. Once self-driving is proven, the higher accident rates of human driving will become a greater liability. "Why were you driving yourself at night instead of engaging autopilot?!?"
You are correct; the public won’t accept the technology until it’s at least as safe statistically as air travel (and even then, there’ll be pushback in response to specific inevitable tragedies).
However, I think the profit incentives of the trucking industry will manage to carve out some regulatory exceptions; something like “freight trucks can self-drive between 11p and 5a on these specific Nebraska highways, with warning signs on both roads and vehicles”. This sort of lobbying will be the thin end of the wedge for both iterating on the tech and normalizing its acceptance.
Agreed. If a self-driving car hits me and breaks my neck, whom do I sue to pay for my bills and care? The driver? The car manufacturer? Or the self-driving software company? Right now, with a driver, it's clear.
The world is not the USA. The legal system that exists in the US is different from those elsewhere in the world.
With a self-driving car you know exactly what happened in an accident, since video and telemetry are recorded. This is a tremendous advantage over having to reconstruct events without that data.
On the contrary, I expect legislation forcing every car to include telemetry, much as China is forcing every car to be connected.
That accident data is evidence: not opinion, not belief, not prejudice.
The usefulness of this has already been proven with airplanes.
How comforting it will be when, in the age of hyper-realistic CGI, camera data and “foolproof” telemetry are what protect us from the makers of killer robot cars in court. :)
Most drivers on the road are effectively subsidized by bankruptcy protection, because most could not possibly cover the liability they expose themselves to by driving. This "subsidy" is far less valuable to a self-driving car manufacturer than to individual drivers.
I mean, sure, you can come up with scenarios where the liability exceeds the insurance coverage, but I haven't heard of many of those. Anyway, the money comes from somewhere: if bankruptcy protects drivers, it also exposes them to the risk of suffering damage that is never compensated.
Regardless, expecting a legal loophole to preserve the status quo indefinitely seems unrealistic and inherently unstable. If it actually held up something that could massively benefit society (both economically and in lives saved), we would simply legislate liability limits.
The liability you can incur while driving is almost arbitrarily high. Individual drivers rely on bankruptcy protection to cover these rare scenarios, or simply don't think about or plan for them at all.
> but I haven't heard of many of those.
How many do you think it would take to put a self-driving car manufacturer out of business?
I'm not saying the status quo is a great situation, or that this is a good or bad argument for or against self-driving cars. Only that it describes the current situation, and why legal issues might be a much bigger problem for self-driving car manufacturers than for individual drivers.
Because the deaths will be network-level effects that a person is helpless to mitigate through behavior. There's no sense of being a careful or responsible driver with a self-driving car; it's all a matter of whether the algorithm or software is correct.
It's a different type of potential error, and a much scarier one. As a pedestrian, I can mitigate the risk from human drivers by walking carefully, but I cannot mitigate an AI that thinks my shirt makes me invisible because its learning was deceived by a pattern.
Because a jury will hand out a huge multi-million dollar award against a self-driving car company when its car crashes causing a death, while when a human causes a death it is usually just an "accident" or a much smaller settlement based on the person's insurance coverage.
Because the autonomous car is a systemic issue that could affect every single car on the road running that manufacturer's software, and the "driver" of the car has no way to cause, stop, or mitigate the risk. He puts his life in the company's hands each time he drives the vehicle.
I'm not sure why people even embrace self-driving cars. By now we know that centralization, lack of maintenance, and fragmentation in software are serious risks, and that software can expand the attack surface on people as well as provide benefits. I'm not sure the risks are worth the modest efficiency gain, and that's not even getting into existential risks like external parties being able to control when you drive, or attacks on the networks or the technology itself.
Even if the death toll were 1/100th of today's, the few deaths would be paraded as examples. There will be outrage. There will be lawsuits. You are correct: it is not logical, but logic does not always prevail.