Between anti-Musk sentiment, competition in self-driving, and the proven track record of lidar, I think we'll start seeing jurisdictions from Europe to New York and California banning camera-only self-driving beyond Level 3.
Nah, you don't need to ban anything. Just enforce a rule that if a company sells self-driving, it also takes on full liability for any damage the system causes.
It's not a conundrum so much as an implementation detail. We've decided to hold Waymo accountable. We're just ticking the boxes to actually do that (none of which involve confusion about Waymo being responsible).
Personally, I'd argue that if the AI killed someone due to being incompetent (as in, a human in a fit state to drive would not have made this mistake), the punishment should go to the corporation that signed off on the AI passing all relevant tests.
The nature of the punishment does not necessarily follow the same rules as for human incompetence, e.g. if the error occurs due to some surprising combination of circumstances that no reasonable tester would have thought to test. I can't really give an example of that, because anything I can think of is absolutely something a reasonable tester would have thought to test. But for the sake of talking about it without taking this too seriously: consider a celebrity crossing a road while a large poster of their own face is right behind them.
Let me reiterate my original caution: human drivers are really bad; more than 40,000 people die in car crashes every year! If self-driving cars make mistakes in some cases that humans would not, but overall would cause only 30,000 deaths per year, then I want self-driving required. Thus I want liability to reflect not that perfection is required, but that they are better than humans.
Don't get me wrong, perfection should be the long-term goal. However, I will settle for less than perfection today so long as it is better.
Though "better" is itself hard to figure out: drunk (or otherwise impaired) drivers are a significant factor in car deaths, as is bad weather, where self-driving currently doesn't operate at all. Statistics need to show that self-driving cars are better than non-impaired drivers in all situations where humans drive before they can claim to be better. (I know some data is collected, but so far I haven't seen any independent analysis. The potentially biased analysis looks good, though it is again missing all weather conditions.)
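To make that concrete: a fair comparison would stratify fatality rates by driving condition instead of pooling all miles, since a pooled average can look good simply because self-driving avoids the hardest conditions. Here's a minimal sketch of the shape of that comparison; every figure and condition bucket below is invented purely for illustration:

```python
# Hypothetical illustration: compare fatality rates per 100M miles,
# stratified by driving condition, rather than pooling all miles.
# Every number below is made up; only the structure matters.

conditions = {
    # condition: (human_deaths, human_miles, av_deaths, av_miles)
    "clear_day":  (900, 80e9, 8, 1.0e9),
    "rain_night": (600, 20e9, 5, 0.1e9),  # AVs log far fewer of these miles
    "snow":       (300, 10e9, 0, 0.0),    # AVs may not operate here at all
}

def rate_per_100m_miles(deaths, miles):
    """Fatalities per 100 million miles, or None if there is no exposure."""
    return None if miles == 0 else deaths / miles * 1e8

for cond, (hd, hm, ad, am) in conditions.items():
    human_rate = rate_per_100m_miles(hd, hm)
    av_rate = rate_per_100m_miles(ad, am)
    if av_rate is None:
        verdict = "no AV exposure: cannot claim 'better' here"
    elif av_rate < human_rate:
        verdict = "AV better in this stratum"
    else:
        verdict = "human better in this stratum"
    print(f"{cond:10s} human={human_rate:.2f} av={av_rate} -> {verdict}")
```

The snow row is the point: with zero autonomous miles in a stratum, no claim can be made there at all, which is exactly the gap in the potentially biased analyses mentioned above.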
The AI's benefits should be irrefutable, but this isn't as simple as "at least 10x better than human drivers" or any fixed factor. It's that whatever mistakes it does make, if you show the video of a crash to the general public, the public generally agrees they'd have also crashed under those conditions.
Right now… Tesla likes to show off stats that suggest accidents go down while their software is active, but then we see videos like this, and go "no sane human would ever do this", and it does not make people feel comfortable with the tech: https://electrek.co/2025/05/23/tesla-full-self-driving-veers...
For every single way the human vision system fails, if an AI also makes that mistake, it won't get blamed for it. But if it solves every single one of the perception errors we're vulnerable to (what colour is that dress, is that a duck or a rabbit, is that an old woman close up facing us or a young woman at a distance looking away from us, etc.) while also bringing in a few new failure modes we don't have, it won't get trusted.
By this logic, we should also create a rule for regular, non-self-driving cars that says: if you have a car accident that kills someone, all your wealth is taken away and given to the victim's family. If we had a rule like this, then "you'd probably see much safer driving". Are you willing to drive under those circumstances? I am sure you will say yes, but it does not make your suggestion any less ridiculous.
I can think of one example where something similar works: the requirements insurance companies place on airline pilots are considerably tougher than the government ones, because the insurers are on the hook for ~$200m if they crash.
A big reason car companies don't worry much about killing pedestrians at the moment is that it costs them ~$0.
About half our road fatalities are pedestrians. About 80% of those are intoxicated with alcohol. When you're driving at 40mph, at night, and some drunk guy chooses to cross the road, no amount of safety features or liabilities can save him.
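(Taking those two figures at face value, they compound to roughly 0.5 × 0.8 = 40% of all road fatalities being intoxicated pedestrians, assuming the 80% applies to the pedestrian half specifically.)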
Sure, cars can be safer for light collisions with pedestrians where the car is going slowly. Especially in the US where half the cars have a very high hood. But where I live the problem is not safer cars, it's drunk pedestrians.
I wonder how a Waymo would do with your drunks? Really, the answer there is probably a different road layout, so the drinking is kept separate from the traffic. I live near Soho in London, which is full of drunk people in the streets, but most traffic there is blocked off or doing 10 mph.
I've been paying more attention to Waymos recently… and noticing that they stop to let people cross whom I didn't even see first.
And sometimes at places that aren't even a crosswalk.
I'm in DTLA frequently, and I'm almost developing a secondary instinct to cover my brake and take an extra look around when a Waymo stops in a street.
Because it may be dropping off or picking up a rider, or it saw something or someone I didn't. This just happened Saturday, in fact. I saw it make an abrupt stop when I was yielding to it at a "T" intersection and expected it to have the right of way and keep going. I didn't proceed until I could figure out WHY it had just stopped, like "okay, WHERE'S the passenger?"
And then five or so people started running across the street in front of it, people I would not have seen if that Waymo hadn't been there and I had been clear to turn left.
As an added bonus it stayed stopped after they all crossed and I decided to be a jerk and turn left in front of it. It stayed stopped for me too. There’s no driver in it. It ain’t mad. XD
I have a good eye for spotting Uber drivers who are about to load or unload too, especially if they have some common sense and are trying to line up so their passenger can get on or off curbside. A Waymo is just… way more immediately identifiable, so I can react that much faster to it, or just think: alright, I'll take a cue from it; it's usually right.
And hell, even if it's wrong, maybe this isn't a good time to pull out in front of it anyway!
We cannot even properly ban asbestos; expecting people to die first is just having a realistic perspective on how the US government works WRT regulations.
That's a legal non-starter for all car companies. They would be made liable for every car incident where self-driving vehicles were spotted in close vicinity, regardless of whether the suit is legit. A complete nightmare, and totally unrelated to the tech. Makers would spend more time and tech covering their asses in court than building safe cars.