Assuming you trust Waymo's account, the article details the crashes, saying the following:
>So that’s a total of 34 crashes. I don’t want to make categorical statements about these crashes because in most cases I only have Waymo’s side of the story. But it doesn’t seem like Waymo was at fault in any of them.
Back when I used to pay attention to this stuff, the vast majority of crashes occurred when the Waymo vehicle was rear-ended while stopped at a traffic light.
Considering that there's a >1000:1 ratio of regular cars to Waymo AVs, Waymo would have to be EXTREMELY terrible at driving to move the numbers for the other group meaningfully, and that level of badness would show up in Waymo's own crash data.
There's also historical data. So if you saw a spike in crashes for regular vehicles after Waymo arrived, it would be sus. But there is no such spike, which is further evidence Waymo isn't causing problems for non-AVs.
Of course anything is possible. But it's unlikely.
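To put rough numbers on the dilution point above, here's a quick back-of-the-envelope sketch. Every figure in it is invented for illustration except the 1000:1 ratio from the comment:

```python
# Back-of-the-envelope sketch of the dilution argument.
# All numbers are made up except the ~1000:1 fleet ratio.

human_cars = 1_000_000            # human-driven cars in the region (assumed)
waymo_cars = human_cars // 1000   # ~1000:1 ratio from the comment
human_crash_rate = 0.05           # crashes per car per year (assumed)

baseline = human_cars * human_crash_rate  # crashes involving only human drivers

for waymo_multiplier in (1, 2, 10):  # Waymo as bad as humans, 2x worse, 10x worse
    waymo_crashes = waymo_cars * human_crash_rate * waymo_multiplier
    increase = 100 * waymo_crashes / baseline
    print(f"Waymo {waymo_multiplier}x worse than humans: "
          f"aggregate crashes up {increase:.2f}%")

# Even a hypothetical 10x-worse Waymo fleet adds only ~1% to the region's
# crash count, so region-wide crash totals can't really detect it either way.
```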
I'm confused by your comment. We shouldn't expect Waymo accidents to budge overall accident numbers (which seems to be what you are talking about), but it wouldn't be crazy for Waymo, even if it were much safer overall, to be responsible for a non-trivial share of the accidents it's involved in.
For example, imagine that Waymo is (somehow) far far far superhuman in its ability to avoid other cars doing dumb/bad things. It has a dramatic reduction in overall accidents because it magically can completely get rid of accidents where the other driver is at fault. But, in some very specific circumstances, it can't figure out the proper rate to slow down at intersections, and it consistently rear-ends vehicles in front of it. This specific situation is very rare, so overall accidents are still low (much lower than human drivers), but, in our made up, constructed (and extremely nonsensical) hypothetical, nearly 100% of Waymo accidents are Waymo's fault.
So I don't think it's ridiculous to ask how many of the accidents Waymo has been involved in are the fault of the Waymo vehicle. It turns out that (assuming Waymo's side of the story is to be trusted) almost none of them are, but it didn't have to be that way, even with Waymo accidents being much rarer than human accidents.
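To make the same point with toy numbers (all invented), here's a quick sketch showing that overall crash rate and share-of-fault are independent quantities: a fleet can crash far less often than humans while still being at fault in essentially all of its own crashes:

```python
# Toy numbers only; every figure here is invented for illustration.

MILES = 10_000_000  # miles driven by each fleet (assumed)

# (crashes per million miles, fraction of those crashes that are the fleet's fault)
human_fleet = (5.0, 0.5)   # assumed human baseline
robot_fleet = (1.0, 1.0)   # hypothetical: far fewer crashes, but all its own fault

for name, (rate, fault_share) in [("human", human_fleet), ("robot", robot_fleet)]:
    crashes = rate * MILES / 1_000_000
    at_fault = crashes * fault_share
    print(f"{name}: {crashes:.0f} crashes, {at_fault:.0f} at fault "
          f"({fault_share:.0%} of its own crashes)")

# The hypothetical robot fleet crashes 5x less often than the human fleet,
# yet is at fault in 100% of its crashes -- so "how many of Waymo's crashes
# were Waymo's fault?" is a separate question from "is Waymo safer overall?"
```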
>For example, imagine that Waymo is (somehow) far far far superhuman in its ability to avoid other cars doing dumb/bad things. It has a dramatic reduction in overall accidents because it magically can completely get rid of accidents where the other driver is at fault. But, in some very specific circumstances, it can't figure out the proper rate to slow down at intersections, and it consistently rear-ends vehicles in front of it. This specific situation is very rare, so overall accidents are still low (much lower than human drivers), but, in our made up, constructed (and extremely nonsensical) hypothetical, nearly 100% of Waymo accidents are Waymo's fault.
My understanding is that the reverse is basically what happens in reality. Humans can sense that Waymo cars are "sus" and give them a wide berth, so their "lawful to the point of violating the expectations of other drivers" behavior mostly doesn't cause problems, but when it does, the other guy pays.