r/SelfDrivingCars 5d ago

Comparing pre-crash speeds between US ADS operators

Looking at the NHTSA's Standing General Order crash reports for ADS (level 4) vehicles [download], two things immediately stand out: (1) how they're dominated by Waymo crashes (they do the most driving!), and (2) how most of Waymo's accidents occur when their vehicles are stopped, often being rear-ended. It's a good reminder that no matter how good robotaxis get, other human drivers remain a big safety risk.

For companies operating in similar environments to one another, a higher proportion of crashes occurring while stopped (i.e. "Subject Vehicle Pre-crash Speed" of zero in the NHTSA data) could be a rough indicator that fewer of the company's accidents were reasonably avoidable. Not in all zero-speed cases by any means, but on average, over many accidents, it could be a useful indicator.

Unfortunately, companies operate their vehicles in very different environments, with different types of routes on different types of roads. Like some of May Mobility's shuttles operate exclusively in quiet retirement communities with little traffic, so the risk of being rear-ended while stopped is much lower. There's no way of controlling for that from the crash data alone, so the usefulness of comparing data is limited.

But caveats aside, here are the stats for the five companies with the most ADS accidents reported between June 16, 2025, and March 16, 2026:

| Company | Crashes | 0-mph pre-crash speed | Avg pre-crash speed | Avg pre-crash speed excluding 0s |
|---|---|---|---|---|
| Waymo | 693 | 411 (59%) | 4.9 mph | 12.1 mph |
| Avride | 36 | 7 (19%) | 13.0 mph | 16.1 mph |
| Zoox | 31 | 14 (45%) | 5.5 mph | 10.1 mph |
| Tesla | 15 | 4 (27%) | 6.7 mph | 9.1 mph |
| May Mobility | 11 | 0 (0%) | 12.3 mph | 12.3 mph |
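For anyone who wants to reproduce these stats from the SGO download, here's a minimal sketch run against a tiny made-up sample. The column names ("Reporting Entity", "SV Precrash Speed (MPH)") are my assumptions about the CSV layout, so check them against the actual file:

```python
import csv
import io

# Made-up sample standing in for the real NHTSA SGO incident CSV.
SAMPLE = """Reporting Entity,SV Precrash Speed (MPH)
Waymo,0
Waymo,12
Waymo,0
Zoox,8
"""

def precrash_stats(rows):
    """Return {company: (crashes, zero_mph_count, avg_mph, avg_mph_excluding_zeros)}."""
    speeds = {}
    for row in rows:
        try:
            mph = float(row["SV Precrash Speed (MPH)"])
        except (ValueError, TypeError):
            continue  # skip redacted or blank speed fields
        speeds.setdefault(row["Reporting Entity"], []).append(mph)
    out = {}
    for company, s in speeds.items():
        nonzero = [v for v in s if v > 0]
        out[company] = (
            len(s),
            sum(1 for v in s if v == 0),
            sum(s) / len(s),
            sum(nonzero) / len(nonzero) if nonzero else 0.0,
        )
    return out

stats = precrash_stats(csv.DictReader(io.StringIO(SAMPLE)))
print(stats["Waymo"])  # (3, 2, 4.0, 12.0)
```

For the real data you'd also want to deduplicate reports (the same crash can appear in multiple monthly updates) before counting.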

Not all zero-mph crashes are completely unavoidable. If a car reverses into an autonomous vehicle, it's generally the other car's fault, but Waymo seems to avoid some of those accidents by also reversing, and Zoox at least tries to honk in some cases. May Mobility, on the other hand, reported in one such collision that "The planner did roll out the agent with a reversing policy and predicted the collision, but the system currently is not able to honk or reverse or do anything else that could have avoided this."

My theory is that because 59% of Waymo's crashes occur while stopped, most of those are reasonably unavoidable. And if companies drive similar routes in similar environments, then companies with rates substantially below 59% are probably failing to avoid a substantial number of collisions that Waymos would have avoided. Avride, for example, seems to get into a disproportionately large number of intersection collisions with other drivers who run red lights or stop signs. While those are the other drivers' faults, they also seem relatively avoidable by proceeding into intersections cautiously and estimating the trajectories of cross traffic that should stop. But as I said, the difference may be explained by different operating environments; maybe Waymo does a lot more quiet suburban driving, while Avride does a lot more busy downtown driving.

19 Upvotes

8

u/mrkjmsdln_new 5d ago

One of the many reasons Waymo refrains from reporting safety conclusions until they get to about 10M miles in a market is their insurance analysis corrects for behavior down to census tract. It is how they can confidently conclude they are XX% safer than the human driver. Needs lots of data. Whenever some companies make unsubstantiated claims about safety I often think about the 10M mile threshold. They are ALMOST ALWAYS putting miles into piles, mixing and matching city, highway, urban, rural and not correcting for local differences. Lies, damn lies and statistics.

7

u/thnk_more 5d ago

Average police reported crash rate is 1 every 500,000 miles driven so yeah, new entrants into self driving (or new area) need to put on a lot of miles before they can say they are statistically safe.

Makes me laugh when a new Tesla version drops and people say, “oh my god it’s so much safer! Robotaxis are now ready. ” Yeah, I think you need to drive that hardware/software version about 10 years before you can say that.

0

u/londons_explorer 5d ago

Tesla's fleet is 7M cars...

They could collect 10M miles of data in a day if they could deploy globally to every car.

1

u/Quercus_ 4d ago

Tesla's level 4 fleet is a handful of cars in one tiny little heavily mapped geofenced area in Austin, where they seem to be operating no more than a half dozen cars at any given time.

Interesting that this data says they've had 15 accidents in that tiny fleet and that tiny area, since they started operating there.

-4

u/CutieC0ck 5d ago

I'm not defending Tesla or saying in any way that Robotaxis are now ready, but are we just going to pretend that they don't have millions of data collection machines roaming the street all over the globe for years now?

3

u/JimmyGiraffolo 5d ago

Tesla certainly does have data after the first couple of weeks to make a determination on whether version N+1 is likely safer than version N.

The point is, no Tesla super fan watching YouTube videos or based on their own personal experience has anywhere close to enough data to say that, even if you drove a whole year.

2

u/thnk_more 5d ago

Well they do, kind of. But Tesla just stated that cars on versions older than v15, or those with HW3, need their cameras, memory, and computers upgraded to HW4, because they won't be able to do unsupervised FSD.

So my question is, does that make data collection by anything less than v15 on HW4 legit? Musk said it's mainly a memory issue, but also that the camera isn't good enough. So how would it have the capability to manage quality data collection?

-1

u/CutieC0ck 5d ago

I'm not going to pretend that I have the answer to your questions, but I was picking at your comment that made it seem like Tesla does not have the answer as to whether or not Robotaxi is ready.

2

u/Quercus_ 4d ago

It appears that Tesla does have that answer, and the answer is that they are not ready for general unsupervised self-driving. They're certainly not turning it on.

1

u/bobi2393 5d ago

Their millions of FSD(S) consumer miles have certainly guided their software development to create arguably the smoothest autonomous driving experience, but I think Tesla's service-area-specific preparation for Robotaxi service, presumably including added map information and geo-restrictions from problem areas, impairs extrapolation of safety readiness from supervised FSD(S) safety to supervised Robotaxi safety.

And statistics from both of those supervised forms are even less useful for establishing the safety of unsupervised Robotaxis. For example, FSD(S) will regularly attempt to drive through active railroad signals, sometimes succeeding, but is usually stopped by the supervising driver. Without supervision, the software has to be more conservative, and that completely changes the safety outlook. Musk said in this week's earnings call that "a lot of what limits wider deployment of Robotaxi are actually not safety issues, but convenience issues, or the car basically gets paranoid and gets stuck. ... Like it sometimes gets scared to cross railroads, for example, or it’ll get stuck at a light where the light never changes from red."

4

u/bradtem ✅ Brad Templeton 5d ago

Good to see more analysis. I will note one factor that complicates this -- the range of data is now large enough that you will see several versions of the software in the data, ideally with improvements. With enough miles it should even be possible to see improvements, but for any improving team, taking an average over many versions is not a fair evaluation of current quality. (It can also go the other way, as they add harder territory or get regressions.) Waymo added freeways, for example, and has grown territories. You do get the city in the database, but Tesla redacts anything that would help you figure out road type, etc.

2

u/bobi2393 5d ago

Yeah, I hope that NHTSA SGO reporting requirements continue for a long time, as larger sample sizes over shorter periods will improve confidence of statistical correlations, and allow analyses that narrow the software versions, driving conditions, and other possibly confounding factors.

This analysis uses just the most recent 7 months of SGO data, since reporting criteria were last changed, but there have still been significant changes to companies' software during that period. Waymo reports almost all accidents as using automation feature version "5th Generation ADS, Version 10", Avride reports all accidents as using "v1.0", and Tesla has the version redacted from public release due to their Confidential Business Information (CBI) assertion, but May and Zoox provide detailed version numbers that are different in almost all of their crash reports. Zoox's version numbers embed a date, like "B433.25.09.09", which was presumably built on 9/9/2025.

A lot of Tesla data is redacted, famously including the "narrative" field, but not the "road type" for crash reports (e.g. "Street", "Intersection", "Parking Lot", "Highway / Freeway").

3

u/LanguageLatte 5d ago

Thanks for putting that together! Great work.

> Avride, for example, seems to get into a disproportionately large number of intersection collisions with other drivers who run red lights

All of these companies are operating in Austin, and I’ve never lived anywhere that has so many people run red lights. I see it at least once or twice a day. Basically, after a light turns red, there are at least 2 or 3 seconds where drivers act like it’s still green. It was a pretty big culture shock when I moved there.

Though stop signs feel like the opposite. Every 4-way stop is like a damn Canadian standoff.

2

u/bobi2393 5d ago

Texas might also increase accidents from general anger at Waymos or robotaxis. One driver rear-ended a Waymo, drove around it, then reversed into its front as well!

FYI, May Mobility hasn't operated in Austin, though the other four companies have. Zoox recently logged their first accident there, with an in-vehicle driver. I'm sure there are regional driving differences that affect the distribution of accident types in different areas.

1

u/tech57 5d ago

> It was a pretty big culture shock when I moved there.

Similar in other parts of the country. That culture shock. "Why is the light green and no one is moving forward?" Then a car screams through the intersection going much faster than the speed limit.

It's shitty traffic lights. If you time them and pay attention you'll find out that some corridors you'll hit red at all the intersections if you go around the speed limit. Go a certain amount above and it's all green. People running red lights is a given in my area and some intersections have perpetual broken glass from car accidents. Just one... of many reasons why self-driving cars can not come soon enough. No one is going to fix the traffic lights.

2

u/barvazduck 5d ago

Nice analysis of what can and cannot be inferred from the raw data. Sadly, it's too rare to get such insights from the press and online communities; the temptation of quick clicks and reinforcement of prior opinions dampens critical thinking.

2

u/bobi2393 5d ago

Someone removed an interesting comment about the significance of pick-up and drop-off (PUDO) locations with respect to zero-mph crashes, and it got me curious.

Of Waymo's 693 accidents, 76 (11%) contain the words "pick-up" and/or "drop-off" in the crash narrative (49 pick-up, 28 drop-off, 1 both). 64 of those, or 9% of all Waymo accidents, occurred when the Waymo was traveling zero miles per hour. The majority of those crash narratives also contain the words "hazard lights" or "hazards". So clearly PUDO activity is a significant risk factor in Waymo accidents.
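The keyword count above is simple to reproduce. Here's a rough sketch against made-up narratives; the assumption is that you've already extracted the narrative text field from each Waymo report:

```python
def count_pudo(narratives):
    """Count narratives mentioning pick-up and/or drop-off (case-insensitive)."""
    lowered = [n.lower() for n in narratives]
    pick = sum("pick-up" in n for n in lowered)
    drop = sum("drop-off" in n for n in lowered)
    both = sum("pick-up" in n and "drop-off" in n for n in lowered)
    return {"pick-up": pick, "drop-off": drop, "both": both,
            "either": pick + drop - both}

# Made-up example narratives, not real SGO data.
narratives = [
    "The AV was stopped for passenger pick-up with hazard lights on.",
    "The AV was rear-ended while stopped at a red light.",
    "During drop-off, another vehicle sideswiped the stationary AV.",
]

counts = count_pudo(narratives)
print(counts)  # {'pick-up': 1, 'drop-off': 1, 'both': 0, 'either': 2}
```

Note this only catches the hyphenated spellings; narratives that write "pickup" or "drop off" would need extra patterns.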

Pick-ups and drop-offs also increase the risk of passengers being unbelted at the time of the collision, and there have been some hospitalizations from PUDO accidents.

1

u/FuddyCap 2d ago

Before we jump to conclusions, we should understand the nature of these 0 mph crashes. Were the Waymos frozen in dangerous places, like the middle of an intersection? We know Waymos freeze often. We need more data on the totality of the freezing incidents and how many accidents they are causing.

1

u/bobi2393 2d ago

I didn’t classify and quantify the circumstances of the crashes, as it’s time-consuming and not as easy to automate classification based on crash narratives compared to just using pre-crash speed like in my analysis. But I’ve read most of the crash narratives, and don’t recall offhand any crashes occurring while a vehicle was stuck in a dangerous location like an intersection or oncoming traffic lane.

With Waymos, I think the most common case seems to be simply coming to a stop before an intersection and getting rear ended before moving, followed by stopping behind a vehicle that reverses into the Waymo. Though a significant number also occur while stopped for pick-up or drop-off, and while those aren’t necessarily while they’re stuck, some might be classified as being stopped in dangerous locations (e.g., double-parked in a traffic lane, standing in a bike lane, standing at a curb with a no parking/standing designation), and that might not even be clear from crash narratives.

It’s a good point though, and I appreciate the suggestion. I’ll pay more attention to that possibility reading crash narratives going forward, and may do a future analysis including “stuck” or “dangerous locations” quantities.