I have the same experience. I think if there were more transparency in the data around FSD-related accidents, the conversation would be different. The last evidence we have is Tesla stating that there have been no fatalities in 60 million miles of driving under FSD. Pretty good so far.
There is no such evidence beyond unsupported proclamations by the vendor, who, by the way, has compulsively misrepresented the capabilities of the product for years. The only evidence available that is not tainted by a complete conflict of interest is basically YouTube videos and self-reported information [1] from investors and superfans, which on average show critical driving errors every few minutes. Even those are tainted by a self-interest in over-representing the product.
Tesla’s statements should only be believed if they stop deliberately hiding the data from the CA DMV by declaring that FSD does not technically count as an autonomous driving system and is thus not subject to the mandatory reporting requirements for autonomous systems in development. If they did that, there could actually be sound, independent third-party assessments of their systems. Until then, their claims should be trusted about as much as Ford’s were on the Pinto.
Absolutely need more data, but if a public company makes untruthful claims about the past they are in big trouble. They are mostly free to make claims about the future without much consequence.
This is why startups are allowed to project massive sales next year but Elizabeth Holmes is headed to jail.
If it turns out that FSD has had a fatal crash and Tesla lied about it Musk is headed to jail too.
We don’t live in a “just world”. There’s a reason a lower tier person like Holmes gets some prison and psychos like, say, Trump or Musk barely get a slap on the wrist.
In Tesla's own collection of data on accidents involving Autopilot technologies, it considers Autopilot to have been enabled during a crash if the system was on within several seconds before impact. These systems may hand control back to the driver moments before a crash, but that doesn't mean the data will report the system as disengaged for the incident.
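The counting rule described above can be sketched in a few lines. This is an illustrative toy, not Tesla's actual code: the 5-second window matches Tesla's publicly stated vehicle-safety-report methodology, but the function and parameter names here are invented.

```python
# Illustrative sketch of the crash-attribution rule described above.
# The 5-second window reflects Tesla's stated methodology; everything
# else (names, timestamps) is hypothetical.
AUTOPILOT_WINDOW_S = 5.0

def counted_as_autopilot_crash(disengage_time_s: float, impact_time_s: float) -> bool:
    """Return True if Autopilot was active within the attribution window
    before impact, even if it handed control back moments earlier."""
    return (impact_time_s - disengage_time_s) <= AUTOPILOT_WINDOW_S

# Autopilot disengaged 1.5 s before impact: still counted as engaged.
print(counted_as_autopilot_crash(disengage_time_s=10.0, impact_time_s=11.5))  # True
# Disengaged 30 s before impact: not attributed to Autopilot.
print(counted_as_autopilot_crash(disengage_time_s=0.0, impact_time_s=30.0))   # False
```

The point of the window is exactly what the comment above notes: a last-moment handback to the driver does not, by itself, move a crash out of the Autopilot column.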
Yeah, let's now hear the stats on "fatal accidents within 30s of FSD being enabled".
You really just can't trust Tesla at all, about anything. There's no integrity there. They won't even be honest with the name of their headline feature!
Regardless of whether or not that's true, it's also not the impressive number you think it is. Human-driven cars have a fatality rate of about 1 per 100 million miles travelled, and that includes drunk drivers, inattentive drivers, and all environments, not just the ones people are comfortable turning these features on in.
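A quick back-of-the-envelope check makes this concrete. Using the thread's numbers (the ~1-per-100M-mile human baseline and Tesla's claimed 60M FSD miles), zero fatalities is roughly what you'd expect from human drivers over the same distance:

```python
import math

# Numbers taken from the thread, not from any official dataset.
HUMAN_RATE = 1 / 100_000_000   # fatalities per mile, approximate human baseline
FSD_MILES = 60_000_000         # Tesla's claimed FSD mileage

expected = HUMAN_RATE * FSD_MILES   # expected fatalities at the human rate
p_zero = math.exp(-expected)        # Poisson probability of observing zero

print(f"expected fatalities at human rate: {expected:.2f}")  # 0.60
print(f"P(zero fatalities | human rate):  {p_zero:.2f}")     # 0.55
```

In other words, even an exactly-average human fleet would see zero fatalities over 60M miles more than half the time, so the claim carries little statistical signal either way.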