The description has me wondering if this was definitely a case where FSD was being used. There have been other cases in the past where drivers had an accident and claimed they were using autopilot when they actually were not.
I don't know for sure, but I would think the car could detect a collision, and that FSD would stop once a collision has been detected.
> There have been other cases in the past where drivers had an accident and claimed they were using autopilot when they actually were not.
If that were the case here, there wouldn't be a government probe, right? It would be a normal "multi car pileup with a fatality" and added to statistics.
With the strong incentive on the part of both the driver and Tesla to lie about this, there should be strong regulations around event data recorders [1] for self driving systems, and huge penalties for violating them. A search across that site doesn't return a hit for the word "retention", but it's gotta be expressed in some way there.
If it hit the vehicle and then hit one of the people who had exited the vehicle with enough force for it to result in a fatality, it sounds like it might not have applied any braking.
Of course, that depends on the speed it was traveling at to begin with.