> 'Pedestrian' in this context seems pretty misleading
What's misleading? The full quote:
"A red Tesla Model Y then hit the 4Runner and one of the people who exited from it. A 71-year-old woman from Mesa, Arizona, was pronounced dead at the scene."
If you exit a vehicle, and are on foot, you are a pedestrian.
I wouldn't expect FSD's object recognition system to treat a human who has just exited a car differently than a human walking across a crosswalk. A human on foot is a human on foot.
However, from the sound of it, the object recognition system didn't even see the 4Runner, much less a person, so perhaps there's a more fundamental problem with it?
Perhaps lidar or radar, had the car been equipped with them, would have helped the object recognition system see it.
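To sketch the redundancy argument (purely illustrative Python; nothing here reflects Tesla's actual stack):

    from dataclasses import dataclass

    @dataclass
    class Detection:
        sensor: str        # "camera", "radar", or "lidar"
        distance_m: float
        confidence: float

    def obstacle_ahead(detections, min_confidence=0.5):
        # Union-style fusion: trust any single sensor that is
        # reasonably confident. A camera-only stack has no second
        # opinion when vision misses the object entirely.
        return any(d.confidence >= min_confidence for d in detections)

    # Vision misses the 4Runner; radar still returns a strong echo.
    readings = [
        Detection("camera", distance_m=30.0, confidence=0.1),
        Detection("radar", distance_m=29.5, confidence=0.9),
    ]
    assert obstacle_ahead(readings)  # a fused stack would still brake

The point is just that a second modality gives vision a second opinion; a camera-only stack fails silently when vision fails.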
The description has me wondering if this was definitely a case where FSD was being used. There have been other cases in the past where drivers had an accident and claimed they were using autopilot when they actually were not.
I don't know for sure, but I would think the car could detect a collision, and that FSD would stop once one has been detected.
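Naively, I'd picture something like this (a toy sketch; I have no idea how FSD actually handles post-impact behavior, and the threshold is made up):

    class CollisionLatch:
        def __init__(self, threshold_g=4.0):  # illustrative threshold only
            self.threshold_g = threshold_g
            self.crashed = False

        def update(self, accel_g):
            # Latch on a crash-level deceleration spike from the IMU;
            # once latched, the vehicle should stay stopped until a
            # human intervenes.
            if abs(accel_g) >= self.threshold_g:
                self.crashed = True
            return self.crashed

    latch = CollisionLatch()
    for g in [0.3, 0.5, 6.2, 0.1]:  # impact spike at the third sample
        must_stop = latch.update(g)
    assert must_stop  # stays stopped even after the spike passes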
> There have been other cases in the past where drivers had an accident and claimed they were using autopilot when they actually were not.
If that were the case here, there wouldn't be a government probe, right? It would be a normal "multi-car pileup with a fatality" and just get added to the statistics.
With the strong incentive on the part of both the driver and Tesla to lie about this, there should be strong regulations around event data recorders [1] for self-driving systems, and huge penalties for violating them. A search across that site doesn't return a hit for the word "retention", but it's gotta be expressed in some way there.
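For what it's worth, EDRs in regular cars already work roughly as a rolling pre-crash buffer that a trigger event freezes; a retention rule would say how long the frozen snapshot has to survive. A minimal sketch with hypothetical fields (not any actual regulatory schema):

    from collections import deque

    class EventDataRecorder:
        # Rolling buffer of recent telemetry; a trigger (e.g. a
        # crash-level deceleration) freezes a snapshot that must then
        # be retained and made tamper-evident.
        def __init__(self, seconds=5, hz=10):
            self.buffer = deque(maxlen=seconds * hz)
            self.frozen_snapshots = []

        def record(self, sample):
            self.buffer.append(sample)

        def trigger(self):
            # On a crash trigger, preserve the pre-crash window.
            self.frozen_snapshots.append(list(self.buffer))

    edr = EventDataRecorder()
    edr.record({"speed_mph": 62, "fsd_engaged": True, "brake": 0.0})
    edr.trigger()  # crash: the last few seconds are now preserved

Crucially, whether FSD was engaged lives inside the frozen record, so "was it actually on?" isn't left to the driver's or the manufacturer's word.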
If it hit the vehicle and then hit one of the people who had exited it with enough force to cause a fatality, it sounds like it might not have applied any braking.
Of course, that depends on the speed it was traveling at to begin with.
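Back-of-the-envelope, assuming full braking at roughly 0.8 g on dry pavement:

    G = 9.81  # m/s^2

    def stopping_distance_m(speed_mph, decel_g=0.8):
        # v^2 / (2a): distance to stop from speed_mph under constant
        # deceleration. Ignores driver/system reaction time.
        v = speed_mph * 0.44704  # mph -> m/s
        return v**2 / (2 * decel_g * G)

    for mph in (25, 45, 65):
        print(mph, "mph ->", round(stopping_distance_m(mph), 1), "m")
    # 25 mph -> ~8 m, 45 mph -> ~26 m, 65 mph -> ~54 m

So "no braking" versus "braking but still too fast" really does hinge on the initial speed, which is again something the recorder data should settle.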