I don't see why Tesla would deserve the benefit of the doubt here. Since we cannot know how well the actual robotaxi software will work, I think it is fair to extrapolate from the parts we can observe.
Re: extrapolation: I agree with that, but remember there's sampling error. The crashes/failures go viral, while the lives saved get zero exposure or headlines. I don't think that means you can just ignore issues like this, but it does mean it's sensible to augment the data point of this video by imagining the scenarios where the self-driving car performs more safely than the average human driver.
I absolutely do think that self-driving cars will save many lives in the long run. But I also think it is entirely fair to focus on the big, visible mistakes right now.
This is a major failure: ignoring a stop sign and a stopped school bus are critical mistakes. If you can't manage those, you're not ready to be on the road without a safety driver yet. There was nothing particularly difficult about this situation; these are the basics you must handle reliably before we even get to all the trickier situations these cars will encounter in the real world at scale.
I agree it's a major mistake and should get a lot of focus from the FSD team. I'm just unsure whether that directly translates to prohibiting a robotaxi rollout (though I'm open to the possibility that it should).
I guess the thing I'm trying to reconcile is that even very safe drivers make critical mistakes, just extremely rarely, so the threshold at which FSD is safer than even the top 10% of human drivers likely still includes some nonzero level of critical mistakes. Right now there are several people mining FSD for any place it makes critical mistakes, and those clips are well publicised, so I think we get an inflated sense of how common they are. This is speculation, but if true it leaves open the possibility that FSD is significantly safer than the median driver while videos like this still proliferate.
I do wish Tesla released all stats for interventions/near misses/crashes so we could have a better and non-speculative discussion about this!
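To put rough numbers on that point, here's a minimal back-of-envelope sketch in Python. Every rate and mileage figure is a made-up assumption for illustration, not real Tesla or NHTSA data:

```python
# Rough back-of-envelope: even a system that is safer per mile than a careful
# human driver still produces a steady stream of recorded "critical mistake"
# clips once it drives enough miles. All numbers are illustrative assumptions,
# not real Tesla or NHTSA statistics.

human_critical_rate = 1 / 1_000_000   # assumed: 1 critical mistake per 1M miles for a careful driver
fsd_critical_rate = 1 / 2_000_000     # assumed: FSD hypothetically 2x safer per mile
fleet_miles_per_month = 50_000_000    # assumed robotaxi fleet mileage at scale

expected_if_humans_drove = human_critical_rate * fleet_miles_per_month
expected_for_fsd_fleet = fsd_critical_rate * fleet_miles_per_month

print(f"Expected critical mistakes/month if humans drove those miles: {expected_if_humans_drove:.0f}")
print(f"Expected critical mistakes/month for the hypothetically safer FSD fleet: {expected_for_fsd_fleet:.0f}")
# 50 vs 25: dozens of viral-worthy clips per month are compatible with being
# safer per mile than a careful human; only the underlying rates settle it.
```

The point of the sketch isn't the specific numbers, it's that the existence of clips alone can't distinguish those two worlds; published rates could.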
Caveat/preface to prevent trolls: FSD is a sham and money grab at best, death trap at worst, etc.
But I've read through your chain of replies to OP, and maybe I can help with my POV.
OP is replying in good faith, essentially saying: "this sampled incident is out of scope of production testing/cars for several reasons, all of which greatly skew the testing, and it comes from a known bad-actor source."
And you reply with "zero systemic, reproducible mistakes is the only acceptable criterion."
Well then, you should know that this is the current situation. In Tesla's own testing, they achieve this. The "test" in this article, as the OP is pointing out, is not a standardized test run by Tesla on current platforms. So be careful with your ultimatums, or you might give the corporation a green light to say "look! we tested it!"
I am not a Tesla fan. However, I am also aware that yesterday, thousands of people across the world were mowed down by human operators.
If I put out a test video showing a human running over another human with minimal circumstances met (e.g. rain, distraction, worn tires, traffic density), would you call for a halt to all human driving? Of course not; you'd investigate the root cause, which most of the time is distracted or impaired driving.
Tesla would like to replace that with FSD. (And make a boatload of money as a result)
My point is that we therefore can (and should!) hold Tesla to higher standards.
'Better than human' as a bar invites a conflict of interest, because at some point Tesla is weighing {safety} vs {profit}, given that increasing safety costs money.
If we don't apply a strong external bias towards safety, we reach a point where Tesla says 'We've reached parity with human drivers, so we're not investing more money in fixing FSD glitches.'
And furthermore, without mandated reporting (of the kind Musk just got relaxed), we won't know what's actually happening at scale.
No! Stopping at a stop sign is such a basic driving standard that ignoring one is an automatic disqualification.
A driver who misses a stop sign would not have my kids in their car.
They could be the safest driver on the racetrack; it does not matter at that point.
In the video (starting at ~13 seconds), the Tesla is at least 16 and probably 20 car lengths from the back of the bus, with the bus's red flashing lights on the entire time.
If the Tesla can't stop for the bus (not the kid) in 12 car lengths, that's not p-hacking; that's Tesla FSD being both unlawful and obviously unsafe.
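For scale, here's a minimal stopping-distance sketch in Python; the car length, speed, reaction time, and braking rate are my own assumptions, not measurements from the video:

```python
# Rough stopping-distance check. Every input below is an assumption for
# illustration, not a measurement taken from the video.

car_length_m = 4.5       # assumed typical car length
gap_car_lengths = 12     # the conservative figure cited above
speed_mps = 11.2         # assumed ~25 mph residential/school-zone speed
reaction_time_s = 0.5    # assumed perception + decision latency
decel_mps2 = 3.0         # gentle braking; hard braking is roughly 7+ m/s^2

gap_m = gap_car_lengths * car_length_m
stopping_m = speed_mps * reaction_time_s + speed_mps**2 / (2 * decel_mps2)

print(f"Available gap:  {gap_m:.0f} m")
print(f"Needed to stop: {stopping_m:.0f} m even with gentle braking")
# ~54 m available vs ~27 m needed: plenty of room to stop for the bus.
```

Under those assumptions there's roughly twice the distance needed to come to a smooth stop, which is why the miss reads as a failure of basics rather than a hard edge case.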