I think you’re omitting the Xeon’s increased stability versus consumer-grade architectures. Many Mac Pros run non-stop; they need to be more dependable than other computers.
Does this really matter in 2013? Serious question. I could see ECC having been useful in 2000 in the midst of the megahertz race, but I'm not too convinced about today...
The probability of a bit error on normal RAM is quite high [1]. The more RAM you add, the more likely you are to see corruption. Not that ECC fixes everything - I still see uncorrected ECC errors on the older HPC nodes on our grid.
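For what it's worth, the scaling argument is easy to sketch numerically. This is a toy calculation, not measured data: the per-bit upset rate below (`ERRORS_PER_BIT_HOUR`) is an assumed placeholder, since published soft-error rates vary by orders of magnitude, but the point is that the expected number of flips grows linearly with the amount of RAM.

```python
import math

# Assumed per-bit upset rate (placeholder, not a measured figure);
# independent upsets are also an assumption.
ERRORS_PER_BIT_HOUR = 1e-17

def prob_at_least_one_flip(ram_gib: float, hours: float) -> float:
    """Probability of at least one bit flip in `ram_gib` GiB over `hours` hours."""
    bits = ram_gib * 2**30 * 8
    expected_flips = ERRORS_PER_BIT_HOUR * bits * hours
    # P(>=1 flip) = 1 - P(no flips) under a Poisson approximation.
    return 1 - math.exp(-expected_flips)

# Doubling the RAM roughly doubles the expected flips over a year.
for gib in (8, 16, 64):
    print(f"{gib} GiB, 1 year: {prob_at_least_one_flip(gib, 24 * 365):.4f}")
```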