Pretty much the same with google.de/google.de (36%), my blog (52%), and xkcd.com (44%). That killed it for me, actually.
Another observation is that the right window is always faster at my end. Can anybody else confirm this?
I am only measuring "onDOMContentLoaded" (or jQuery's cross-browser equivalent). It's certainly possible for pages to load objects after that using JavaScript, which "cheats" a little for the purposes of this comparison.
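A minimal sketch of that kind of measurement, assuming a browser with the Navigation Timing API; the helper name `domReadyMillis` is my own, not from the site's code:

```javascript
// Hypothetical helper: milliseconds from the start of navigation until the
// DOMContentLoaded event began. `timing` is a PerformanceTiming-like object
// (in a browser, you'd pass performance.timing).
function domReadyMillis(timing) {
  return timing.domContentLoadedEventStart - timing.navigationStart;
}

// In a browser you might log it like this:
// document.addEventListener('DOMContentLoaded', function () {
//   console.log(domReadyMillis(performance.timing) + ' ms to DOMContentLoaded');
// });
```

As the comment notes, anything fetched afterward via script won't show up in this number.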
What really matters in practice is "time to first interaction" which is when the user thinks your site is useful. Next up on my feature list is having a button to indicate which pages seemed to load faster to you. The idea is you can send a link out to your friends and get their impressions, which is more valuable than the numbers you can get with an automated tool. :)
You might be right. I considered that, but it doesn't have as much of an impact on the user, which is really important for getting the word out about performance (the reason the site exists in the first place).
It probably doesn't matter too much as long as you have a ~3Mbps connection or higher (see Mike Belshe's "Effective Bandwidth of HTTP" graph at http://www.belshe.com/2010/05/24/more-bandwidth-doesnt-matte...), because the TCP transmission rate can't ramp up high enough on each object to saturate the connection. It's possible, however, that the objects will interact with each other by causing differing queueing latency (and possibly loss) in the bottleneck router, and by changing the order of scheduler events on the client OS. Also, it may be better to subject the two pages to the same network conditions (at least in the last mile) while they're loading, especially on mobile.
Any idea how to prevent this from happening? Google seems to have figured it out in their google images iframes, but I can't for the life of me figure out how they did it.
It looks like Google parses the page server-side to detect frame-busting scripts. Here's a manually engineered example - I searched for a demo of a frame-busting script and forced the URL into a Google Images request:
The brute-force way to do it is to run a proxy on your own server that loads the page in question, strips out either some or all of the javascript, then serves it onward.
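A sketch of just the stripping step, assuming the proxy already has the fetched HTML as a string; the function name `stripScripts` and the regex are my own, and a real proxy would be better off using an actual HTML parser, since regexes miss edge cases (conditional comments, scripts inside attributes, etc.):

```javascript
// Hypothetical helper for a stripping proxy: remove every
// <script>...</script> block before serving the page onward.
function stripScripts(html) {
  // Non-greedy match, spanning newlines, case-insensitive.
  return html.replace(/<script\b[^>]*>[\s\S]*?<\/script\s*>/gi, '');
}
```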
Replacing 'top.___location' with 'self.___location' probably does the trick for 95% of sites.
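That substitution is just a textual rewrite over the fetched page. A sketch, with the caveat that the function name is my own and the 95% figure is the commenter's estimate - obfuscated or indirect framebusters (e.g. ones that go through `window.top` or string-built property access) will slip through a literal match:

```javascript
// Hypothetical rewrite pass: defang the common framebuster pattern
// `if (top.___location != self.___location) top.___location = self.___location;`
// by pointing top.___location at the frame's own ___location instead.
function defangFramebuster(source) {
  return source.replace(/top\.___location/g, 'self.___location');
}
```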
Ah, but that might skew the results... maybe not too much, though, since it's only manipulating the root document? However, cross-___domain permissions would be broken for JavaScript inside the site.
If you look at the bottom of ui.js, you can see I tried a brute-force framebuster-buster. It does notify the user that the site is trying to break out, but it also catches legitimate outgoing links. It needs to be polished more before I can release it.