Hacker News
Which loads faster? Pit websites against each other. (whichloadsfaster.com)
43 points by mikexstudios on July 9, 2010 | 25 comments



Here's the associated blog post from the developer: http://onecreativeblog.com/post/781952553/announcing-whichlo...


http://google.com 2.7 × faster (1198 ms / 3191 ms) than https://google.com


Pretty much the same with google.de/google.de (36%), my blog (52%) and xkcd.com (44%). This killed it for me actually. Another observation is that the right window is always faster at my end. Can anybody else confirm this?


Dunno if this holds up for others, but Bing is substantially faster than Google's homepage to load, even with all of the crazy graphics... Impressive.


Due to the distribution of servers, and the way data is routed around the internet, I would imagine it would be different for everyone.


I am only measuring the "onDOMContentLoaded" or jQuery's cross-browser equivalent. It's certainly possible for pages to load objects after that using javascript, which "cheats" a little for purposes of this comparison.
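Concretely, the comparison step could be sketched like this (a hedged sketch; the function name and output format are mine, not the site's actual code): once each side reports its "onDOMContentLoaded" time, the verdict is just a ratio of the two measurements, formatted like the "2.7 × faster" result quoted above.

```javascript
// Hedged sketch, not the site's real code: given two measured
// DOMContentLoaded times, report which side won and by what factor.
function compareTimes(leftMs, rightMs) {
  if (leftMs === rightMs) return "tie";
  var faster = leftMs < rightMs ? "left" : "right";
  // Ratio of the slower time to the faster one, e.g. 3191/1198 = 2.7.
  var ratio = Math.max(leftMs, rightMs) / Math.min(leftMs, rightMs);
  return faster + " " + ratio.toFixed(1) + "x faster";
}
```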

What really matters in practice is "time to first interaction" which is when the user thinks your site is useful. Next up on my feature list is having a button to indicate which pages seemed to load faster to you. The idea is you can send a link out to your friends and get their impressions, which is more valuable than the numbers you can get with an automated tool. :)


Fwiw, Google loaded 20% faster for me.


Bing's homepage consistently loads faster for me too. But Google's search results tend to load faster.


looks interesting, but serial should be the default mode.


You might be right. I considered that, but it doesn't have as much of an impact factor on the user, which is really important for getting the word out about performance (why the site exists in the first place).

It probably doesn't matter too much as long as you have a ~3Mbps connection or higher (see Mike Belshe's "Effective Bandwidth of HTTP" graph on http://www.belshe.com/2010/05/24/more-bandwidth-doesnt-matte...) because the TCP transmission rate can't ramp up high enough on each object to saturate the connection. It's possible, however, that the objects will interact with each other by causing differing queueing latency (and possibly loss) in the bottleneck router, and by changing the order of scheduler events on the client OS. Also, it may be better to subject the two pages to the same network conditions (at least in the last mile) while they're loading, especially on mobile.


This uses iframes to pit websites against each other. Twitter breaks out of iframes and so do other websites.


Any idea how to prevent this from happening? Google seems to have figured it out in their Google Images iframes, but I can't for the life of me figure out how they did it.


It looks like Google parses the page server-side to detect break-out-of-frames scripts. Here's a manually engineered example - I searched for a demo of a break-out-of-frames script and forced the URL into a Google Images request:

http://images.google.com/imgres?imgurl=http://www.internet.c...

The page just displays a "you are being redirected" message straight away, so it must be detected at the server level. Smart.


Having said that, if you look at a genuine result from twitter:

http://images.google.com/imgres?imgurl=http://a1.twimg.com/p...

Google doesn't detect or stop that breaking out of frames. But maybe Twitter is just managing to avoid detection by Google's code.

The code on that page is:

    <script type="text/javascript">
    //<![CDATA[
    if (window.top !== window.self) {
        document.write = "";
        window.top.___location = window.self.___location;
        setTimeout(function () { document.body.innerHTML = ''; }, 1);
        window.self.onload = function (evt) { document.body.innerHTML = ''; };
    }
    //]]>
    </script>

which doesn't seem like it would be too hard to detect.
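To illustrate: one naive way a server-side scanner (as Google is guessed to be doing above) might flag such scripts is to grep the fetched HTML for the usual frame-busting patterns. This is my own hedged sketch, not Google's actual method, and obfuscated variants would slip past it.

```javascript
// Hedged sketch: flag HTML that contains common frame-busting idioms,
// like the window.top comparison in the Twitter snippet above.
function looksLikeFramebuster(html) {
  var patterns = [
    /window\.top\s*!==?\s*window\.self/, // "am I framed?" check
    /top\.___location\s*=/,                  // redirect the top window
    /self\s*[!=]=\s*top/                 // shorthand framed check
  ];
  return patterns.some(function (re) { return re.test(html); });
}
```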


The brute-force way to do it is to run a proxy on your own server that loads the page in question, strips out either some or all of the javascript, then serves it onward.

Replacing 'top.___location' with 'self.___location' probably does the trick for 95% of sites.
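The rewrite step could be as small as one regex pass in the proxy - a hedged sketch of the idea, with the caveat that it won't catch `parent.___location` or obfuscated variants:

```javascript
// Hedged sketch of the proxy rewrite described above: swap
// "top.___location" for "self.___location" before serving the page on.
// "\btop" also matches inside "window.top.___location", since "." is a
// word boundary.
function defangFramebusters(html) {
  return html.replace(/\btop\.___location\b/g, "self.___location");
}
```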


Ah, but that might skew the results... maybe not too much, though, since it's only manipulating the root document. However, cross-___domain permissions would be broken for JavaScript inside the site.

If you look at the bottom of ui.js, you can see I tried a brute-force framebuster-buster. It does notify the user that the site is trying to break out, but it also catches legitimate outgoing links. It needs to be polished more before I can release it.

This is issue #1 on github for the project, BTW.


Bing is faster on mine (FF).

On others it's Google that is faster (they use Chrome).

So the browser seems to affect it too.


Here's one I didn't expect: HN loads 5.5x faster than reddit.


Yeah, I would have put HN at least 10x faster. ;)


My results for HN vs reddit:

Average over 10 runs: tie

HN: 1681ms / reddit: 1652ms
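A "tie" verdict like this presumably means the averages fall within some noise margin. Here's a hedged sketch of how that could work - the function name and the 5% threshold are my assumptions, not the site's:

```javascript
// Hedged sketch: average each side's load times over several runs and
// call it a tie when the means differ by less than tiePct (e.g. 0.05).
function verdict(leftRuns, rightRuns, tiePct) {
  var mean = function (xs) {
    return xs.reduce(function (a, b) { return a + b; }, 0) / xs.length;
  };
  var l = mean(leftRuns), r = mean(rightRuns);
  if (Math.abs(l - r) / Math.min(l, r) < tiePct) return "tie";
  return l < r ? "left faster" : "right faster";
}
```

With the numbers above (1681ms vs 1652ms), the difference is under 2%, so a 5% threshold would report a tie.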


Very cool idea. Could play with this for hours.


Neatly done and pretty useful.


The reporting is inaccurate.

I clearly see Bing loading images AFTER the timer has stopped.

This is just timing the main HTML, so it's wrong.


I believe Bing loads images asynchronously after the DOM content has loaded.


very cool!



