Hacker News new | past | comments | ask | show | jobs | submit login

They should just shut down all their data centers and crawl the whole web from a single box located in someone's basement.

After all, the bot doesn't get impatient.




... Comments have really gone to shit here, haven't they?

Somehow we all end up antagonistic over bullshit like whether Google has a big enough computer.

But alas, you're right, Google could never crawl with an actual browser - what a ridiculous suggestion. I apologise for such a dim-witted comment.

As an aside: for my part in contributing such bad-quality comments, I apologise.


The point is that Google probably doesn't have a lot of cycles to spare - anything else wouldn't be good business sense.

Anything that significantly adds to the load will lose them money - whether or not the operation needs to be realtime is secondary to that.

I apologise for giving offense: I wrote the comment the same way I would have made it face-to-face, which is always a bit risky in a purely textual medium.


I don't know if you are trying to be serious at this point or not. Google has millions (literally) of machines with dozens of cores each. Search is the business that makes all their money.

Google executes JavaScript and renders the full DOM for every page internally. They generate full length screenshots of every page and have pointers to where text appears on the page so they can do highlighting of phrases within the screenshot.

Whether Google reuses the Chrome engine to do this isn't even a debatable question.




