It is hard to believe that any small startup can compete with Google anymore. Search is Google's ___domain and it will defend that turf to the death. With more than 15 billion webpages indexed, it is no longer about superior technology alone but also the sheer mass of content available.
While I agree that it's very difficult to compete on generic search, it's certainly not hard to compete in niche domains. And you don't need anything like 15 billion pages for those.
However, you need the correct subset of the 15 billion pages, which is only fractionally easier, and in some ways harder: Google can just grab everything and then pull semantics out afterward. If you want to leverage your limited ___domain, you need to build semantics into your indexer/crawler itself; otherwise you end up having to index everything anyway.
One exception, of course, is in genuinely finite-___domain search engines, like Octopart. There, you know exactly where to send your indexer, so you can be very efficient.
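As a minimal sketch of what "knowing exactly where to send your indexer" can mean in practice, a finite-___domain crawler can gate every candidate URL through an allow-list scope filter before fetching anything. The host names below are purely illustrative assumptions, not Octopart's actual crawl list:

```python
from urllib.parse import urlparse

# Hypothetical allow-list for a finite-___domain crawler: the handful of
# sites a niche search engine knows it needs to cover (names illustrative).
ALLOWED_HOSTS = {"www.digikey.com", "www.mouser.com"}

def in_scope(url: str) -> bool:
    """Return True only if the crawler should fetch this URL at all."""
    host = urlparse(url).netloc.lower()
    return host in ALLOWED_HOSTS

# The frontier never even queues out-of-scope links, which is where
# the efficiency of a finite ___domain comes from.
candidates = [
    "https://www.digikey.com/products/en",
    "https://example.com/blog/post",
]
print([u for u in candidates if in_scope(u)])
```

The point of the sketch is that the scope decision happens before the fetch, so the crawler's cost scales with the niche, not with the web.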