
It's extremely difficult to make that transparent, though. At some point, old data is going to have to be pushed out of that cache, and if you don't make it clear to users they're never going to know what they can and can't run.



> "It's extremely difficult to make that transparent, though."

Actually, that transparency is exactly what will make or break this scheme. If you know you can't make it transparent, then you shouldn't try. But the companies that succeed (probably Apple and Google) would make billions.

> "At some point, old data is going to have to be pushed out of that cache, and if you don't make it clear to users they're never going to know what they can and can't run."

That's why it's a >fat< cache, as opposed to a thin client. If you have a big enough cache, then it won't happen too often. It doesn't work absolutely all the time, but if it works almost all of the time, customers will be happy.

And if it doesn't work, and your company also makes the hardware, then when the customer comes to the store, you tell them they need to buy a machine with more memory. Cha-ching.
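
To make the idea concrete, here's a toy sketch of a fat local cache sitting in front of cloud storage (plain LRU, with made-up names like FatCache and fetch_from_cloud; this isn't anyone's actual design). The point is that eviction only kicks in when local capacity is exceeded, so the bigger the cache relative to the working set, the rarer it is:

    from collections import OrderedDict

    class FatCache:
        # Toy LRU cache over local disk, purely illustrative.
        def __init__(self, capacity_bytes):
            self.capacity = capacity_bytes
            self.used = 0
            self.files = OrderedDict()  # path -> size, most recently used last

        def get(self, path, fetch_from_cloud):
            if path in self.files:
                self.files.move_to_end(path)   # hit: serve the local copy
                return "local: " + path
            size = fetch_from_cloud(path)      # miss: pull it down from the cloud
            self.files[path] = size
            self.used += size
            while self.used > self.capacity:   # only now does anything get evicted
                old_path, old_size = self.files.popitem(last=False)
                self.used -= old_size
            return "fetched: " + path

    cache = FatCache(capacity_bytes=500 * 10**9)
    cache.get("/photos/2009/img_0001.jpg", lambda p: 2 * 10**6)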


>"It doesn't work absolutely all the time, but if it works almost all of the time, customers will be happy."

Bullshit. When stuff doesn't work, people are often unhappy as hell. And a computer that just deletes old stuff will really piss them off.


> "And a computer that just deletes old stuff"

You've just demonstrated you don't understand what's being discussed here.


A link might enlighten me. Until then, my understanding of caches is that they get flushed, and my understanding of the cloud is that its storage services are often not very long-lived and that the rates for storage are subject to change.


> "A link might enlighten me."

I don't think so. You seem pretty attached to the idea of the behavior of CPU caches, while steadfastly refusing to extend your own metaphor to persistent storage. Odd.


Persistent storage to which access may be significantly delayed on a human time frame probably tends to lose utility rapidly.

And if one has a Terabyte of data, how is putting it in the cloud and accessing it at web speeds better than a hard disk at bus speeds?

With a processor cache, populating the cache predictively is far easier due to the limited ___domain of alternatives, the logical structure of instructions, and the trillions of cycles per core per hour available for testing alternative predictive schemes.

On the other hand, a user may ask for a rarely requested file once every ten to thirty fortnights, or never. And predicting that request would require parsing a joke told on WJMZ's morning show fourteen minutes ago.
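
For contrast, here's a toy sequential prefetcher (purely illustrative, not how any particular CPU or OS actually does it). When access is block-structured, guessing the next few blocks after the last one touched is trivial; there's no analogous rule for guessing which file a person will open next:

    def prefetch_candidates(recent_block_ids, lookahead=2):
        # Toy sequential prefetcher: guess the next few blocks after the last one touched.
        last = recent_block_ids[-1]
        return [last + i for i in range(1, lookahead + 1)]

    print(prefetch_candidates([100, 101, 102]))  # -> [103, 104]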


I don't see him mentioning CPU caches anywhere.

The point is that a cache, even if it is a fat cache, will remove files eventually. Files the user may be expecting to find.



