Not hoping to start any sort of flame war, but could someone try to explain this difference between streams and promises? The conceptual difference doesn't seem completely clear to me.
Promises help deal with asynchronous functions by providing a convenient way to pass success/fail handlers, in a style some people prefer over classic callbacks.
Streams are a totally different use case: they're for processing data one piece at a time, without having to hold all of it in memory at once.
Promises are actually about trust, not syntax. With a callback you have to trust that the function that will invoke your callback (which might not be something you wrote) will only ever call it once, that it will pass through errors, that it won't call both your success and failure callbacks, etc.
With a promise (or rather, following the promise specification) you instead get back an object that you choose how to handle. That object is either pending or else an immutable success or failure result. Either success or failure will be called (not both), and whichever result is called can only be called once. As such, promises allow you to avoid inversion of control and to safely interact with potentially untrusted code.
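A small sketch of that trust argument (untrustedFetch is a made-up name for illustration): even if the underlying callback API misbehaves and fires its callback twice, the promise wrapper settles at most once.

```javascript
// Hypothetical sketch: a callback-based API that misbehaves by
// invoking its callback twice.
function untrustedFetch(cb) {
  setImmediate(() => cb(null, "first"));
  setImmediate(() => cb(null, "second")); // violates the "call once" contract
}

// Wrapping it in a Promise restores the guarantee: per the spec, a
// promise settles at most once, so the second resolve() is a no-op.
function fetchOnce() {
  return new Promise((resolve, reject) => {
    untrustedFetch((err, value) => (err ? reject(err) : resolve(value)));
  });
}

fetchOnce().then((value) => {
  console.log(value); // "first" -- and this handler runs exactly once
});
```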
You may be interested in going through "A General Theory of Reactivity"[0] by Kris Kowal, creator of the q promise library. It discusses the relationships between streams, promises, iterators and generators (although the streams he discusses are not how Node.js streams are implemented).
Cool, I'll have to take some time to read through more of that, but a quick skim actually made clear to me some of the differences I was missing, especially when combined with the other comments.
That's unlike Promises, which guarantee a single "end" but don't care when it happens (it could have already happened, or could happen eventually). Streams seem similar in nature to a socket connection.
> ... and if streams become a better and better option as response sizes get larger and larger. The streams syntax is nicer looking in any case.
fs.readFile(...) reads the entire contents of the file into memory. For small files that may be acceptable but if you're dealing with anything that could be large it's not going to be very pleasant or even work properly.
I just tried it out on my laptop with a 100MB file and a 1GB file. For the 100MB file, using streams was about 25% slower. However, the sync method failed outright on the 1GB file:
In addition to the other responses, the pipe version will get initial chunks of data across the HTTP connection sooner, so that the user starts seeing data on their screen.
For example, can you use ffmpeg to pull live video via rtsp, re-encode and pipe to nodejs for it to stream down to browser and to be consumed by html - with or without a plugin?
It was; it's been posted several times, at least in comments as far as I know. I was about to ask, "why are so many links getting re-posted on Hacker News?"
Install via: npm install -g stream-adventure
Now simply go forth and start your adventure by typing: stream-adventure. Have fun, and don't forget to thank the man: https://github.com/substack/stream-adventure
https://twitter.com/substack