
This is a great write-up on streams. For curiosity's sake, I'd be interested to know the overhead of

    var stream = fs.createReadStream(__dirname + '/data.txt');
    stream.pipe(res);
vs. fs.readFile('./data.txt') followed by res.end().

and whether streams become an increasingly better option as response sizes grow. The streams syntax is nicer looking in any case.




> ... and whether streams become an increasingly better option as response sizes grow. The streams syntax is nicer looking in any case.

fs.readFile(...) reads the entire contents of the file into memory. For small files that may be acceptable, but for anything potentially large it will either be very unpleasant memory-wise or fail outright.

I just tried it out on my laptop with a 100MB file and a 1GB file. For the 100MB file, streams were about a third slower; readFileSync failed outright on the 1GB file:

    Style            100MB        1GB
    =====            =====        ===
    fs.readFileSync  0.176s   <failed>
    streams/pipe     0.234s     1.25s


By tuning the highWaterMark option one could probably squeeze out a bit more performance.

Default buffer size seems to be 64k: https://github.com/joyent/node/blob/912b5e05811fd24f09f9d652...

Stream buffering: http://www.nodejs.org/api/stream.html#stream_buffering


In addition to the other responses, the pipe version will get initial chunks of data across the HTTP connection sooner, so that the user starts seeing data on their screen.




