Streams

Node.js is asynchronous and event-driven in nature. As a result, it is very good at handling I/O-bound tasks. If you are working on an app that performs I/O operations, you can take advantage of the streams available in Node.js. So, let's explore streams in detail and understand how they can simplify I/O.

What exactly are streams?

Streams are collections of data — just like arrays or strings. The difference is that streams might not be available all at once, and they don’t have to fit in memory. This makes streams powerful when working with large amounts of data, or data that’s coming from an external source one chunk at a time.

There are a few types of streams: readable, writable, and duplex. A duplex stream is both readable and writable; the gunzip stream used later in this section is one example.

Readable Streams

A couple of common operations can be performed with readable streams:

  1. Reading from a stream (see the sketch after this list)

  2. Setting the encoding

  3. Piping

  4. Chaining
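
To illustrate the first two operations, here is a minimal sketch that reads a file one chunk at a time and sets the encoding so chunks arrive as strings rather than Buffers. The file name input.txt is only an assumption for the example:

var fs = require('fs');

// Create a readable stream (input.txt is a placeholder file name).
var readable = fs.createReadStream('input.txt');

// Without an encoding, each chunk is a Buffer; with 'utf8', each chunk is a string.
readable.setEncoding('utf8');

// Chunks are emitted one at a time, so the whole file never has to fit in memory.
readable.on('data', function (chunk) {
    console.log('Received ' + chunk.length + ' characters');
});

readable.on('end', function () {
    console.log('Finished reading.');
});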

Assume that you have a gzipped archive and want to decompress it. There are many ways to achieve this, but the easiest and cleanest is to use piping and chaining. Have a look at the following snippet:

var fs = require('fs');
var zlib = require('zlib');

// Pipe the compressed file through a gunzip (decompression) stream,
// then chain another pipe to write the decompressed data to disk.
fs.createReadStream('input.txt.gz')
    .pipe(zlib.createGunzip())
    .pipe(fs.createWriteStream('output.txt'));
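
One reason piping is the cleanest approach is backpressure: pipe() automatically pauses the source when the destination cannot keep up, so even a very large archive is decompressed with a small, constant memory footprint.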

Writable Streams
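
A writable stream is the other half of the picture: you push data into it with write() and signal completion with end(). Here is a minimal sketch (output.txt is again just an example file name):

var fs = require('fs');

var writable = fs.createWriteStream('output.txt');

// write() queues a chunk; it returns false once the internal buffer
// is full, which signals that you should wait for the 'drain' event.
writable.write('Hello, ');
writable.write('streams!\n');

// end() writes an optional final chunk and closes the stream.
writable.end('Goodbye.\n');

// 'finish' fires after all data has been flushed to the file.
writable.on('finish', function () {
    console.log('All data written to output.txt');
});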
