Streaming data is very common in Node.js. The built-in stream module provides an API for implementing the stream interface. Streams make for quite a handy abstraction, and there's a lot you can do with them. As an example, let's take a look at stream.pipe(), the method used to take a readable stream and connect it to a writable stream.
A very common use for stream.pipe() is piping between file streams.
const fs = require("fs");
let readStream = fs.createReadStream("./myDataInput.txt");
let writeStream = fs.createWriteStream("./myDataOutput.txt");
readStream.pipe(writeStream);
The simple example above shows how pipe() transfers data from the read stream to the write stream.
However, there is a problem with the standard source.pipe(destination): the source is not destroyed if the destination emits an error or closes, and you cannot provide a callback to tell when the pipe has finished.
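To make the problem concrete, here is a minimal sketch (reusing the placeholder file names from above) of the manual bookkeeping pipe() leaves to you: errors are not forwarded between the streams, so each one needs its own handler, and you have to destroy the other stream yourself.

const fs = require("fs");

let readStream = fs.createReadStream("./myDataInput.txt");
let writeStream = fs.createWriteStream("./myDataOutput.txt");

readStream.on("error", error => {
  console.error("Read failed:", error);
  writeStream.destroy(); // clean up the destination ourselves
});

writeStream.on("error", error => {
  console.error("Write failed:", error);
  readStream.destroy(); // pipe() will not destroy the source for us
});

readStream.pipe(writeStream);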
To solve this problem, we can use pipeline, which was introduced in Node.js 10.x. If you are using Node.js 8.x or earlier, you can use the pump package instead (see the sketch after the pipeline example below).
const { pipeline } = require("stream");
const fs = require("fs");

let readStream = fs.createReadStream("./myDataInput.txt");
let writeStream = fs.createWriteStream("./myDataOutput.txt");

// pipeline wires the streams together, forwards errors,
// and calls the callback once everything has finished or failed
pipeline(readStream, writeStream, error => {
  if (error) {
    console.error(error);
  } else {
    console.info("Pipeline Successful");
  }
});
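For older Node.js versions, the pump package from npm offers the same callback style. A minimal sketch, assuming pump has been installed with npm install pump:

const fs = require("fs");
const pump = require("pump");

let readStream = fs.createReadStream("./myDataInput.txt");
let writeStream = fs.createWriteStream("./myDataOutput.txt");

// pump destroys all streams and reports the error if any of them fail
pump(readStream, writeStream, error => {
  if (error) {
    console.error(error);
  } else {
    console.info("Pump Successful");
  }
});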
That's it!
Thank you for reading.
You may also want to read Backpressuring in Streams, which explains in more detail why you should use pipeline.