It’s about time to embrace Node.js Streams

 

Luciano Mammino (@loige)

MEETUP

Austin, TX
August 22nd

// buffer-copy.js

const {
  readFileSync,
  writeFileSync
} = require('fs')

const [,, src, dest] = process.argv

// read entire file content
const content = readFileSync(src)

// write that content somewhere else
writeFileSync(dest, content)

We do this all the time

and it's ok

but sometimes ...

💥 ERR_FS_FILE_TOO_LARGE! 💥

File size is greater than the maximum possible Buffer size

But why?

if bytes were blocks...

Mario can lift a few blocks...

but not too many!
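In concrete terms, the number of "blocks" a single Buffer can hold is capped. A quick illustrative check (not from the original deck):

// buffer-limit.js (illustrative)
// prints the maximum number of bytes a single Buffer can hold on this platform
const { constants } = require('buffer')
console.log(constants.MAX_LENGTH)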

What can we do if we have to move many blocks?

We CAN move them one by one!

we stream them...

👋 Hello, I am Luciano!

🇮🇹

🇮🇪

🇺🇸

Cloud Architect

Blog: loige.co

Twitter: @loige

GitHub: @lmammino 

01. Buffers vs Streams

Buffer: a data structure to store and transfer arbitrary binary data

* Note that this loads all the content of the file in memory
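A minimal illustration of a Buffer holding raw bytes in memory (illustrative, not from the original deck):

// buffer-example.js (illustrative)
const buf = Buffer.from('Hello, streams!', 'utf8')
console.log(buf)            // <Buffer 48 65 6c 6c 6f ...>
console.log(buf.toString()) // 'Hello, streams!'
console.log(buf.length)     // number of bytes held in memory (15)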

Stream: an abstract interface for working with streaming data

* It does not load all the data straight away

File copy: The buffer way

// buffer-copy.js

const {
  readFileSync,
  writeFileSync
} = require('fs')

const [,, src, dest] = process.argv
const content = readFileSync(src)
writeFileSync(dest, content)

File copy: The stream way

// stream-copy.js

const { 
  createReadStream,
  createWriteStream
} = require('fs')

const [,, src, dest] = process.argv
const srcStream = createReadStream(src)
const destStream = createWriteStream(dest)
srcStream.on('data', (data) => destStream.write(data))

* Careful: this implementation is not optimal (no error handling and no backpressure handling; more on this later)

Memory comparison (~600MB file)

node --inspect-brk buffer-copy.js assets/poster.psd ~/Downloads/poster.psd

node --inspect-brk stream-copy.js assets/poster.psd ~/Downloads/poster.psd

Let's try with a big file (~10GB)

node --inspect-brk stream-copy.js assets/the-matrix-hd.mkv ~/Downloads/the-matrix-hd.mkv

👍 streams vs buffers 👎

  • Streams keep a low memory footprint even with large amounts of data
     
  • Streams allow you to process data as soon as it arrives

02. Stream types & APIs

All Streams are event emitters

A stream instance is an object that emits events when its internal state changes, for instance:

s.on('readable', () => {}) // ready to be consumed
s.on('data', (chunk) => {}) // new data is available
s.on('error', (err) => {}) // some error happened
s.on('end', () => {}) // no more data available

The available events depend on the type of stream
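As a quick illustrative check of the "all streams are event emitters" claim (not from the original deck):

// streams-are-event-emitters.js (illustrative)
const { EventEmitter } = require('events')
const { createReadStream } = require('fs')

const s = createReadStream(__filename)
console.log(s instanceof EventEmitter) // true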

Readable streams

A readable stream represents a source from which data is consumed.

Examples:

  • fs readStream
  • process.stdin
  • HTTP response (client-side)
  • HTTP request (server-side)
  • AWS S3 GetObject (data field)

It supports two modes for data consumption: flowing and paused (or non-flowing) mode.

Readable streams (flowing mode)

Data is read from the source automatically and chunks are emitted as soon as they are available.

[Diagram: the readable stream reads chunks (1, 2, 3, ...) from the source data one at a time and emits each chunk as a 'data' event to the attached data listener. When no more data is available, the 'end' event is emitted.]

// count-emojis-flowing.js

const { createReadStream } = require('fs')
const { EMOJI_MAP } = require('emoji') // from npm

const emojis = Object.keys(EMOJI_MAP)

const file = createReadStream(process.argv[2])
let counter = 0

file.on('data', chunk => {
  for (let char of chunk.toString('utf8')) {
    if (emojis.includes(char)) {
      counter++
    }
  }
})
file.on('end', () => console.log(`Found ${counter} emojis`))
file.on('error', err => console.error(`Error reading file: ${err}`))
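
The example above uses flowing mode. For contrast, here is a minimal sketch of the paused (non-flowing) mode mentioned earlier, where data is pulled explicitly with read() (illustrative, not from the original deck):

// count-bytes-paused.js (illustrative sketch of paused mode)
const { createReadStream } = require('fs')

const file = createReadStream(process.argv[2])
let bytes = 0

file.on('readable', () => {
  let chunk
  // pull chunks explicitly until the internal buffer is empty
  while ((chunk = file.read()) !== null) {
    bytes += chunk.length
  }
})

file.on('end', () => console.log(`Read ${bytes} bytes`))
file.on('error', err => console.error(`Error reading file: ${err}`))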

Readable streams are also Async Iterators
(Node.js 10+)

// count-emojis-async-iterator.js
const { createReadStream } = require('fs')
const { EMOJI_MAP } = require('emoji') // from npm

async function main () {
  const emojis = Object.keys(EMOJI_MAP)
  const file = createReadStream(process.argv[2])
  let counter = 0

  for await (let chunk of file) {
    for (let char of chunk.toString('utf8')) {
      if (emojis.includes(char)) {
        counter++
      }
    }
  }

  console.log(`Found ${counter} emojis`)
}

main()

Writable streams

A writable stream is an abstraction that allows you to write data to a destination

 

Examples:

  • fs writeStream
  • process.stdout, process.stderr
  • HTTP request (client-side)
  • HTTP response (server-side)
  • AWS S3 PutObject (body parameter)
// writable-http-request.js
const http = require('http')

const req = http.request(
  {
    hostname: 'enx6b07hdu6cs.x.pipedream.net',
    method: 'POST'
  },
  resp => {
    console.log(`Server responded with "${resp.statusCode}"`)
  }
)

req.on('finish', () => console.log('request sent'))
req.on('close', () => console.log('Connection closed'))
req.on('error', err => console.error(`Request failed: ${err}`))

req.write('writing some content...\n')
req.end('last write & close the stream')

Backpressure

When writing large amounts of data you should handle backpressure: if write() returns false, stop writing (pause the source) and resume once the 'drain' event is emitted.

 

loige.link/backpressure

// stream-copy-safe.js

const { createReadStream, createWriteStream } = require('fs')

const [, , src, dest] = process.argv
const srcStream = createReadStream(src)
const destStream = createWriteStream(dest)

srcStream.on('data', data => {
  const canContinue = destStream.write(data)
  if (!canContinue) {
    // we are overflowing the destination, we should pause
    srcStream.pause()
    // we will resume when the destination stream is drained
    destStream.once('drain', () => srcStream.resume())
  }
})

Other types of stream

  • Duplex Stream
    streams that are both Readable and Writable. 
    (net.Socket)
     
  • Transform Stream
    Duplex streams that can modify or transform the data as it is written and read.
    (zlib.createGzip(), crypto.createCipheriv())
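
As a minimal sketch (using the simplified constructor API rather than the class-based approach shown later in this deck), here is a transform stream that uppercases whatever is written into it:

// uppercase-transform.js (illustrative sketch)
const { Transform } = require('stream')

const uppercase = new Transform({
  transform (chunk, encoding, done) {
    // modify the data as it flows through
    this.push(chunk.toString().toUpperCase())
    done()
  }
})

uppercase.on('data', chunk => console.log(chunk.toString())) // HELLO STREAMS
uppercase.write('hello streams')
uppercase.end()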

Anatomy of a transform stream

1. Write data into the transform stream (its writable side)
2. The data gets transformed
3. Read the transformed data out of it (its readable side)

Gzip example: zlib.createGzip()

1. Write uncompressed data into the transform stream
2. The data gets compressed
3. Read the compressed data out of it
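
A minimal sketch of this idea in code (illustrative, not from the original deck):

// gzip-anatomy.js (illustrative sketch)
const { createGzip } = require('zlib')

const gzip = createGzip()

// 3. read the transformed (compressed) data
gzip.on('data', compressed => {
  console.log(`received ${compressed.length} compressed bytes`)
})

// 1. write uncompressed data; 2. the stream compresses it internally
gzip.write('some uncompressed data...')
gzip.end()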

How can we use transform streams?

[Diagram: Readable → Transform → Writable. Every chunk emitted as a 'data' event by the Readable is passed to the Transform with write(); every chunk emitted by the Transform is passed to the Writable with write(). Backpressure applies to both links: when write() returns false, pause() the upstream stream and resume() it once the 'drain' event is emitted.]

You also have to handle end & error events!

// stream-copy-gzip.js
const { 
  createReadStream,
  createWriteStream
} = require('fs')
const { createGzip } = require('zlib')

const [, , src, dest] = process.argv
const srcStream = createReadStream(src)
const gzipStream = createGzip()
const destStream = createWriteStream(dest)

srcStream.on('data', data => {
  const canContinue = gzipStream.write(data)
  if (!canContinue) {
    srcStream.pause()
    gzipStream.once('drain', () => {
      srcStream.resume()
    })
  }
})

srcStream.on('end', () => {
  // check if there's buffered data left
  const remainingData = gzipStream.read()
  if (remainingData !== null) {
    destStream.write(remainingData)
  }
  gzipStream.end()
})

gzipStream.on('data', data => {
  const canContinue = destStream.write(data)
  if (!canContinue) {
    gzipStream.pause()
    destStream.once('drain', () => {
      gzipStream.resume()
    })
  }
})

gzipStream.on('end', () => {
  destStream.end()
})

// ⚠️ TODO: handle errors!

03. pipe()

readable.pipe(writableDest)

  • Connects a readable stream to a writable stream
  • A transform stream can be used as a destination as well
  • It returns the destination stream allowing for a chain of pipes
readable
  .pipe(transform1)
  .pipe(transform2)
  .pipe(transform3)
  .pipe(writable)
// stream-copy-gzip-pipe.js

const { 
  createReadStream,
  createWriteStream
} = require('fs')
const { createGzip } = require('zlib')

const [, , src, dest] = process.argv
const srcStream = createReadStream(src)
const gzipStream = createGzip()
const destStream = createWriteStream(dest)

srcStream
  .pipe(gzipStream)
  .pipe(destStream)

Set up complex pipelines with pipe

readable
  .pipe(decompress)
  .pipe(decrypt)
  .pipe(convert)
  .pipe(encrypt)
  .pipe(compress)
  .pipe(writeToDisk)

This is the most common way to use streams

Handling errors (correctly)

readable
  .on('error', handleErr)
  .pipe(decompress)
  .on('error', handleErr)
  .pipe(decrypt)
  .on('error', handleErr)
  .pipe(convert)
  .on('error', handleErr)
  .pipe(encrypt)
  .on('error', handleErr)
  .pipe(compress)
  .on('error', handleErr)
  .pipe(writeToDisk)
  .on('error', handleErr)

 

handleErr should end and destroy the streams

(it doesn't happen automatically)
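
A minimal sketch of what such a handler could look like, assuming the streams from the example above are in scope (illustrative, not from the original deck):

// illustrative sketch
function handleErr (err) {
  console.error(err)
  // pipe() does not clean up for you: destroy every stream in the chain
  readable.destroy()
  decompress.destroy()
  decrypt.destroy()
  convert.destroy()
  encrypt.destroy()
  compress.destroy()
  writeToDisk.destroy()
}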

 

04. Stream utilities

stream.pipeline(...streams, callback) - Node.js 10+

// stream-copy-gzip-pipeline.js

const { pipeline } = require('stream')
const { createReadStream, createWriteStream } = require('fs')
const { createGzip } = require('zlib')

const [, , src, dest] = process.argv

pipeline(
  createReadStream(src),
  createGzip(),
  createWriteStream(dest),
  function onEnd (err) {
    if (err) {
      console.error(`Error: ${err}`)
      process.exit(1)
    }

    console.log('Done!')
  }
)

You can pass multiple streams (they will be piped)

The last argument is a callback. If invoked with an error, it means the pipeline failed at some point.

All the streams are ended and destroyed correctly.

For Node.js < 10: pump - npm.im/pump

// stream-copy-gzip-pump.js

const pump = require('pump') // from npm
const { createReadStream, createWriteStream } = require('fs')
const { createGzip } = require('zlib')

const [, , src, dest] = process.argv

pump( // just swap pipeline with pump!
  createReadStream(src),
  createGzip(),
  createWriteStream(dest),
  function onEnd (err) {
    if (err) {
      console.error(`Error: ${err}`)
      process.exit(1)
    }

    console.log('Done!')
  }
)

pumpify(...streams) - npm.im/pumpify

Create reusable pieces of pipeline

Let's create EncGz, an application that helps us to read and write encrypted-gzipped files

// encgz-stream.js - utility library

const {
  createCipheriv,
  createDecipheriv,
  randomBytes,
  createHash
} = require('crypto')
const { createGzip, createGunzip } = require('zlib')
const pumpify = require('pumpify') // from npm

// calculates md5 of the secret (trimmed)
function getCipherKey (secret) {}

function createEncgz (secret) {
  const initVect = randomBytes(16)
  const cipherKey = getCipherKey(secret)
  const encryptStream = createCipheriv('aes256', cipherKey, initVect)
  const gzipStream = createGzip()

  const stream = pumpify(encryptStream, gzipStream)
  stream.initVect = initVect

  return stream
}
// encgz-stream.js (...continue from previous slide)

function createDecgz (secret, initVect) {
  const cipherKey = getCipherKey(secret)
  const decryptStream = createDecipheriv('aes256', cipherKey, initVect)
  const gunzipStream = createGunzip()

  const stream = pumpify(gunzipStream, decryptStream)
  return stream
}

module.exports = {
  createEncgz,
  createDecgz
}
// encgz.js - CLI to encrypt and gzip (from stdin to stdout)

const { pipeline } = require('stream')
const { createEncgz } = require('./encgz-stream')

const [, , secret] = process.argv

const encgz = createEncgz(secret)
console.error(`init vector: ${encgz.initVect.toString('hex')}`)

pipeline(
  process.stdin,
  encgz,
  process.stdout,
  function onEnd (err) {
    if (err) {
      console.error(`Error: ${err}`)
      process.exit(1)
    }
  }
)
// decgz.js - CLI to gunzip and decrypt (from stdin to stdout)

const { pipeline } = require('stream')
const { createDecgz } = require('./encgz-stream')

const [, , secret, initVect] = process.argv

const decgz = createDecgz(secret, Buffer.from(initVect, 'hex'))


pipeline(
  process.stdin,
  decgz,
  process.stdout,
  function onEnd (err) {
    if (err) {
      console.error(`Error: ${err}`)
      process.exit(1)
    }
  }
)
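
A possible way to run these two CLIs (the secret and output file names are made up for illustration):

node encgz.js mysecret < assets/poster.psd > poster.psd.gz.enc
# prints the init vector (hex) on stderr

node decgz.js mysecret <init vector in hex> < poster.psd.gz.enc > poster.psd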

readable-stream - npm.im/readable-stream

npm package that contains the latest version of the Node.js stream library.

It also makes Node.js streams compatible with the browser (it can be used with webpack and Browserify).

* Yeah, the name is misleading: the package offers all the functionality of the official 'stream' module, not just readable streams.

05. Writing custom streams

The pipeline we want to build: EmojiStream (Readable) → Uppercasify (Transform) → DOMAppend (Writable)

A chunk like "🍋 Lemon" is emitted by EmojiStream, turned into "🍋 LEMON" by Uppercasify, and appended to the page as a list item (• 🍋 LEMON) by DOMAppend.

class EmojiStream extends Readable {
  _read () {
    // ...
  }
}

class Uppercasify extends Transform {
  _transform (chunk, enc, done) {
    // ...
  }
}

class DOMAppend extends Writable {
  _write (chunk, enc, done) {
    // ...
  }
}

this.push(data): pass data to the next step

// emoji-stream.js (custom readable stream)
const { EMOJI_MAP } = require('emoji') // from npm
const { Readable } = require('readable-stream') // from npm
const emojis = Object.keys(EMOJI_MAP)
function getEmojiDescription (index) {
  return EMOJI_MAP[emojis[index]][1]
}
function getMessage (index) {
  return emojis[index] + ' ' + getEmojiDescription(index)
}

class EmojiStream extends Readable {
  constructor (options) {
    super(options)
    this._index = 0
  }

  _read () {
    if (this._index >= emojis.length) {
      return this.push(null)
    }
    return this.push(getMessage(this._index++))
  }
}

module.exports = EmojiStream
// uppercasify.js (custom transform stream)

const { Transform } = require('readable-stream')

class Uppercasify extends Transform {
  _transform (chunk, encoding, done) {
    this.push(chunk.toString().toUpperCase())
    done()
  }
}

module.exports = Uppercasify
// dom-append.js (custom writable stream)

const { Writable } = require('readable-stream')

class DOMAppend extends Writable {

  _write (chunk, encoding, done) {
    const elem = document.createElement('li')
    const content = document.createTextNode(chunk.toString())
    elem.appendChild(content)
    document.getElementById('list').appendChild(elem)
    done()
  }
}

module.exports = DOMAppend

06. Streams in the browser

// browser/app.js

const EmojiStream = require('../emoji-stream')
const Uppercasify = require('../uppercasify')
const DOMAppend = require('../dom-append')

const emoji = new EmojiStream()
const uppercasify = new Uppercasify()
const append = new DOMAppend()

emoji
  .pipe(uppercasify)
  .pipe(append)

Let's use webpack to build this app for the browser

npm i --save-dev webpack webpack-cli

node_modules/.bin/webpack src/browser/app.js

# creates dist/main.js

mv dist/main.js src/browser/app.bundle.js

Finally, let's create an index.html

<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8" />
    <meta
      name="viewport"
      content="width=device-width,initial-scale=1,shrink-to-fit=no"
    />
    <title>Streams in the browser!</title>
  </head>
  <body>
    <ul id="list"></ul>
    <script src="app.bundle.js"></script>
  </body>
</html>

07. Closing

TL;DR

  • Streams have a low memory footprint
  • They let you process data as soon as it's available
  • Composition through pipelines
  • Streams are abstractions:
    • Readable = Input
    • Transform = Business Logic
    • Writable = Output

If you want to learn (even) moar 🐻 about streams...

If you are not convinced yet...

curl parrot.live

Check out the codebase

Thank you!

We are hiring, talk to me or check out vectra.ai/about/careers

Credits

Cover picture by David Mark from Pixabay
emojiart.org for the amazing St. Patrick emoji art

The internet for the memes! :D

Special thanks

It’s about time to embrace Node.js Streams

By Luciano Mammino


With very practical examples, we'll learn how streams work in Node.js and the browser. With streams you will be able to write elegant JavaScript applications that are much more composable and memory efficient. Streams are probably one of the most beautiful features of Node.js, yet still largely underestimated and rarely used. Once you grasp the fundamentals, you'll be able to solve ordinary programming challenges in a much more elegant and efficient way. With the power of streams in your tool belt, you'll be able to write applications that can deal with gigabytes or even terabytes of data efficiently. This talk covers the following topics: streams, when and how; different types of streams; built-in and custom streams; composability; stream utilities and streams in the browser.
