Client-side Concurrency

lwestby@alumni.nd.edu

@luke_dot_js

Concurrency is useful

  • Solve long-running problems
  • Maintain responsiveness

JavaScript needs concurrency

  • Turn-based execution model
  • Runs on the main thread
  • ~16.7ms turns to maintain 60fps
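For context, a minimal sketch (the function name and loop size are illustrative, not from the talk) of a single synchronous turn blowing past the 16.7ms budget:

// Any synchronous work that runs longer than ~16.7ms blocks
// rendering and input handling until it returns
function expensiveSum(n) {
    var total = 0;
    for (var i = 0; i < n; i++) {
        total += Math.sqrt(i);
    }
    return total;
}

var start = performance.now();
expensiveSum(50000000);
console.log((performance.now() - start) + 'ms');
// Anything over ~16.7ms here means at least one dropped frame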

What do we do if we need more time?

Web Workers

A web worker is a script that runs separately from the main thread

Important points about Web Workers

  • Async message-passing communication
  • Heavy startup cost
  • Strict requirements around data-transfer
    • Structured cloning
    • Transfer of ownership
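To make the cloning rule concrete, a minimal sketch (the payload shape is mine, not from the talk) showing that a plain postMessage structured-clones its argument, so the worker gets a deep copy rather than shared memory:

var worker = new Worker('js/web-workers/worker.js');

// The object is structured-cloned: the worker receives a
// deep copy, so this later mutation is never seen by it
var payload = { items: [1, 2, 3] };
worker.postMessage(payload);
payload.items.push(4); // only changes the main thread's copy

// Values that can't be cloned, like functions or DOM nodes,
// cause postMessage to throw a DataCloneError
// worker.postMessage({ callback: function () {} });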

Basic Worker Creation

// A worker is loaded using the path to a separate
// file. The file must be on the same origin as the
// script that created it.
var WORKER_SCRIPT = 'js/web-workers/worker.js';

// Instantiate a new worker for this script
var worker = new Worker(WORKER_SCRIPT);

// Listen for messages from the worker and do work
// on the results
worker.addEventListener('message', function (ev) {
    processResult(ev.data);
});

// Pass a message into the worker
worker.postMessage({
    hello: 'world'
});

function processResult(data) {
    // ...
}

Inside a Web Worker

// The worker's global object is found at 'self'

// The worker listens for messages from the host
// on the same 'message' event
self.addEventListener('message', function (ev) {

    // Respond to the data passed by the host
    var result = processData(ev.data);
    self.postMessage(result);
});

function processData(data) {
    // ...
}
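Back on the host side, the same API also covers failures and teardown; a short supplementary sketch using the worker created earlier:

// Uncaught exceptions inside the worker surface on the host
// as an 'error' event instead of a 'message'
worker.addEventListener('error', function (ev) {
    console.error(ev.message, ev.filename, ev.lineno);
});

// Shut the worker down from the host when it's done ...
worker.terminate();

// ... or from inside the worker itself with self.close()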

Passing large data

// We're gonna pass a ton of data
var DATA_SIZE = 1000000;

// Create a worker
var worker = new Worker('js/web-workers/worker.js');

// Listen for the response, which will
// take a few seconds to compute
worker.addEventListener('message', function (ev) {
    var result = new Uint32Array(ev.data);
    processResult(result);
});

// Get the data as a Uint32Array
var data = new Uint32Array(DATA_SIZE);

// Pass a message with the data array's buffer,
// this time passing a second argument which is
// an array of references to transfer
worker.postMessage(data, [data.buffer]);

// At this point ownership of data.buffer has been
// transferred to the worker; the buffer is detached
// and no longer usable from the main script

function processResult(result) {
    // ...
}
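A quick way to confirm the transfer happened (standard transferable behavior, not shown in the talk): the source array is left backed by a detached, zero-length buffer.

// After the transfer above, in the main script:
console.log(data.length);            // 0
console.log(data.buffer.byteLength); // 0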

Use cases for workers

  • Background I/O with major parsing requirements
  • High-intensity number manipulation or generation
  • Image processing (see the sketch after this list)
  • Audio processing
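To illustrate the image-processing case, a minimal sketch assuming a <canvas> on the page and a hypothetical js/web-workers/grayscale-worker.js that grayscales the pixels it receives and transfers them back:

// Host: hand a frame's pixels to the worker without copying
var canvas = document.querySelector('canvas');
var ctx = canvas.getContext('2d');
var frame = ctx.getImageData(0, 0, canvas.width, canvas.height);

var imageWorker = new Worker('js/web-workers/grayscale-worker.js');

imageWorker.addEventListener('message', function (ev) {
    var pixels = new Uint8ClampedArray(ev.data);
    ctx.putImageData(
        new ImageData(pixels, canvas.width, canvas.height), 0, 0);
});

// Transfer the pixel buffer rather than cloning a copy of it
imageWorker.postMessage(frame.data.buffer, [frame.data.buffer]);

// grayscale-worker.js
self.addEventListener('message', function (ev) {
    var pixels = new Uint8ClampedArray(ev.data);
    for (var i = 0; i < pixels.length; i += 4) {
        var avg = (pixels[i] + pixels[i + 1] + pixels[i + 2]) / 3;
        pixels[i] = pixels[i + 1] = pixels[i + 2] = avg;
    }
    self.postMessage(pixels.buffer, [pixels.buffer]);
});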

Downsides of Web Workers

  • Cumbersome API
  • Separate file requirements (one workaround sketched below)
  • Not well-suited for large dynamic data structures
    • Structured clone is slow
  • Browser support
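One common mitigation for the separate-file requirement (a general pattern, not something covered in the talk) is to build the worker's source as a string and load it from a Blob URL:

// Build the worker's source in-page and load it from a Blob
// URL instead of a separate same-origin file
var source =
    'self.addEventListener("message", function (ev) {' +
    '    self.postMessage(ev.data * 2);' +
    '});';

var blob = new Blob([source], { type: 'application/javascript' });
var inlineWorker = new Worker(URL.createObjectURL(blob));

inlineWorker.addEventListener('message', function (ev) {
    console.log(ev.data); // 10
});

inlineWorker.postMessage(5);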

Web Worker Abstractions

  • Operative.js
  • Parallel.js
  • Cataline.js

Operative.js

  • Simplest of the three
  • Construct an API of concurrent operations
  • Results passed to a callback
  • No support for transfer-of-ownership

var worker = operative({

    scalingFactor: 5,

    scaleByFactor: function (a, done) {
        done(a * this.scalingFactor);
    }
});

worker.scaleByFactor(10, function (result) {
    console.log(result); // 50
});

Parallel.js

  • More features
  • Fluent computation construction
  • Promise-based API
  • No support for transfer-of-ownership

var p = new Parallel([1, 2, 3, 4, 5], {
    env: {
        scalingFactor: 5
    }
});

p.spawn(scaleAll).then(function (vals) {
    console.log(vals); // 5, 10, 15, 20, 25
});

function scaleAll(vals) {
    return vals.map(function (val) {
        return val * global.env.scalingFactor;
    });
}

// also provides map/reduce operations
p
.map(scale)
.reduce(add)
.then(function (result) {
    console.log(result); // 75
});


function scale(val) {
    return val * global.env.scalingFactor;
}
                    
function add(vals) {
    return vals[0] + vals[1];
}

Cataline.js

  • Combines the best of Parallel.js and Operative.js
  • Build APIs of concurrent operations
  • Promise-based
  • Supports transfer-of-ownership
  • Provides custom events

var worker = new cw({
    
    // this function will be called to initialize the worker
    initialize: function () {
        this.scaleFactor = 5;
    },

    scaleByFactor: function (val) {
        return val * this.scaleFactor;
    },

    processHugeData: function (data) {
         // ...
    }
});

worker.scaleByFactor(5).then(function (result) {
    console.log(result); // 25
});

// supports batching operations over arrays
worker
    .batch
    .scaleByFactor([1, 2, 3, 4, 5])
    .then(function (result) {
        console.log(result); // 5, 10, 15, 20, 25
    });

// can pass an ArrayBuffer and transfer ownership
var data = new Uint32Array(10000000);
worker
    .processHugeData(data, [data.buffer])
    .then(function (result) {
        // ...
    });

Conclusion

  • Web Workers are a good start
  • Abstraction libraries make them easier to use
  • But ...
  • Web Workers aren't well-suited to many problems
  • We need to push for more/better APIs for concurrency and parallelization
