Empowering Performance Culture

THINK FAST, FIRST

Alfredo Lopez

👋 Hi, I'm Alfredo

You can call me Al...


Hearst is a media company with more than 360 businesses.

Its major interests include magazines, TV networks, newspapers and more.

Elle Logo

100+ websites

1 application

Bounce Rate ~2s

1. Get the executive buy-in

Measure

Measure

Measure

Synthetic Testing

"Synthetic testing tools use simulated users to provide information on uptime, the performance of critical business transactions, and most common navigation paths."

RUM (Real User Monitoring)

import Perfume from 'perfume.js';

const perfume = new Perfume({
  firstPaint: true,
  firstContentfulPaint: true,
  firstInputDelay: true,
  googleAnalytics: {
    enable: true,
    timingVar: "userId"
  }
});
// Perfume.js: First Paint 1482.00 ms
// Perfume.js: First Contentful Paint 2029.00 ms
// Perfume.js: First Input Delay 3.20 ms

What metrics should I really care about?

Graph

FID measures the time from when a user first interacts with your site (i.e. when they click a link or tap on a button) to the time when the browser is actually able to respond to that interaction.
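
In the browser, FID can be captured with the Event Timing API; a minimal sketch (the console call stands in for whatever RUM endpoint you report to):

// Capture First Input Delay from 'first-input' entries.
new PerformanceObserver((list) => {
  for (const entry of list.getEntries()) {
    // delay = when the browser could start handling the event
    //         minus when the user actually interacted
    const fid = entry.processingStart - entry.startTime;
    console.log(`FID: ${fid.toFixed(1)} ms (${entry.name})`);
    // send `fid` to your RUM endpoint here
  }
}).observe({ type: 'first-input', buffered: true });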

Compare to Competitors

Benchmark Mobile Speed Index

Performance Calculator
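
One way to benchmark mobile Speed Index against competitors is the PageSpeed Insights v5 API, which returns the same Lighthouse lab data per URL. A rough sketch; the URLs are placeholders:

// Compare mobile Speed Index across a list of sites via the PageSpeed Insights API.
const sites = ['https://www.example.com', 'https://www.competitor-a.com']; // placeholder URLs
const endpoint = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';

async function mobileSpeedIndex(url) {
  // add &key=YOUR_API_KEY for anything beyond light, occasional use
  const res = await fetch(`${endpoint}?url=${encodeURIComponent(url)}&strategy=mobile`);
  const data = await res.json();
  return data.lighthouseResult.audits['speed-index'].displayValue;
}

(async () => {
  for (const url of sites) {
    console.log(url, await mobileSpeedIndex(url));
  }
})();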

WHERE IS THE BOTTLENECK?

~350KB = 15s TTI

ALL BYTES ARE NOT EQUAL

WHY IS JAVASCRIPT DIFFERENT?

200 KB JS !== 200 KB JPG

DOWNLOAD -> PARSE -> COMPILE -> EXECUTE

2018: V8 PARSING IMPROVEMENTS

2019: V8 PARSING OPTIMIZATIONS

Before

360 KB uncompressed

MAKE   IT   FASTER

Does the user need this right now?

Infer it from user actions

// feed container
const feed = document.querySelector('.feed');
// get the closest element to the middle of the feed
const middleish = feed.children[Math.floor(feed.childElementCount / 2 )];

// create an IntersectionObserver
const io = new IntersectionObserver((entries) => {
    entries.forEach(async (entry) => {
        if (!entry.isIntersecting) return;
        io.disconnect();
        // import the module
        const infiniteLoad = await import('app/modules/infinite-load')
        infiniteLoad.setup(feed);
    })
})

// observe the middle element
io.observe(middleish);

Progressive Enhancements

CSS Target

<a class="nav-search-button" href="#searchoverlay" title="Search">
    <span class="icon icon-search"></span>
</a>
<section id=#searchoverlay">
    <!-- modal content !-->
</section>
#searchoverlay {
    position: fixed;
    width: 100vw;
    height: 100vh;
    transition: opacity 0.3s ease-in;
    z-index: -1;
}

#searchoverlay:target {
    z-index: 0;
    opacity: 1;
}

Details + Summary

<details class="search">
    <summary>Search</summary>
    <section>
      THIS IS A MODAL
    </section>
</details>
details summary::-webkit-details-marker {
    display:none;
}
details summary {
    cursor: pointer;
    outline: none !important;
    display: inline-block;
    /* etc */
}
details[open] > summary::before {
    position: fixed;
    top: 0;
    right: 0;
    bottom: 0;
    left: 0;
    cursor: default;
    content: " ";
    z-index: 99;
    background: rgba(27,31,35,0.5);
}
details > section {
    left: 50%;
    margin: 10vh auto;
    max-height: 80vh;
    max-width: 90vw;
    position: fixed;
    top: 0;
    transform: translateX(-50%);
}

Web Workers

// main.js (assumes a bundler or module workers, so `import` works in both files)
import * as Comlink from "comlink";

const worker = new Worker("worker.js");
// `app` lives in the worker; wrap() returns a proxy whose methods resolve asynchronously
const app = Comlink.wrap(worker);
const result = await app.doSomething();
console.log(result);

// worker.js
import * as Comlink from "comlink";

const app = {
  doSomething() {
    // perform expensive work off the main thread
    let result = 0;
    for (let i = 0; i < 1e7; i++) result += i;
    return result;
  }
};
Comlink.expose(app);

Pros

  • Offload work from the Main Thread
  • Parallelism between workers

Cons

  • Cannot access the document

...but postMessage is slow!

No, it depends.

"Even on the slowest devices, you can postMessage() objects up to 100KiB and stay within your 100ms response budget. If you have JS-driven animations, payloads up to 10KiB are risk-free."

Worker DOM (Alpha)

<head>
    <!-- worker-dom library -->
    <script src="index.js" defer></script>
</head>
<body>
    <section src="app.js" class="app-script"><p>Hello World!</p><input/></section>
    <script defer>
        document.addEventListener('DOMContentLoaded', function() {
            // MainThread is defined by index.js, worker.js is also part of the library
            MainThread.upgradeElement(document.getElementsByClassName('app-script')[0], './worker.js');
        }, false);
    </script>
</body>
// app.js
const p = document.createElement('p');
const text = document.createTextNode('Hello World!');
const input = document.createElement('input');

p.appendChild(text);
document.body.appendChild(p);
document.body.appendChild(input);

function toggle() {
  p.style.color = p.style.color === "green" ? "red" : "green";
}

input.addEventListener('input', event => {
  if (event.currentTarget.value === 'change') {
    toggle();
  }
}, false);

Worker DOM (Alpha)

After

27 KB uncompressed

~4 seconds faster FCP

Why is it still slow?

First Party vs Third Party

First Party: 11 Requests

Tag Manager: 7th Request

Third Party: 125 Requests

Optimizing Performance is like going to the dentist 

Serverless Proxy

Lambda Function

Loads the page with ?third-parties

Tag manager loads all scripts

Storage

Collect all third-party files and store them

In-memory DB

Store the links to the URL

On content invalidation, or at a scheduled time, trigger the function
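
A rough sketch of what that scheduled function could look like. Object storage and the in-memory DB are stubbed with Maps for illustration, and the page URL and ?third-parties flag are placeholders, not the production implementation:

// Scheduled handler: fetch the page with ?third-parties so the tag manager
// injects every script, copy each third-party file to our own storage,
// and remember the original URL -> proxied path mapping.
const objectStorage = new Map(); // stand-in for S3 / GCS
const urlMap = new Map();        // stand-in for Redis / Memcached

exports.handler = async () => {
  // 1. Load the page with all third parties enabled
  const html = await (await fetch('https://example.com/?third-parties')).text();

  // 2. Collect the external script URLs the tag manager added
  const scriptUrls = [...html.matchAll(/<script[^>]+src="(https?:\/\/[^"]+)"/g)]
    .map(([, src]) => src)
    .filter((src) => !src.includes('example.com')); // keep only third parties

  // 3. Copy each file to our own storage and record the mapping
  for (const url of scriptUrls) {
    const body = await (await fetch(url)).text();
    const key = `/third-party${new URL(url).pathname}`;
    objectStorage.set(key, body);
    urlMap.set(url, key); // original URL -> path served from our origin
  }
};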

Looking back...

Focused on delivering results.

Teams felt disconnected from the purpose.

How fast is fast enough?

2. What if performance is part of the development process?

BEST PRACTICES

Divide and Conquer

Server

Application

Development

Potential Bottlenecks

What do you intend to do?

Abstract frequent optimizations

getBoundingClientRect

// module-a.js
requestAnimationFrame(() => {
  const { right, bottom } = el.getBoundingClientRect();
});

// module-b.js
requestAnimationFrame(() => {
  el.classList.add('change-size'); // 😱 interleaved with the read above, this forces synchronous layout
});

Abstract frequent optimizations

IntersectionObserver

const io = new IntersectionObserver((entries) => {
  entries.forEach(entry => {
    //   entry.boundingClientRect
    //   entry.intersectionRatio
    //   entry.intersectionRect
    //   entry.isIntersecting
    //   entry.rootBounds
    //   entry.target
    //   entry.time
  });
});
// observe the element
io.observe(el);

            
const getBoundingClientRect = el =>
  new Promise((resolve) => {
    const io = new IntersectionObserver(([entry]) => {
      resolve(entry.boundingClientRect);
      io.disconnect();
    });
    io.observe(el);
  });
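
Call sites can then read an element's geometry without forcing a synchronous reflow:

// Usage: resolves with the element's rect on the observer's next callback.
getBoundingClientRect(el).then(({ width, height }) => {
  console.log(width, height);
});
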
const touchDevice = 'ontouchstart' in window;

const addListener = (target, type, listener) => {
  // switch click to touchstart on touch devices
  if (type === 'click' && touchDevice) {
    type = 'touchstart';
  }

  target.addEventListener(type, listener);
}

Is it worth it?

Is it part of the critical path?

How frequently will it run?

Is it a shared or base component?

Your one-off is someone's pattern

Looking back...

Developers were challenged by constraints.

Features took longer to finish.

Education without immediate action is quickly forgotten.

We plan for today's problems...

But how can we avoid blocking ourselves tomorrow?

3. Product and Design

1. Idea 2. Prototype 3. Test

4. Development 5. MVP 6. Optimize

This is where Performance mostly happens.

This is where we think it should happen.

This is where it should happen.

What if it rocks?

Performance Paths

Why are we doing this?

Align best practices with goals

Render vs Runtime

Mobile vs Desktop

Document your tradeoffs

Specify the reasons:

Is it by design?

Is it a technical limitation?

Plan your tradeoffs

Describe the scenarios a tradeoff would block

Prevent "Told you so" moments

How do we stay fast?

Performance Budgets are great, but...

It's rarely a single Pull Request

Predicting Regressions

Spotter

Will be open sourced soon

Spotter CI

Collect all open PRs daily with a configured label, e.g. "Ready to Merge"

PR #123

PR #456

PR #789

Merge them into a temp branch

PR #100

Open a PR and assign users from a config value
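
Spotter isn't open sourced yet, but the flow maps roughly onto the GitHub REST API. A hypothetical sketch with @octokit/rest; the label, base branch name, and assignees are config placeholders:

// Hypothetical sketch of the Spotter CI flow, not the actual implementation.
const { Octokit } = require('@octokit/rest');
const octokit = new Octokit({ auth: process.env.GITHUB_TOKEN });

const owner = 'owner';
const repo = 'name';
const label = 'Ready to Merge';
const baseBranch = 'main'; // placeholder default branch
const tempBranch = `spotter/${new Date().toISOString().slice(0, 10)}`;

async function run() {
  // 1. Collect open PRs that carry the configured label
  const { data: pulls } = await octokit.pulls.list({ owner, repo, state: 'open' });
  const ready = pulls.filter((pr) => pr.labels.some((l) => l.name === label));

  // 2. Create a temp branch from the default branch
  const { data: base } = await octokit.repos.getBranch({ owner, repo, branch: baseBranch });
  await octokit.git.createRef({ owner, repo, ref: `refs/heads/${tempBranch}`, sha: base.commit.sha });

  // 3. Merge every labelled PR branch into the temp branch
  for (const pr of ready) {
    await octokit.repos.merge({ owner, repo, base: tempBranch, head: pr.head.ref });
  }

  // 4. Open a PR from the temp branch and assign users from config
  const { data: pull } = await octokit.pulls.create({
    owner,
    repo,
    title: `Spotter: combined PRs for ${tempBranch}`,
    head: tempBranch,
    base: baseBranch,
  });
  await octokit.issues.addAssignees({ owner, repo, issue_number: pull.number, assignees: ['someone'] });
}

run();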

The Problem

  • Single URL only.

  • Runs during and blocks the build process.

  • No consolidated reports.

Lightkeeper

Maintain Lighthouse Budgets in Pull Request URLs.

...any GitHub Check Run, Deployment or Status

Flexible Configuration

{
  "baseUrl": "https://example.com",
  "ci": "[ci]",
  "type": "[type]",
  "settings": {
    "categories": {
      "performance": 70,
      "accessibility": 70,
      "best-practices": 70,
      "pwa": 70
    },
    "budgets": [
      {
        "resourceSizes": [
          {
            "resourceType": "script",
            "budget": 300
          }
        ],
        "resourceCounts": [
          {
            "resourceType": "third-party",
            "budget": 50
          }
        ]
      }
    ]
  },
  "routes": [
    "/",
    { "url": "/articles", "settings": {...} }
  ]
}
{
  "baseUrl": "https://example.com",
  "ci": "[ci]",
  "type": "[type]",
  "settings": {
    "categories": {
      "pwa": {
        "target": 90,
        "threshold": 40,
        "warning": 10
      }
    }
  },
  "sharedSettings": {
    "galleries": {
      "extends": true,
      "categories": {
        "pwa": {
          "threshold": 20
        }
      },
      "lighthouse": {
        "options": {
          "emulatedFormFactor": "desktop",
          "extraHeaders": {
            "X-CUSTOM-HEADER": "gallery-header"
          }
        }
      }
    }
  },
  "routes": [
    "/article/1/",
    {
      "url": "gallery/1",
      "settings": "galleries"
    },
    {
      "url": "gallery/2",
      "settings": {
        "extends": "galleries",
        "categories": {
          "pwa": {
            "target": 80
          }
        }
      }
    }
  ]
}

Concise Comments

Lightkeeper Bot 🤖

npm i -D lightkeeperbot
lightkeeperbot [optional-url] --pr=123 --repo=owner/name --config-path=my-custom-config.js
language: node_js
node_js: lts/*

jobs:
  include:
      - stage: build
        name: "Builds Pull Request"
        install:
          - npm ci
        script:
          # Creates the Pull Request image
          - sleep 6
        after_success:
          # deploy/create the url
          - npx lightkeeperbot https://lightkeeper-test.lopez.now.sh/
          - echo 'continue without waiting for response'
      - stage: ui-tests
        name: "Unit Tests"
        script:
          - sleep 20
          - echo 'ran unit tests'
      - stage: integration-tests
        name: "Integration Tests"
        script:
          - sleep 20
          - echo 'ran integration tests'

Lightkeeper Bot 🤖

Lightkeeper Demo

Try it out!

Performance Culture is more than code.

It's not about fast or slow

But the choice to wander out

with a clear path back.

Thanks for listening!

Think Fast, First: Empowering Performance Culture

By Alfredo Lopez

Making room for performance optimizations is hard. It means different things to different people and is often treated as a goal to achieve and maintain. But a performance culture is not about being slow or fast; it's about the ability to make tradeoffs and tackle competing priorities, with a clear path back.
