Apollo Client Caching in Depth

Ben Newman

GraphQL Summit

8 November 2018

{ github,
  twitter,
  instagram,
  facebook
}.com/benjamn 

Why is it so important to have a GraphQL client?

GraphQL has plenty of value if all you do is send raw HTTP requests to a server

Using curl:

curl \
  -X POST \
  -H "Content-Type: application/json" \
  --data '{ "query": "{ posts { title } }" }' \
  https://1jzxrj179.lp.gql.zone/graphql

{
  "data": {
    "posts": [{
      "title": "Introduction to GraphQL"
    }, {
      "title": "Welcome to Apollo"
    }, {
      "title": "Advanced GraphQL"
    }]
  }
}

Using fetch:

await fetch('https://1jzxrj179.lp.gql.zone/graphql', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ query: '{ posts { title } }' }),
}).then(res => res.json())

Try this in your browser console!

Better than REST:

  • One round trip instead of many

  • No overfetching or underfetching

  • One endpoint, endlessly flexible

  • Predictable result structure

  • Enforcement of schema types

  • Schema introspection

  • Developer tools like GraphiQL

Worse than REST?

GraphQL HTTP responses cannot be usefully cached by the browser

  • Exact queries and exact responses, perhaps

  • But nothing more useful than that

Because HTTP caching is mostly useless, GraphQL clients must reimplement caching themselves

And that becomes a huge advantage!

  • Queries that are subsets of cached queries benefit from the cache

  • Queries that are supersets of cached queries can be simplified

  • Queries that are logically equivalent to other queries…
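To make the first bullet concrete, here is a toy sketch (not Apollo's actual data structures) of how an entity-keyed cache can serve a narrower query from a broader cached result:

```javascript
// Toy field-level cache, for illustration only. Fields from a previous
// query are merged into per-entity records; any subset of those fields
// can then be read back without a network round-trip.
const entityCache = new Map();

function writeResult(id, fields) {
  // Merge newly received fields into the existing record for this entity.
  entityCache.set(id, Object.assign({}, entityCache.get(id), fields));
}

function readSubset(id, fieldNames) {
  const entity = entityCache.get(id);
  if (!entity) return null; // complete miss: would require a network request
  const result = {};
  for (const name of fieldNames) {
    if (!(name in entity)) return null; // partial miss
    result[name] = entity[name];
  }
  return result;
}

// A broader query cached { title, likes } earlier...
writeResult("Post:1", { title: "Introduction to GraphQL", likes: 7 });

// ...so a narrower query asking only for title is a cache hit:
readSubset("Post:1", ["title"]); // { title: "Introduction to GraphQL" }
```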

Only the client knows…

  • about @client resolvers

  • about optimistic mutation results

  • about local state written manually to the cache

  • about the contents of localStorage or IndexedDB

Generic HTTP caching will never solve these problems!

The more domain-specific your caching system is, the more efficient it can be

GraphQL gives us so many powerful tools to tame complexity…

Resolvers combine existing data sources while hiding the sins of past data models

Schemas elevate raw data by enforcing application-level object types and relationships

But resolvers can also sow chaos 

Chaos in point: FakerQL

Schemas impose useful constraints, but they do not guarantee your graph is stable or coherent

The perils of mocking

const typeDefs = gql`
  type Query {
    hello: String,
    document: Document
  }

  type Document {
    id: String!
    currentVersion: Version!
  }

  type Version {
    id: String!
    downloadUrl: String!
    editors: [User]
  }

  type User {
    id: String!
    name: String!
  }
`;
// assumes uuid() from the "uuid" package, e.g. import { v4 as uuid } from "uuid"
const resolvers = {
  Query: {
    hello: () => 'Hello world!',
    document: () => {
      return {
        id: 'foo'
      }
    }
  },
  Document: {
    currentVersion: async () => ({
      id: uuid()
    })
  },
  Version: {
    downloadUrl: () => 'http://example.com',
    editors: () => [{
      id: '1',
      name: 'Foo Bar'
    }, {
      id: '2',
      name: 'Bar Baz'
    }]
  }
};

The perils of mocking

const documentQuery = gql`
  query {
    document {
      id
      currentVersion {
        id
        downloadUrl 
      }
    }
  }
`;

const editorsQuery = gql`
  query {
    document {
      id
      currentVersion {
        id
        editors {
          name
        }
      }
    }
  }
`;

🤔

💭

🍌

If query results can be different any time a resolver is called, how can it be safe to cache query results on the client?

If you knew your GraphQL endpoint might behave like FakerQL, how could you justify caching anything?

Crucial insight:

The job of the cache is not to predict what the server would say if you refetched a given query right now

The job of the cache is to ingest previously received query result trees into an internal format that resembles your data graph

Image credit: Dhaivat Pandya

So that logically related (structurally similar) queries mutually benefit from the cache
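As a simplified sketch of what "ingesting into an internal format that resembles your data graph" means (illustrative only; Apollo's real normalized format differs in details), a nested result tree can be flattened into a table of entities keyed by id, with nested objects replaced by references:

```javascript
// Flatten a nested query result into a table of entity records keyed by
// "__typename:id", replacing nested entities with reference objects.
function normalize(obj, entities = {}) {
  const refKey = `${obj.__typename}:${obj.id}`;
  const record = {};
  for (const [key, value] of Object.entries(obj)) {
    record[key] =
      value && typeof value === "object" && value.id
        ? normalize(value, entities) // store the child, keep a reference
        : value;
  }
  // Merge so that later queries can add fields to the same entity.
  entities[refKey] = Object.assign({}, entities[refKey], record);
  return { __ref: refKey };
}

const entities = {};
normalize({
  __typename: "Document",
  id: "foo",
  currentVersion: {
    __typename: "Version",
    id: "v1",
    downloadUrl: "http://example.com",
  },
}, entities);

// entities now holds two flat records:
//   "Document:foo" -> { ..., currentVersion: { __ref: "Version:v1" } }
//   "Version:v1"   -> { ..., downloadUrl: "http://example.com" }
```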

And active queries can be updated automatically when new data arrive, based on logical dependencies

Further reading: GraphQL Concepts Visualized

Apollo DevTools

Once previous results are decomposed into this normalized graph structure, the cache can respond to any variation of a previous query, as long as the data are available

That flexibility remains valuable even if some of the data are slightly stale

The decision to discard or continue using “stale” data ultimately belongs to the application

Forcing data to remain always up-to-date eventually becomes a performance liability

Solutions to staleness should prioritize data that matter most to the user

Caching based on the structure of data rather than the syntax of queries allows the data fetching load of your application to grow with the amount of data needed, not the number of queries

The GraphQL client cache is the hub of that coordination

Which cache implementation should you use?

Anyone can implement the ApolloCache interface, and they have!

What aspects of client cache performance really matter?

  • Avoiding network round-trips

  • Avoiding unnecessary re-rendering

  • Efficiently updating watched queries when new data become available

  • Memory footprint

This cache maintains an immutable & normalized graph of the values received from your GraphQL server. It enables the cache to return direct references into the cached graph in order to satisfy queries. As a result, reads from the cache require minimal work (and can be optimized to constant-time lookups in some cases). The tradeoff is that results may include additional fields beyond those selected by a GraphQL query.

See yesterday's talk by Ian MacLeod for more details!

That tweet

Bouncing back from a distant last place, the @apollographql InMemoryCache is now neck-and-neck with Hermes on @convoyteam's own benchmark suite, which means the two fastest #GraphQL caching systems are both implementations of the ApolloCache interface:

How?

Ways to make existing software faster:

  • Stop doing unnecessary work

  • Do the same work faster

  • Parallelize the work

  • Prioritize work that matters now

  • Reuse results of previous work

  • Lower your standards?

Caching challenges

  • Staleness

    • What's the shelf life of a computation?

  • Sameness

    • When might the results of superficially different operations actually be the same?

  • Creepiness

    • Results worm their way into other parts of the system in spooky and indirect ways

  • Worthwhile-ness

    • Does the caching actually pay for itself?

A cache for a cache?

When readQuery is called, the cache must extract a tree-shaped query result from the normalized graph

Believe it or not, this is one of the most performance-sensitive parts of the cache!

Especially because broadcastQueries re-reads every watched query after any update

A cache for a cache?

What if we reused these result objects, instead of computing them from scratch every time?

Initial reads would not be any faster, but repeated reads could be nearly instantaneous!

Calling broadcastQueries with lots of query observers would be a lot less expensive

Easy example: data that never go stale



function fib(n) {
  if (n < 2) return n;
  return fib(n - 1) + fib(n - 2);
}

Easy example: data that never go stale



function fib(n) {
  if (n < 2) return n;
  return fib(n - 1) + fib(n - 2);
}

console.log(fib(7));  // 13
console.log(fib(8));  // 21
console.log(fib(9));  // 34
console.log(fib(10)); // 55
console.log(fib(78)); // ???

Easy example: data that never go stale

import { wrap } from "optimism"

const fib = wrap(function (n) {
  if (n < 2) return n;
  return fib(n - 1) + fib(n - 2);
});

console.log(fib(7));  // 13
console.log(fib(8));  // 21
console.log(fib(9));  // 34
console.log(fib(10)); // 55
console.log(fib(78)); // 8944394323791464
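Under the hood, wrap behaves roughly like a memoizing decorator. Here is a greatly simplified single-argument sketch (the real optimism library also tracks dependencies between wrapped functions, supports cache keys, and bounds cache size):

```javascript
// Simplified memoizer in the spirit of optimism's wrap: cache results per
// argument, and expose a dirty() method that forgets a cached entry.
function simpleWrap(fn) {
  const cacheMap = new Map();
  function wrapped(arg) {
    if (!cacheMap.has(arg)) cacheMap.set(arg, fn(arg));
    return cacheMap.get(arg);
  }
  wrapped.dirty = arg => cacheMap.delete(arg);
  return wrapped;
}

let calls = 0;
const fib = simpleWrap(n => {
  calls++;
  return n < 2 ? n : fib(n - 1) + fib(n - 2);
});

fib(78);            // fast: each n from 0..78 is computed exactly once
console.log(calls); // 79, instead of an astronomically large call count
```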

Less trivial: data that change without warning



function hashDirectory(dirPath) {
  const hashesByFile = {};

  readDirectory(dirPath).forEach(file => {
    const absPath = path.join(dirPath, file);
    if (isFile(absPath)) {
      hashesByFile[file] = hashFile(absPath);
    } else if (isDirectory(absPath)) {
      hashesByFile[file] = hashDirectory(absPath);
    }
  });

  return crypto.createHash("sha1")
    .update(JSON.stringify(hashesByFile))
    .digest("hex");
}

Less trivial: data that change without warning

import { wrap } from "optimism"

const hashDirectory = wrap(function (dirPath) {
  const hashesByFile = {};

  readDirectory(dirPath).forEach(file => {
    const absPath = path.join(dirPath, file);
    if (isFile(absPath)) {
      hashesByFile[file] = hashFile(absPath);
    } else if (isDirectory(absPath)) {
      hashesByFile[file] = hashDirectory(absPath);
    }
  });

  return crypto.createHash("sha1")
    .update(JSON.stringify(hashesByFile))
    .digest("hex");
});

What now??

The optimism API

import { wrap, defaultMakeCacheKey } from "optimism"

const cachedFunction = wrap(function originalFunction(a, b) {
  // Original function body...
}, {
  // Maximum number of cached values to retain.
  max: Math.pow(2, 16),

  // Optional function that can be used to simplify (or reduce
  // dimensionality) of the input arguments.
  makeCacheKey(a, b) {
    // Return a value that could be used as a key in a Map.
    // Returning nothing (undefined) forces the actual function
    // to be called, skipping all caching logic.
    // The default behavior works even if a or b are objects:
    return defaultMakeCacheKey(a, b);
  },

  // Optional function that can be used to observe changes that
  // might affect the validity of the cached value.
  subscribe(a, b) {
    const watcher = fs.watch(a, () => {
      // Invalidates the cached value for these arguments, after
      // transforming them with makeCacheKey. Idempotent.
      cachedFunction.dirty(a, b);
    });
    return () => watcher.close();
  }
});

Once more, with optimism



function readDirectory(dir) {
  return fs.readdirSync(dir);
}








function isDirectory(path) {
  try {
    return fs.statSync(path).isDirectory();
  } catch (e) {
    return false;
  }
}
function isFile(path) {
  try {
    return fs.statSync(path).isFile();
  } catch (e) {
    return false;
  }
}








function hashFile(path) {
  return crypto.createHash("sha1")
    .update(fs.readFileSync(path))
    .digest("hex");
}

Once more, with optimism

import { wrap } from "optimism"

const readDirectory = wrap(dir => {
  return fs.readdirSync(dir);
}, {
  subscribe(dir) {
    const watcher = fs.watch(dir, () => {
      readDirectory.dirty(dir);
    });
    return () => watcher.close();
  }
});

const isDirectory = wrap(path => {
  try {
    return fs.statSync(path).isDirectory();
  } catch (e) {
    return false;
  }
}, {
  subscribe(path) { ... }
});
const isFile = wrap(path => {
  try {
    return fs.statSync(path).isFile();
  } catch (e) {
    return false;
  }
}, {
  subscribe(path) {
    const watcher = fs.watch(path, () => {
      isFile.dirty(path);
    });
    return () => watcher.close();
  }
});

const hashFile = wrap(path => {
  return crypto.createHash("sha1")
    .update(fs.readFileSync(path))
    .digest("hex");
}, {
  subscribe(path) { ... }
});

Once more, with optimism

What happens when hashFile.dirty(path) is called?
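The answer, roughly: while a wrapped function runs, every other wrapped entry it reads records the caller as a "parent," and dirtying an entry recursively dirties those parents. So hashFile.dirty(path) also invalidates every cached hashDirectory result that consumed that file's hash. A greatly simplified sketch of the idea (the real optimism implementation differs):

```javascript
// Each cached entry remembers which entries read it (its parents).
// Dirtying an entry clears its value and cascades to its parents.
const stack = [];

function dirtyEntry(entry) {
  if (!entry.hasValue) return;
  entry.hasValue = false;
  entry.parents.forEach(dirtyEntry);
  entry.parents.clear();
}

function trackedWrap(fn) {
  const entries = new Map(); // arg -> { hasValue, value, parents }
  function wrapped(arg) {
    let entry = entries.get(arg);
    if (!entry) {
      entry = { hasValue: false, value: undefined, parents: new Set() };
      entries.set(arg, entry);
    }
    // Whoever is currently computing depends on this entry.
    const parent = stack[stack.length - 1];
    if (parent) entry.parents.add(parent);
    if (!entry.hasValue) {
      stack.push(entry);
      try { entry.value = fn(arg); entry.hasValue = true; }
      finally { stack.pop(); }
    }
    return entry.value;
  }
  wrapped.dirty = arg => { const e = entries.get(arg); if (e) dirtyEntry(e); };
  return wrapped;
}

// Hypothetical in-memory "files" standing in for the file system:
let fileReads = 0;
const fakeFiles = { "a.txt": "alpha", "b.txt": "beta" };

const hashFile = trackedWrap(path => {
  fileReads++;
  return `hash(${fakeFiles[path]})`;
});

const hashAll = trackedWrap(() =>
  ["a.txt", "b.txt"].map(hashFile).join("+"));

hashAll();                // computes both file hashes
fakeFiles["a.txt"] = "ALPHA";
hashFile.dirty("a.txt");  // cascades: hashAll's cached value is dirtied too
hashAll();                // recomputes a.txt's hash, reuses b.txt's
console.log(fileReads);   // 3
```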

How does this apply to apollo-cache-inmemory?

import { wrap } from "optimism";

type OptimisticWrapperFunction<
  T = (...args: any[]) => any
> = T & { dirty: T };

export class DepTrackingCache implements NormalizedCache {
  private depend: OptimisticWrapperFunction<(dataId: string) => StoreObject>;

  constructor(private data: NormalizedCacheObject = Object.create(null)) {
    this.depend = wrap((dataId: string) => this.data[dataId], {
      makeCacheKey(dataId: string) {
        return dataId;
      }
    });
  }

  public get(dataId: string): StoreObject {
    this.depend(dataId);
    return this.data[dataId];
  }

  public set(dataId: string, value: StoreObject) {
    const oldValue = this.data[dataId];
    if (value !== oldValue) {
      this.data[dataId] = value;
      this.depend.dirty(dataId);
    }
  }

  public delete(dataId: string): void {
    if (Object.prototype.hasOwnProperty.call(this.data, dataId)) {
      delete this.data[dataId];
      this.depend.dirty(dataId);
    }
  }
}

Nothing changes in a DepTrackingCache without these methods being called


Which makes DepTrackingCache an extremely convenient bottleneck for dependency tracking

How does this apply to apollo-cache-inmemory?

import { wrap, defaultMakeCacheKey } from "optimism"

export class StoreReader {
  private executeSelectionSet({
    selectionSet,
    rootValue,
    execContext,
  }: ExecSelectionSetOptions): ExecResult {
    const {
      fragmentMap,
      contextValue,
      variableValues: variables,
    } = execContext;

    const object: StoreObject = contextValue.store.get(rootValue.id);
    
    const finalResult: ExecResult = {
      result: {},
    };
    
    // Recursively populate finalResult.result...
    
    return finalResult;
  }

How does this apply to apollo-cache-inmemory?

import { wrap, defaultMakeCacheKey } from "optimism"

export class StoreReader {
  constructor() {
    const { executeSelectionSet } = this;
    this.executeSelectionSet = wrap((options: ExecSelectionSetOptions) => {
      return executeSelectionSet.call(this, options);
    }, {
      makeCacheKey({
        selectionSet,
        rootValue,
        execContext,
      }: ExecSelectionSetOptions) {
        if (execContext.contextValue.store instanceof DepTrackingCache) {
          return defaultMakeCacheKey(
            execContext.contextValue.store,
            execContext.query,
            selectionSet,
            JSON.stringify(execContext.variableValues),
            rootValue.id,
          );
        }
      }
    });
  }

Other internal optimizations

Application-level optimizations 🤝

  • Take advantage of === equality

    • React.PureComponent or React.memo or shouldComponentUpdate can help

    • If you must modify result objects from the cache, be sure to write them back immediately

  • Avoid using an unbounded number of query documents

    • Internal optimizations rely on one-time preprocessing of query documents
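The first bullet can be sketched in plain JavaScript (a stand-in for React.memo / React.PureComponent, for illustration only): because a result-caching cache read returns the same object for the same data, a shallow identity check is enough to skip re-rendering.

```javascript
// A PureComponent-style guard: only "re-render" when the props object
// actually changes identity (===).
let renders = 0;
let lastProps;

function maybeRender(props) {
  if (props === lastProps) return; // same reference: skip all work
  lastProps = props;
  renders++;
}

const cachedResult = { posts: [{ title: "Welcome to Apollo" }] };

maybeRender(cachedResult);        // renders
maybeRender(cachedResult);        // skipped: same reference from the cache
maybeRender({ ...cachedResult }); // a fresh copy forces a re-render

console.log(renders); // 2
```

This is why modifying a result object in place (without writing it back) is dangerous: the reference stays the same, so identity-based guards never notice the change.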

Future plans 🧙‍♀️🔮

  • Stability!

    • Please keep updating apollo-cache-inmemory and related packages whenever there's a new version

    • We're not going to make any more huge changes until the dust has completely settled

  • Broadcasting queries 📡 needs more work, but the 🔥 has been put out 🚒

  • Invalidation 🙅‍♀️, garbage collection ♻️, deletion ✂️ (3 sides of the same coin)

  • Documentation for optimism

thanks!