Charly POLY
Past
➡️ JobTeaser alumni
➡️ 1 year @ A line
Now
Senior Software Engineer at
The first collaborative platform for consulting, creation, and development for marketing projects.
updateModel, createModel, deleteModel
{
  models: {
    chats: {
      "8660f534-c425-4688-b4a9-d9ab11c6af85": { /* ... */ }
    }
  }
  // ...
}
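A minimal sketch of what such generic model actions and their reducer could look like (the action names come from the slide above; the shapes and the reducer itself are assumptions, not the actual codebase):

// Hypothetical generic "model" actions: every API resource is stored by type and id.
type ModelAction =
  | { type: 'createModel'; modelType: string; id: string; data: object }
  | { type: 'updateModel'; modelType: string; id: string; data: object }
  | { type: 'deleteModel'; modelType: string; id: string };

interface ModelsState {
  models: { [modelType: string]: { [id: string]: object } };
}

const initialState: ModelsState = { models: {} };

function modelsReducer(state: ModelsState = initialState, action: ModelAction): ModelsState {
  switch (action.type) {
    case 'createModel':
    case 'updateModel':
      // Merge the resource into its bucket, e.g. models.chats["8660f534-…"].
      return {
        models: {
          ...state.models,
          [action.modelType]: {
            ...state.models[action.modelType],
            [action.id]: { ...state.models[action.modelType]?.[action.id], ...action.data },
          },
        },
      };
    case 'deleteModel': {
      const { [action.id]: _removed, ...rest } = state.models[action.modelType] ?? {};
      return { models: { ...state.models, [action.modelType]: rest } };
    }
    default:
      return state;
  }
}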
The chat and the timeline components
- post with file
- post with text only
- post with image
- post with note attached
- post with video
For an average "list chats" query:
➡️ 20-50 chats of all types (without paging)
➡️ lots of n+1, n+2 requests per chat
➡️ lots of Redux store updates
➡️ lots of React component re-renders 💥💥💥
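To illustrate the cascade, a hypothetical sketch of what the data layer ended up doing (endpoints and field names are assumptions): each chat in the list triggers its own follow-up requests, and every response dispatches its own store update.

// Hypothetical REST-style loading showing the n+1 / n+2 cascade for a chat list.
async function loadChats(dispatch: (action: object) => void) {
  const chats: { id: string; projectId: string }[] =
    await fetch('/api/chats').then((r) => r.json()); // 1 request

  for (const chat of chats) {
    // n+1: one request per chat for its project
    const project = await fetch(`/api/projects/${chat.projectId}`).then((r) => r.json());
    // n+2: one more request per project for its owner
    const owner = await fetch(`/api/users/${project.ownerId}`).then((r) => r.json());

    // Every response triggers its own Redux dispatch, hence its own re-render.
    dispatch({ type: 'createModel', modelType: 'chats', id: chat.id, data: chat });
    dispatch({ type: 'createModel', modelType: 'projects', id: project.id, data: project });
    dispatch({ type: 'createModel', modelType: 'users', id: owner.id, data: owner });
  }
}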
paging
➡️ didn't solve the request issues
"includes" on the API side, with n+1 objects included in the response
➡️ does not resolve the n+2 queries issue (see the sketch below)
preload all chats in a dedicated /preload API endpoint
➡️ still some perf issues with realtime and update refetches
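Why "includes" was not enough, sketched with assumed endpoints: first-level relations come embedded in the response, but their own relations are still plain ids, so the n+2 requests remain.

// Hypothetical "includes" query: the project and author come embedded (no more n+1)…
async function loadChatsWithIncludes() {
  const { chats } = await fetch('/api/chats?include=project,author').then((r) => r.json());
  // …but second-level relations are still only ids, so one extra request per chat remains (n+2).
  for (const chat of chats) {
    await fetch(`/api/users/${chat.project.ownerId}`);
  }
  return chats;
}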
Hydra
a custom client-side relational cache with transactional Redux dispatch
➡️ discover API requests based on the response data shape
➡️ wait for all requests to finish before committing to Redux
➡️ on update, ensure Redux cache object relations are up to date
Example: a query on a chat can update a project object in the cache
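Hydra itself was internal; a rough sketch of the idea, where all names, the `*Id` relation convention, and the `BATCH` action are invented for illustration:

// 1. discover follow-up requests from the shape of each response,
// 2. wait for every request to finish,
// 3. commit everything to Redux in a single transactional dispatch.
async function hydraFetch(url: string, dispatch: (action: object) => void) {
  const pending: object[] = [];

  async function resolveResource(resource: Record<string, unknown>, modelType: string) {
    pending.push({ type: 'updateModel', modelType, id: resource.id, data: resource });
    // Any `*Id` field is treated as a relation to fetch (cycle handling omitted for brevity).
    for (const [key, value] of Object.entries(resource)) {
      if (key.endsWith('Id') && typeof value === 'string') {
        const relatedType = key.slice(0, -2) + 's'; // e.g. projectId -> projects
        const related = await fetch(`/api/${relatedType}/${value}`).then((r) => r.json());
        await resolveResource(related, relatedType); // a chat query can update a project in cache
      }
    }
  }

  const root = await fetch(url).then((r) => r.json());
  await resolveResource(root, 'chats');

  // Single transactional commit: one dispatch, one store update, one re-render.
  dispatch({ type: 'BATCH', actions: pending });
}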
Users now have an average of 80-100 chats
- client cache invalidated too many times (too aggressive) 🔥
- API too slow 🔥
➡️ specific chat query with server-side optimisation
➡️ no more nesting issues (up to 4 levels easily; see the query sketch below)
➡️ models/data state handled by Apollo using Observables
➡️ advanced caching strategies for better UX
➡️ Very flexible and composable API
➡️ Supports a custom GraphQL schema without "Relay edges"
➡️ more complete options on caching strategies
➡️ easier migration
➡️ possibility to have a local GraphQL schema (apollo-link-state)
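To make the "specific chat query" point concrete, a hypothetical nested query (field names are assumptions, not the actual schema): one request covers what previously took an n+1 / n+2 cascade, with nesting easily going 3-4 levels deep without extra round trips.

import { gql } from '@apollo/client';

export const LIST_CHATS = gql`
  query ListChats {
    chats {
      id
      project {          # level 2
        id
        name
      }
      lastMessage {      # level 2
        id
        body
        author {         # level 3
          id
          name
          avatar {       # level 4
            url
          }
        }
      }
    }
  }
`;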
"cache-first" (default)
"cache-and-network"
"network-only"
"cache-only"
"no-cache"