Recently, we needed to refactor and optimize a React project. Since we hadn't touched React for a while, today we optimize a simple keyword-search demo and record the process below.
It is optimized from three aspects:
- Reduce the frequency of triggering events - debounce keyword typing
- Reduce HTTP requests - cache results to avoid duplicate HTTP requests
- Cache eviction strategy - use LRU to optimize the cache
Reduce the trigger frequency of events - debounce
Debounce ensures that, within a given time window, only the last operation actually triggers the event handler.
The principle of debounce: maintain a timer, and invoke the function only after the specified delay has elapsed. If the event fires again within the delay, the previous timer is cancelled and reset, so only the last operation actually runs.
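Stripped of React specifics, the principle can be sketched in a few lines of plain JavaScript (the names here are illustrative):

```javascript
// Minimal debounce sketch: only the last call within `delay` ms runs.
function debounce(fn, delay) {
  let timer = null;
  return function (...args) {
    if (timer) clearTimeout(timer); // a newer call cancels the pending one
    timer = setTimeout(() => {
      fn.apply(this, args); // only the last call actually fires
    }, delay);
  };
}

// Usage: three rapid calls, only the last one runs.
const log = debounce(msg => console.log(msg), 50);
log('first');
log('second');
log('third'); // after 50 ms, prints "third" only
```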
The following is the debounced search code in a React component:
```jsx
// ...
handler = e => {
  let val = e.target.value;
  if (val) {
    this.search(val);
  }
  this.setState(() => ({ value: e.target.value }));
}

debounce = (fn, delay) => {
  let timer = null;
  return function (event) {
    timer && clearTimeout(timer);
    event.persist && event.persist(); // Keep the reference so the event can be accessed asynchronously
    timer = setTimeout(() => {
      fn.call(this, event);
    }, delay);
  };
}

onChangeHandler = this.debounce(this.handler, 1000)
// ...

render() {
  return (
    <div>
      <input
        // value cannot be set here
        defaultValue={this.state.value}
        onChange={e => this.onChangeHandler(e)}
        placeholder="Try to type in some text"
      />
      <div>
        <Suspense fallback="Loading">
          {this.renderMovies}
        </Suspense>
      </div>
    </div>
  );
}
```
Note: if you want to access the synthetic event object (SyntheticEvent) asynchronously, you need to call its persist() method, or make a shallow copy of the event object (const event = { ...event }), to keep a reference to it.
When a React event fires, what React passes to the event handler is a pooled instance of the synthetic event object. This means that after the event callback has run, the synthetic event object is reused and all of its properties are nullified. This is done for performance reasons, so you cannot access the event asynchronously. See the official React documentation on synthetic events.
```jsx
event.persist() // or: const event: SyntheticEvent = { ...event }
```
Another subtle point needs mentioning. To make an input a controlled element, the correct approach is: when binding value to the input, bind an onChange event at the same time to listen for data changes; otherwise React emits a warning about controlled components.
However, when you pass the synthetic event object asynchronously, an input bound with the value attribute will not update its displayed value (even though it is still a controlled element).
```jsx
// ...
event.persist();
timer = setTimeout(() => {
  fn.call(this, event); // Pass the event along
}, delay);
// ...

<input
  defaultValue={this.state.value}
  // value={this.state.value} With the value attribute, the displayed value does not change
  onChange={e => this.onChangeHandler(e)}
/>
```
Reduce HTTP requests
One way to reduce HTTP requests is to cache the results. If the url of the next request has not changed, the data is returned directly from the cache.
```javascript
import axios from 'axios';

const caches = {};

const axiosRequester = () => {
  let cancel;
  return async url => {
    if (cancel) {
      cancel.cancel();
    }
    cancel = axios.CancelToken.source();
    try {
      if (caches[url]) {
        // If this url was requested before, skip the request
        // and return the previously cached data
        return caches[url];
      }
      // cancelToken belongs in the config (third) argument of axios.post
      const res = await axios.post(url, null, { cancelToken: cancel.token });
      const result = res.data.result;
      caches[url] = result; // Store the result keyed by url
      return result;
    } catch (error) {
      if (axios.isCancel(error)) {
        console.log('Request canceled', error.message);
      } else {
        console.log(error.message);
      }
    }
  };
};

export const _search = axiosRequester();
```
When making an HTTP request with axios, first check whether the data for the url is already cached. On a hit, return the data straight from the cache; on a miss, issue the HTTP request and save the returned result into the caches object as a key-value pair.
Cache eviction strategy - LRU
Since cache space is limited, data cannot be stored indefinitely; once storage grows past a threshold, it can exhaust memory. So when caching data, we need to optimize the cache and clear out entries that are unlikely to be used again.
Here we use the same cache eviction mechanism as Vue's keep-alive component: LRU.
LRU - least recently used policy
- Using time as the reference: if data has been accessed recently, it is more likely to be accessed again in the future. If we record the data in an array, then whenever an item is accessed we move it to the end of the array, marking it as most recently used. When the cache overflows, we delete the item at the head of the array, i.e. remove the least recently used data.
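The "move to the end on access, evict the head on overflow" behavior can be sketched on a plain array (the `touch` helper and the threshold of 3 are illustrative, not part of the article's code):

```javascript
// Minimal sketch of the LRU idea on a plain array.
const MAX = 3;
const keys = [];

function touch(key) {
  const i = keys.indexOf(key);
  if (i > -1) keys.splice(i, 1); // already present: pull it out...
  keys.push(key);                // ...and move it to the end (most recently used)
  if (keys.length > MAX) {
    keys.shift();                // overflow: evict the head (least recently used)
  }
}

touch('a'); touch('b'); touch('c');
touch('a');        // 'a' becomes most recently used
touch('d');        // overflow: evicts 'b' (the head)
console.log(keys); // [ 'c', 'a', 'd' ]
```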
To implement the LRU strategy, we need an array to store the key of the cache object:
```javascript
const keys = [];
```
A threshold should be set to cap the number of entries the cache can hold:
```javascript
const MAXIMUN_CACHES = 20;
```
You also need a utility function, remove, to delete an item from the keys array:
```javascript
function remove(arr, item) {
  if (arr.length) {
    var index = arr.indexOf(item);
    if (index > -1) {
      return arr.splice(index, 1);
    }
  }
}
```
Finally, a pruneCacheEntry function is implemented to delete the least recently used data (the first item of the keys array):
```javascript
// When keys overflows, prune the entry at the head of the array
if (keys.length > parseInt(MAXIMUN_CACHES)) {
  pruneCacheEntry(caches, keys[0], keys);
}

// ...

// Delete the least recently used data
function pruneCacheEntry(caches, key, keys) {
  caches[key] = null;  // Clear the cached data
  delete caches[key];  // Delete the cache key
  remove(keys, key);   // Remove the key from the keys array
}
```
Finally, input debouncing is combined with the LRU cache to produce the optimized search function.
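As a runnable sketch (not the article's exact final code), the cache from `axiosRequester` and the LRU pruning above can be combined as follows. Here `fetcher` is a stand-in for the axios call, and in the real component the input handler invoking `search` would additionally be wrapped with `debounce` as shown earlier:

```javascript
// LRU-capped request cache, reusing the article's names.
const MAXIMUN_CACHES = 3; // small threshold for demonstration
const caches = {};
const keys = [];

function remove(arr, item) {
  const index = arr.indexOf(item);
  if (index > -1) arr.splice(index, 1);
}

function pruneCacheEntry(caches, key, keys) {
  delete caches[key]; // drop the cached data
  remove(keys, key);  // and its key
}

const makeSearch = fetcher => async url => {
  if (caches[url]) {
    remove(keys, url); // cache hit: move the url to the end (most recently used)
    keys.push(url);
    return caches[url];
  }
  const result = await fetcher(url);
  caches[url] = result; // cache miss: store the result keyed by url
  keys.push(url);
  if (keys.length > MAXIMUN_CACHES) {
    pruneCacheEntry(caches, keys[0], keys); // evict the least recently used entry
  }
  return result;
};

// Usage with a stub fetcher:
const search = makeSearch(async url => `result for ${url}`);
(async () => {
  await search('/a'); await search('/b'); await search('/c');
  await search('/a'); // hit: '/a' becomes most recently used
  await search('/d'); // overflow: '/b' (the head) is evicted
  console.log(keys);  // [ '/c', '/a', '/d' ]
})();
```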
Same series of articles:
- Performance optimization brochure - asynchronous stack tracing: why await is better than Promise
- Performance optimization brochure - classified construction: make good use of webpack hash
- Performance optimization brochure - Improve Web response speed: optimize your CDN performance
- Performance optimization brochure - programmable cache: Service Workers
- Performance optimization brochure - make pages render earlier: use preload to improve resource loading priority