What is the keep-alive component?
<keep-alive> is a built-in Vue component that wraps other components in order to cache their instances.
The official Vue documentation notes:
- <keep-alive> is an abstract component: it does not render a DOM element itself, nor does it appear in the component's parent chain.
- When a component is toggled inside <keep-alive>, its activated and deactivated lifecycle hooks are invoked accordingly.
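Concretely, a typical usage looks something like this (a sketch; currentTab, Profile, and Settings are hypothetical names):

```html
<!-- Only components named Profile or Settings will have their instances cached -->
<keep-alive include="Profile,Settings">
  <component :is="currentTab"></component>
</keep-alive>
```

Switching away from a cached component fires its deactivated hook instead of destroying it; switching back fires activated instead of re-running created and mounted.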
How the keep-alive component works
Matching and filtering
- include - String or regular expression. Only components with matching names will be cached.
- exclude - String or regular expression. Any component with a matching name will not be cached.
Initialization
- The created hook initializes two variables, cache and keys.
- The mounted hook watches the include and exclude props for changes.
```javascript
created () {
  this.cache = Object.create(null)
  this.keys = []
},
mounted () {
  this.$watch('include', val => {
    pruneCache(this, name => matches(val, name))
  })
  this.$watch('exclude', val => {
    pruneCache(this, name => !matches(val, name))
  })
},
```
As you can see, the pruneCache function is triggered whenever the value of include or exclude changes, but the filtering criterion is determined by the return value of the matches function, so let's look at that first.
Filter condition
```javascript
function matches (pattern: string | RegExp | Array<string>, name: string): boolean {
  if (Array.isArray(pattern)) {
    return pattern.indexOf(name) > -1
  } else if (typeof pattern === 'string') {
    return pattern.split(',').indexOf(name) > -1
  } else if (isRegExp(pattern)) {
    return pattern.test(name)
  }
  /* istanbul ignore next */
  return false
}
```
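The three pattern forms are easy to try in isolation. The sketch below is a standalone copy of matches, with Vue's internal isRegExp helper and the Flow type annotations replaced by an instanceof check:

```javascript
// Standalone matches: instanceof RegExp stands in for Vue's isRegExp helper
function matches (pattern, name) {
  if (Array.isArray(pattern)) {
    return pattern.indexOf(name) > -1
  } else if (typeof pattern === 'string') {
    return pattern.split(',').indexOf(name) > -1
  } else if (pattern instanceof RegExp) {
    return pattern.test(name)
  }
  return false
}

console.log(matches('a,b', 'a'))          // true  (comma-separated string)
console.log(matches(['a', 'b'], 'c'))     // false (array of names)
console.log(matches(/^my-/, 'my-widget')) // true  (regular expression)
```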
As you can see, pattern can take one of three types:
- string
- RegExp
- Array<string>
The function then checks whether name falls within the set described by pattern and returns true or false accordingly.
Cache pruning
The values of include and exclude can change over time, and each time the filter condition changes, the keys that no longer meet the criteria need to be evicted from the existing cache.
```javascript
function pruneCache (keepAliveInstance: any, filter: Function) {
  const { cache, keys, _vnode } = keepAliveInstance
  for (const key in cache) {
    const cachedNode: ?VNode = cache[key]
    if (cachedNode) {
      const name: ?string = getComponentName(cachedNode.componentOptions)
      if (name && !filter(name)) {
        pruneCacheEntry(cache, key, keys, _vnode)
      }
    }
  }
}
```
You can see that pruneCacheEntry is then called to remove the keys that no longer qualify for caching.
```javascript
function pruneCacheEntry (
  cache: VNodeCache,
  key: string,
  keys: Array<string>,
  current?: VNode
) {
  const cached = cache[key]
  if (cached && (!current || cached.tag !== current.tag)) {
    cached.componentInstance.$destroy()
  }
  cache[key] = null
  remove(keys, key)
}
```
Here's a detail:
```javascript
// If the current vnode is empty, or cached and current are not the same node,
// destroy the cached instance by calling $destroy
if (cached && (!current || cached.tag !== current.tag)) {
  cached.componentInstance.$destroy()
}
```
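The interaction between pruneCache and pruneCacheEntry can be sketched without real VNodes. In this simplified model the cache maps keys straight to component names and no instance is destroyed; the real code reads the name off the cached vnode and calls $destroy:

```javascript
// Simplified pruneCache: cache maps key -> component name,
// keys tracks insertion order (as in Vue 2.x)
function pruneCache (cache, keys, filter) {
  for (const key in cache) {
    const name = cache[key]
    if (name && !filter(name)) {
      // The real pruneCacheEntry also calls $destroy on the cached instance
      cache[key] = null
      keys.splice(keys.indexOf(key), 1)
    }
  }
}

const cache = { k1: 'Profile', k2: 'Settings', k3: 'Feed' }
const keys = ['k1', 'k2', 'k3']

// Suppose include changes to 'Profile,Settings': Feed no longer qualifies
pruneCache(cache, keys, name => 'Profile,Settings'.split(',').indexOf(name) > -1)

console.log(keys)     // ['k1', 'k2']
console.log(cache.k3) // null
```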
Filtering at render time
What we discussed above covers the case where the filter conditions change, but what if they don't? In fact, the <keep-alive> component also filters components on every render, ensuring that only components that pass the filter condition are cached.
```javascript
render () {
  ...
  if (
    // not included
    (include && (!name || !matches(include, name))) ||
    // excluded
    (exclude && name && matches(exclude, name))
  ) {
    return vnode
  }
  ...
}
```
Cache eviction strategy
In addition to the two props above, the <keep-alive> component accepts one more: max.
The documentation describes its destruction rule: the most recently accessed instances are kept, and when the limit is exceeded the least recently accessed instance is destroyed. This is a classic cache eviction strategy: the LRU (Least Recently Used) algorithm.
More detailed rules:
- No instance is destroyed until the cache limit (max) is reached
- The most recently accessed entry has the lowest eviction priority
- The least recently accessed entry has the highest eviction priority
This means we need two data structures to implement the algorithm: one to store the cached entries, and one to record how recently each entry was accessed.
Similarities and differences between Vue 2.x and Vue 3.x implementations
Vue 2.x uses cache to store the cached vnodes and keys to track how recently each cached entry was accessed, where cache is a plain object and keys is an array.
```javascript
const { cache, keys } = this
const key: ?string = vnode.key == null
  // same constructor may get registered as different local components
  // so cid alone is not enough (#3269)
  ? componentOptions.Ctor.cid + (componentOptions.tag ? `::${componentOptions.tag}` : '')
  : vnode.key
if (cache[key]) {
  vnode.componentInstance = cache[key].componentInstance
  // make current key freshest
  remove(keys, key)
  keys.push(key)
} else {
  cache[key] = vnode
  keys.push(key)
  // prune oldest entry
  if (this.max && keys.length > parseInt(this.max)) {
    pruneCacheEntry(cache, keys[0], keys, this._vnode)
  }
}
```
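The "make current key freshest" step is the heart of the LRU bookkeeping: on a cache hit the key is removed from wherever it sits and pushed to the end, so keys[0] is always the least recently used entry. A minimal sketch of that move:

```javascript
// keys is ordered oldest -> freshest; a cache hit moves the key to the end
function makeFreshest (keys, key) {
  const i = keys.indexOf(key)
  if (i > -1) keys.splice(i, 1) // this is what Vue's remove() helper does
  keys.push(key)
}

const keys = ['a', 'b', 'c']
makeFreshest(keys, 'a') // 'a' was just accessed again
console.log(keys)       // ['b', 'c', 'a'] -> 'b' is now the eviction candidate
```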
Vue 3.x still uses the LRU algorithm, and cache and keys play the same roles as in 2.x, but the data structures have changed:
- cache is now a Map
- keys is now a Set
```javascript
const key = vnode.key == null ? comp : vnode.key
const cachedVNode = cache.get(key)
// clone vnode if it's reused because we are going to mutate it
if (vnode.el) {
  vnode = cloneVNode(vnode)
}
cache.set(key, vnode)
if (cachedVNode) {
  // copy over mounted state
  vnode.el = cachedVNode.el
  vnode.component = cachedVNode.component
  if (vnode.transition) {
    // recursively update transition hooks on subTree
    setTransitionHooks(vnode, vnode.transition!)
  }
  // avoid vnode being mounted as fresh
  vnode.shapeFlag |= ShapeFlags.COMPONENT_KEPT_ALIVE
  // make this key the freshest
  keys.delete(key)
  keys.add(key)
} else {
  keys.add(key)
  // prune oldest entry
  if (max && keys.size > parseInt(max as string, 10)) {
    pruneCacheEntry(Array.from(keys)[0])
  }
}
```
Vue 3.x clearly wins on time complexity: deleting a key from a Set takes O(1) time, while removing a key from an array takes O(n) time.
However, even the Vue 3.x code has a small wart, as in this line:

```javascript
pruneCacheEntry(Array.from(keys)[0])
```

Converting keys to an array just to read its first element is an O(n) operation. (The same element could be read in O(1) via keys.values().next().value.) This is most likely a deliberate trade-off in favour of code size and readability, since caching a very large number of components is rare in practice.
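As an aside, because a JavaScript Map iterates its keys in insertion order, both roles can even be collapsed into a single Map. This is a minimal sketch of the idea, not how Vue implements it:

```javascript
// Minimal LRU sketch using one Map: insertion order doubles as recency order
class SimpleLRU {
  constructor (max) {
    this.max = max
    this.map = new Map()
  }
  get (key) {
    if (!this.map.has(key)) return -1
    const val = this.map.get(key)
    // Re-insert to mark the key as most recently used
    this.map.delete(key)
    this.map.set(key, val)
    return val
  }
  put (key, val) {
    if (this.map.has(key)) this.map.delete(key)
    this.map.set(key, val)
    if (this.map.size > this.max) {
      // The first key in iteration order is the least recently used
      this.map.delete(this.map.keys().next().value)
    }
  }
}

const lru = new SimpleLRU(2)
lru.put('a', 1)
lru.put('b', 2)
lru.get('a')    // touch 'a'
lru.put('c', 3) // evicts 'b', the least recently used
console.log(lru.get('b')) // -1
console.log(lru.get('a')) // 1
```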
Extension: implementing a common LRU algorithm
In terms of implementation logic, the LRU algorithm needs to satisfy the following two points:
- Deleting and adding a cache entry takes O(1) time
- Looking up a cache entry takes O(1) time
A doubly linked circular list satisfies condition 1, and a hash table satisfies condition 2. Based on this, we can implement the LRU algorithm by combining a hash table with a doubly linked circular list. The implementation below is based on the LeetCode problem "LRU Cache"; interested readers may want to finish this article and then try writing it themselves.
Implementing the LRU algorithm with a hash table + doubly linked circular list
```javascript
/**
 * @param {number} capacity
 */
const LRUCache = function (capacity) {
  this.map = new Map()
  this.doubleLinkList = new DoubleLinkList()
  this.maxCapacity = capacity
  this.currCapacity = 0
}

/**
 * @param {number} val
 * @param {number} key
 */
const Node = function (val, key) {
  this.val = val
  this.key = key
  this.next = null
  this.prev = null
}

const DoubleLinkList = function () {
  this.head = null
  this.tail = null
}

/**
 * @param {Node} node
 * Add a node at the head of the list
 */
DoubleLinkList.prototype.addNode = function (node) {
  if (!this.head) {
    this.head = node
    this.tail = node
    this.head.next = this.tail
    this.head.prev = this.tail
    this.tail.next = this.head
    this.tail.prev = this.head
  } else {
    const next = this.head
    this.head = node
    this.head.next = next
    this.head.prev = this.tail
    this.tail.next = this.head
    next.prev = this.head
  }
}

/**
 * @param {Node} node
 * @return {void}
 * Delete a node from the doubly linked list
 */
DoubleLinkList.prototype.deleteNode = function (node) {
  // Special case: removing the only node empties the list
  // (without this, head/tail would keep pointing at the removed node)
  if (this.head === node && this.tail === node) {
    this.head = null
    this.tail = null
    return
  }
  const prev = node.prev
  const next = node.next
  prev.next = next
  next.prev = prev
  // Adjust head/tail if they pointed at the removed node
  if (this.head === node) {
    this.head = next
  }
  if (this.tail === node) {
    this.tail = prev
  }
}

/**
 * @param {number} key
 * @return {number}
 */
LRUCache.prototype.get = function (key) {
  const { map, doubleLinkList } = this
  // On access, move the node to the head of the list (freshest)
  if (map.has(key)) {
    const node = map.get(key)
    doubleLinkList.deleteNode(node)
    doubleLinkList.addNode(node)
    return node.val
  }
  return -1
}

/**
 * @param {number} key
 * @param {number} value
 * @return {void}
 */
LRUCache.prototype.put = function (key, value) {
  const { map, doubleLinkList } = this
  if (map.has(key)) {
    // The key already exists: update its value
    // and move its node to the head of the list
    const node = map.get(key)
    doubleLinkList.deleteNode(node)
    node.val = value
    doubleLinkList.addNode(node)
    map.set(key, node)
  } else {
    const node = new Node(value, key)
    if (this.currCapacity === this.maxCapacity) {
      // New key and the cache is full: evict the tail node first
      map.delete(doubleLinkList.tail.key)
      doubleLinkList.deleteNode(doubleLinkList.tail)
      // Then add the new key to the list and the map
      doubleLinkList.addNode(node)
      map.set(key, node)
    } else {
      // Cache not yet full: just add the new key to the list and the map
      doubleLinkList.addNode(node)
      map.set(key, node)
      this.currCapacity++
    }
  }
}
```