A factor is a component of a logical design that can be treated independently of the rest of the design, or at least relatively so.
A cache is one such factor.
Example A compiler is an interpreter with a cache. Logically the design can be factored as an interpreter coupled with a method for caching, but most compilers are not designed that way.
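A minimal Python sketch of that factoring, treating Python's own compile() call as the "translation" step (the cached_eval name and the cache layout are just illustrative): caching the compiled code object means repeated evaluation of the same source skips re-parsing.

 _code_cache = {}

 def cached_eval(source, env):
     # Look up a previously compiled code object for this source string.
     code = _code_cache.get(source)
     if code is None:
         code = compile(source, "<expr>", "eval")   # the expensive "translation" step
         _code_cache[source] = code
     return eval(code, env)

 print(cached_eval("x * x + 1", {"x": 3}))   # compiled, then evaluated
 print(cached_eval("x * x + 1", {"x": 5}))   # compiled form reused from the cache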
Example DNS name resolution uses a distributed cache. Logically the design can be factored as a method for relaying requests coupled with a method for caching.
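A toy sketch of the same factoring, with a hypothetical forward_query() standing in for the relaying step (real resolvers are rather more involved):

 import time

 def forward_query(name):
     # Hypothetical stand-in for relaying the request to an upstream resolver.
     print("forwarding query for", name)
     return "192.0.2.1", 300          # (address, ttl in seconds) -- made-up values

 _dns_cache = {}

 def resolve(name):
     entry = _dns_cache.get(name)
     if entry is not None and entry[1] > time.time():    # cached and not yet expired
         return entry[0]
     address, ttl = forward_query(name)                   # relay the request
     _dns_cache[name] = (address, time.time() + ttl)      # cache it locally
     return address

 resolve("example.com")   # relayed upstream
 resolve("example.com")   # answered from the local cache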
Example Dynamic programming is recursion with a cache. Logically (this is getting a bit repetitive) the design can be factored as a recursive algorithm coupled with a method for caching, but most textbooks on algorithms don't describe it that way.
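The familiar sketch: memoised Fibonacci in Python is just the naive recursion plus a dictionary cache.

 _fib_cache = {0: 0, 1: 1}

 def fib(n):
     # Plain recursion plus a cache: each subproblem is computed exactly once.
     if n not in _fib_cache:
         _fib_cache[n] = fib(n - 1) + fib(n - 2)
     return _fib_cache[n]

 print(fib(100))   # returns immediately; the uncached recursion would take ages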
Example LazyEvaluation can be 'derived' from StrictEvaluation by adding a cache: "the runtime overhead consists of: allocating and initializing an object, making a method call to trigger the calculation, testing for a cached result and caching the result if needed"
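A small Python sketch of the idea, where forcing a Thunk pays exactly the overhead described in the quote:

 class Thunk:
     """A delayed ('lazy') computation that caches its result once forced."""
     _UNSET = object()

     def __init__(self, compute):
         self._compute = compute            # allocating and initializing an object
         self._value = Thunk._UNSET

     def force(self):                       # a method call triggers the calculation
         if self._value is Thunk._UNSET:    # test for a cached result
             self._value = self._compute()  # ...and cache it if needed
         return self._value

 expensive = Thunk(lambda: sum(range(10_000_000)))
 print(expensive.force())   # computed on first use
 print(expensive.force())   # returned straight from the cache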
Help! Which design patterns is this related to?
It's an OptimizationPattern... CacheCalculations.
Advantages
A cache can often be added as an afterthought to an algorithm that runs too slowly. This optimisation is commonly called memoisation (or MemoIzation in the US :-)
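For instance, in Python the standard functools.lru_cache decorator can often be bolted onto an existing recursive function as exactly such an afterthought, without touching its body (the staircase example below is only illustrative):

 from functools import lru_cache

 @lru_cache(maxsize=None)   # the afterthought: one decorator, no change to the algorithm
 def ways_to_climb(steps):
     # Number of ways to climb a staircase taking 1 or 2 steps at a time.
     if steps < 0:
         return 0
     if steps <= 1:
         return 1
     return ways_to_climb(steps - 1) + ways_to_climb(steps - 2)

 print(ways_to_climb(60))   # fast with the cache; hopeless without it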
Things to consider
Using a HashTable as the underlying store. Cache element expiry: use of random expiry can make the chance of worst-case behaviour negligible.
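One sketch of the random-expiry idea, using a plain dictionary as the HashTable and expiring a randomly chosen entry when the cache is full (an assumption about what "random expiry" means here, not a prescription):

 import random

 class RandomExpiryCache:
     """HashTable-backed cache; when full, a randomly chosen entry is expired,
     so no fixed access pattern can reliably force worst-case behaviour."""

     def __init__(self, capacity):
         self.capacity = capacity
         self.table = {}

     def get(self, key, compute):
         if key in self.table:
             return self.table[key]                    # cache hit
         if len(self.table) >= self.capacity:
             victim = random.choice(list(self.table))  # random expiry of one entry
             del self.table[victim]
         value = compute(key)
         self.table[key] = value
         return value

 cache = RandomExpiryCache(capacity=100)
 print(cache.get(12, lambda k: k * k))   # computed and cached
 print(cache.get(12, lambda k: k * k))   # served from the HashTable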
See also: MemoizationInPython