Sometimes it is less costly to store a result that is needed often than it is to recompute it each time. Caching trades computing time for memory use. Of course, one still must /find/ the result (which can be non-trivial in large systems with many results) and retrieve it (which may be non-trivial when the "result" is large or memory is in short supply... which is always).
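
As a rough illustration, here is a minimal memoization sketch in Python; the function names and the dict-based cache are hypothetical, chosen only to make the time-for-memory trade concrete:

    # Minimal memoization sketch: spend memory (the cache dict) to avoid
    # recomputation. Names here are illustrative, not from any real system.

    _cache = {}  # maps argument -> previously computed result

    def expensive(n):
        """Stand-in for a costly computation."""
        return sum(i * i for i in range(n))

    def cached_expensive(n):
        # "Finding" the result is a dict lookup keyed on the argument;
        # in larger systems the key, and the lookup, can be far less trivial.
        if n in _cache:
            return _cache[n]      # retrieve the stored result
        result = expensive(n)     # pay the compute cost once
        _cache[n] = result        # spend memory to avoid paying it again
        return result

    if __name__ == "__main__":
        print(cached_expensive(10_000))  # computed
        print(cached_expensive(10_000))  # served from the cache

The unbounded dict sidesteps the "memory is in short supply" problem rather than solving it; Python's standard functools.lru_cache decorator is one way to cap the cache at a fixed number of entries and evict the least recently used ones.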