The Definition of Caching
Caching is the process of temporarily holding data in a fast-access store so it can be retrieved quickly. It is most often used to speed up access to frequently-read data, reducing the computation and I/O required to serve a high volume of reads.
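To make the idea concrete, here is a minimal sketch (written in TypeScript, not tied to any particular framework or library) of a read-through cache: the result of an expensive lookup is stored in a Map so that repeated reads of the same key skip the work entirely.

```typescript
// Minimal read-through cache sketch. The "expensive lookup" is a
// stand-in for a slow database query or computation.
const cache = new Map<string, string>();

function expensiveLookup(key: string): string {
  // Imagine a slow query or heavy computation here.
  return `value-for-${key}`;
}

function cachedLookup(key: string): string {
  const hit = cache.get(key);
  if (hit !== undefined) {
    return hit; // cache hit: served without redoing the work
  }
  const value = expensiveLookup(key); // cache miss: do the expensive work once
  cache.set(key, value);
  return value;
}
```

Note that this sketch never evicts anything; real caches bound their size or expire old entries so that memory use stays limited.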
Before diving deeper into the concept, let’s get an overview of the different layers of a software application. Caching can be applied at every layer.
Three-Tier Architecture
Software applications are generally structured as a three-tier (three-layer) architecture. The layers are:
1. The presentation layer:
This is the user interface, through which the user interacts with the application. The browser's localStorage API and caching within React are both examples of caching in frontend development (a sketch follows this list).
2. The application layer:
This layer is responsible for the business logic: making decisions, processing data, and controlling the flow of information between the presentation layer and the data layer.
3. The data layer:
This layer is where data is stored and persisted. Backend caching is often applied at this layer, temporarily holding a limited amount of data for fast retrieval.
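As a concrete illustration of the presentation-layer case, here is a hedged sketch of caching an API response in localStorage. The /api/profile endpoint and the one-hour expiry are assumptions made purely for illustration; they are not part of any particular application.

```typescript
// Presentation-layer caching sketch using localStorage (browser only).
// The endpoint and TTL below are illustrative assumptions.
const TTL_MS = 60 * 60 * 1000; // keep cached entries for one hour

async function getProfile(): Promise<unknown> {
  const cached = localStorage.getItem("profile");
  if (cached !== null) {
    const { savedAt, data } = JSON.parse(cached);
    if (Date.now() - savedAt < TTL_MS) {
      return data; // fresh enough: serve from the cache, no network call
    }
  }

  // Cache miss or stale entry: fetch from the server and store a copy.
  const response = await fetch("/api/profile");
  const data = await response.json();
  localStorage.setItem(
    "profile",
    JSON.stringify({ savedAt: Date.now(), data })
  );
  return data;
}
```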

Why It Is Relevant to Software Engineers
Caching is a widely used technique in software. It appears in web, mobile, and desktop applications, and throughout the entire flow of data, including in databases, middleware, and proxies.
A common example of caching on the web is the storage of HTML pages, images, and stylesheets after a web page has been downloaded for the first time. On later visits, the browser tries to load these resources from its cache first, before requesting them from the website's server again.
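How long the browser is allowed to reuse those cached resources is governed by HTTP caching headers such as Cache-Control. The sketch below, which assumes a browser environment with the standard Fetch API and a purely illustrative /styles/main.css path, simply inspects that header on a response.

```typescript
// Minimal sketch: inspect the caching policy a server attaches to a resource.
// The path "/styles/main.css" is an illustrative assumption.
async function inspectCachePolicy(): Promise<void> {
  const response = await fetch("/styles/main.css");

  // A value such as "max-age=86400" tells the browser it may serve this
  // stylesheet from its local cache for up to 24 hours before asking
  // the server for it again.
  console.log(response.headers.get("cache-control"));
}
```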