## What is Caching?
Caching is a technique that enhances application performance by temporarily storing frequently accessed data in high-speed storage such as RAM. This temporary storage is called a cache. Instead of repeatedly retrieving data from the original (primary) data source, the application can quickly access it from the cache. This reduces response time and improves the overall throughput of the application.

## Why do we need Caching?
Caching frequently accessed data at different layers offers several benefits:

- **Reduced Latency:** Storing frequently accessed data in high-speed storage makes data retrieval faster, reducing latency.
- **Reduced Load on Backend Systems:** Serving frequently accessed data from the cache reduces the load on backend systems.
- **Cost Efficiency:** It reduces the number of requests made to the primary data source or to external services, lowering operating costs.
- **Better User Experience:** It reduces the application's load time, making the application more responsive and increasing engagement.
- **Improved Scalability:** By reducing response times, caching allows the application to handle more requests simultaneously, improving the system's overall read throughput.

## Various Layers of Caching
Caching can be applied at various levels of the application stack, each serving a specific purpose and enhancing performance in its own way. Understanding these types of cache and their use cases is essential for designing an efficient system.

**CDN (Content Delivery Network) Cache:** A CDN cache stores copies of static assets (e.g., images, videos, CSS, JavaScript files) on geographically distributed servers closer to end users. The goal is to reduce latency and offload traffic from the origin server by serving cached content directly from the nearest CDN server.
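
As a rough illustration, the origin server usually opts static assets into CDN caching through HTTP response headers. The sketch below uses Flask purely as an example framework (not something this article prescribes); the `/assets` route and directory name are hypothetical.

```python
from flask import Flask, send_from_directory

app = Flask(__name__)

@app.route("/assets/<path:filename>")
def asset(filename):
    # Serve the static file and tell CDN edge servers (and browsers) that it
    # may be cached for up to a year, so repeat requests are answered from the
    # edge instead of hitting the origin server.
    response = send_from_directory("assets", filename)
    response.headers["Cache-Control"] = "public, max-age=31536000, immutable"
    return response

if __name__ == "__main__":
    app.run()
```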
**Cache-Aside:** In this strategy, the application code is responsible for managing the cache. It decides when to read from the cache and when to fetch data from the underlying data source. The sequence of actions in the Cache-Aside strategy is as follows (a short code sketch follows the list):
- Check cache for data (Cache-Hit): When a request is made, the application first queries the data in the cache. If the requested data is found (Cache-Hit), it is directly returned.
- Fetch from Data Source (Cache-Miss): If the data is not found in the cache (Cache-Miss), the application retrieves the required data from the underlying data source (e.g., a database or API).
- Update Cache with Fresh Data: After fetching the data from the primary data source, the application updates the cache with fresh data for future requests.
- Return the data to the client/caller: Finally, the application returns the data to the client/caller, whether from the cache or after being fetched from the data source.
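
Here is a minimal Cache-Aside sketch in Python. It is illustrative only: the `cache` dictionary stands in for a real cache such as Redis, `database` stands in for the primary data source, and the key names and TTL are made up.

```python
import time

# Stand-ins for a real cache (e.g., Redis) and a primary data source.
cache: dict[str, tuple[float, dict]] = {}          # key -> (expires_at, value)
database = {"user:1": {"id": 1, "name": "Ada"}}    # hypothetical data source

CACHE_TTL_SECONDS = 60

def get_user(key: str) -> dict | None:
    # 1. Check the cache for the data (Cache-Hit).
    entry = cache.get(key)
    if entry and entry[0] > time.time():
        return entry[1]

    # 2. Cache-Miss: fetch from the underlying data source.
    value = database.get(key)

    # 3. Update the cache with the fresh data for future requests.
    if value is not None:
        cache[key] = (time.time() + CACHE_TTL_SECONDS, value)

    # 4. Return the data to the client/caller.
    return value

print(get_user("user:1"))  # Cache-Miss: reads from the database, then caches
print(get_user("user:1"))  # Cache-Hit: served directly from the cache
```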
**Read Through:** In this strategy, the responsibility for managing the cache lies with the cache layer rather than the application code. The application calls the cache layer to request data, and the cache layer decides whether to serve it from the cache or fetch it from the underlying data source. The sequence of actions in the Read-Through strategy is as follows (a code sketch appears after the diagram):
- Check cache for Data (Cache-Hit): When a request is made, the cache layer first checks for the requested data in the cache. If the data is found (Cache Hit), it is directly returned from the cache.
- Fetch from Data Source (Cache-Miss): If the data is not found in the cache (Cache Miss), the cache layer automatically fetches the data from the underlying data source (e.g., a database or API).
- Update Cache with Fresh Data: Once the data is retrieved from the primary data source, the cache layer updates the cache with the fresh data for future requests.
- Return the Data to the Client/Caller: Finally, the data is returned to the client/caller, whether it was retrieved from the cache or fetched from the data source.
<br/>
<p align="center">
<img width="800px" src="/images/blog/cache-strategies/read-through.png" alt="Read Through Cache Strategy">
</p>
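
A minimal Read-Through sketch, assuming the cache layer is handed a loader callback that knows how to reach the data source; the class and function names here are hypothetical.

```python
import time
from typing import Any, Callable

class ReadThroughCache:
    """A toy cache layer that loads data from the data source itself on a miss."""

    def __init__(self, loader: Callable[[str], Any], ttl_seconds: int = 60):
        self._loader = loader                              # reaches the data source
        self._ttl = ttl_seconds
        self._store: dict[str, tuple[float, Any]] = {}     # key -> (expires_at, value)

    def get(self, key: str) -> Any:
        # 1. Check the cache for the data (Cache-Hit).
        entry = self._store.get(key)
        if entry and entry[0] > time.time():
            return entry[1]
        # 2. Cache-Miss: the cache layer itself fetches from the data source.
        value = self._loader(key)
        # 3. Update the cache with the fresh data.
        self._store[key] = (time.time() + self._ttl, value)
        # 4. Return the data to the client/caller.
        return value

# Hypothetical data-source access; in practice a database query or API call.
def load_user(key: str) -> dict:
    return {"key": key, "name": "Ada"}

users = ReadThroughCache(load_user)
print(users.get("user:1"))  # Cache-Miss: the cache layer loads and stores the value
print(users.get("user:1"))  # Cache-Hit: returned directly from the cache
```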
**Write Around:** In this strategy, the application code is responsible for managing the cache. When data is written or updated, it is sent directly to the primary data source, and the corresponding cache entry is cleared if present.
On the next read request, the process follows the Cache-Aside strategy. The sequence of actions in the Write-Around strategy is as follows (a code sketch appears after the diagram):
- Write/Update to Data Source and Clear Cache: When new data is written or existing data is updated, the application stores it in the primary data source and clears the relevant cache entry, if present.
- Check Cache for Data (Cache-Hit on Read): When a read request is made, the application first checks the cache. If the data is available (Cache-Hit), it is returned immediately.
- Fetch from Data Source (Cache-Miss on Read): If the requested data is not found in the cache (Cache-Miss), the application retrieves it from the underlying data source (e.g., database, API).
- Update Cache with Fresh Data: After fetching the data, the application updates the cache for quicker future access.
- Return the Data to the Client/Caller: The fetched data, whether from the cache or the primary data source, is returned to the client/caller.
<br/>
<p align="center">
<img width="800px" src="/images/blog/cache-strategies/write-around.png" alt="Write Around Cache Strategy">
</p>
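
A minimal Write-Around sketch, reusing the same kind of in-memory stand-ins as before; `write_user` and `read_user` are made-up names.

```python
import time

cache: dict[str, tuple[float, dict]] = {}   # key -> (expires_at, value)
database: dict[str, dict] = {}              # stand-in for the primary data source
CACHE_TTL_SECONDS = 60

def write_user(key: str, value: dict) -> None:
    # 1. Write/update the primary data source directly...
    database[key] = value
    # ...and clear the corresponding cache entry, if present.
    cache.pop(key, None)

def read_user(key: str) -> dict | None:
    # Reads follow the Cache-Aside sequence.
    entry = cache.get(key)
    if entry and entry[0] > time.time():
        return entry[1]                                        # Cache-Hit
    value = database.get(key)                                  # Cache-Miss: go to the source
    if value is not None:
        cache[key] = (time.time() + CACHE_TTL_SECONDS, value)  # refresh the cache
    return value

write_user("user:1", {"name": "Ada"})   # goes straight to the database
print(read_user("user:1"))              # first read repopulates the cache
```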
**Write Through:** In this strategy, the cache layer manages the data in both the cache and the underlying data source. When data is written or updated, the cache layer ensures the changes are synchronized in both locations. The sequence of actions in the Write-Through strategy is as follows (a code sketch appears after the diagram):
- Write/Update to Cache: When new data is written or existing data is updated, the application layer calls the cache layer, which first writes or updates the data in the cache.
- Synchronously Write/Update to Data Source: After updating the cache, the cache layer immediately writes or updates the data in the underlying data source (e.g., a database or API) in a synchronous manner to ensure consistency between the cache and the data source.
<br/>
<p align="center">
<img width="800px" src="/images/blog/cache-strategies/write-through.png" alt="Write Through Cache Strategy">
</p>
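
A minimal Write-Through sketch, assuming the cache layer is given a synchronous persistence callback; the names `WriteThroughCache` and `save_to_db` are hypothetical.

```python
from typing import Any, Callable

class WriteThroughCache:
    """A toy cache layer that writes to the cache and the data source together."""

    def __init__(self, persist: Callable[[str, Any], None]):
        self._persist = persist          # synchronous write to the data source
        self._store: dict[str, Any] = {}

    def put(self, key: str, value: Any) -> None:
        # 1. Write/update the data in the cache.
        self._store[key] = value
        # 2. Synchronously write the same data to the underlying data source,
        #    keeping the cache and the data source consistent.
        self._persist(key, value)

    def get(self, key: str) -> Any:
        return self._store.get(key)

# Hypothetical persistence function; in practice an INSERT/UPDATE or API call.
database: dict[str, Any] = {}

def save_to_db(key: str, value: Any) -> None:
    database[key] = value

users = WriteThroughCache(save_to_db)
users.put("user:1", {"name": "Ada"})
print(users.get("user:1"), database["user:1"])  # both copies are up to date
```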
**Write Behind:** In this strategy, the cache layer manages data updates asynchronously. When data is written or updated, the application layer calls the cache layer, which immediately updates the cache while delaying the write to the primary data source. The data is written to the underlying data source after a specified interval, with the cache maintaining a buffer of changes until the write occurs. The sequence of actions in the Write-Behind strategy is as follows (a code sketch follows the list):
- Write/Update to Cache: When new data is written or existing data is updated, the application layer calls the cache layer, which immediately updates the data in the cache.
- Buffer Data in Cache: The cache layer maintains a buffer of the updated data in the cache, temporarily holding the changes before committing them to the primary data source.
- Asynchronously Write/Update to Data Source: After a specified interval, the cache layer writes or updates the buffered data in the primary data source (e.g., a database or API), ensuring eventual consistency between the cache and the data source.
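
A minimal Write-Behind sketch, assuming a timer-based flush; `WriteBehindCache`, `save_batch`, and the one-second interval are illustrative choices, and a production implementation would also need failure handling and back-pressure.

```python
import threading
import time
from typing import Any, Callable

class WriteBehindCache:
    """A toy cache layer that updates the cache immediately and flushes
    buffered writes to the data source in the background."""

    def __init__(self, persist: Callable[[dict], None], flush_interval: float = 5.0):
        self._persist = persist
        self._interval = flush_interval
        self._store: dict[str, Any] = {}
        self._buffer: dict[str, Any] = {}    # writes not yet sent to the data source
        self._lock = threading.Lock()
        self._schedule_flush()

    def put(self, key: str, value: Any) -> None:
        # 1. Write/update the data in the cache immediately.
        # 2. Buffer the change instead of writing it to the data source now.
        with self._lock:
            self._store[key] = value
            self._buffer[key] = value

    def get(self, key: str) -> Any:
        with self._lock:
            return self._store.get(key)

    def _schedule_flush(self) -> None:
        timer = threading.Timer(self._interval, self._flush)
        timer.daemon = True
        timer.start()

    def _flush(self) -> None:
        # 3. After the interval, write the buffered changes to the data source.
        with self._lock:
            pending, self._buffer = self._buffer, {}
        if pending:
            self._persist(pending)
        self._schedule_flush()

# Hypothetical batch persistence; in practice a bulk database write.
database: dict[str, Any] = {}

def save_batch(batch: dict) -> None:
    database.update(batch)

users = WriteBehindCache(save_batch, flush_interval=1.0)
users.put("user:1", {"name": "Ada"})    # visible in the cache right away
time.sleep(1.5)                         # wait past one flush interval
print(database)                         # the buffered write has now been persisted
```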
In this article, we discussed what caching is, why we need it, the different layers of caching, and the various caching strategies used for server-side caching.
Finally, where to cache data and which caching strategy to use depends entirely on the application's requirements.
That was all for this article. I hope you found it useful!