---
# Using Redis Cache
[Redis](https://redis.io/) is a popular choice for distributed, in-memory caching.

Our implementation uses the [StackExchange.Redis library](https://stackexchange.github.io/StackExchange.Redis/) internally. It is compatible with on-premises Redis Cache instances as well as with the [Azure Redis Cache](https://azure.microsoft.com/en-us/services/cache/) cloud service. We tested our adapter with single-node, master/replica, and sharded topologies.
## Prerequisites

In theory, this component should work with any Redis implementation supported by `StackExchange.Redis`.

However, we only tested this component with the latest Redis version. Older versions and other implementations are not officially supported, except on an ad-hoc basis for customers with an enterprise support plan.

We performed tests with single-node, master/replica, and sharded cluster deployments.
## Configuring the Redis server

To prepare your Redis server for use with PostSharp caching:

1. Set up the eviction policy to `volatile-lru` or `volatile-random`. See https://redis.io/topics/lru-cache#eviction-policies for details.

   > [!CAUTION]
   > Eviction policies other than `volatile-lru` or `volatile-random` are not supported.

2. If you intend to enable events or use dependencies, configure keyspace notifications to include the `Exe` events (_key event_ notifications for expirations and evictions). See https://redis.io/topics/notifications#configuration for details.
### Example

`redis.conf`:

```text
maxmemory-policy volatile-lru
notify-keyspace-events "Exe"
```
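
If you prefer to check or apply these settings on a running server instead of editing `redis.conf`, you can use the standard Redis `CONFIG` commands from `redis-cli`. Note that runtime changes are not written back to `redis.conf` unless you also run `CONFIG REWRITE`, and that managed services such as Azure Redis Cache may expose these settings through their own configuration interface rather than through `CONFIG`.

```text
CONFIG SET maxmemory-policy volatile-lru
CONFIG SET notify-keyspace-events "Exe"
CONFIG GET maxmemory-policy
CONFIG GET notify-keyspace-events
```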
## Configuring the caching backend in PostSharp

To set up PostSharp to use Redis for caching:

1. Add a reference to the [PostSharp.Patterns.Caching.Redis](https://www.nuget.org/packages/PostSharp.Patterns.Caching.Redis/) package.

2. Create an instance of [StackExchange.Redis.ConnectionMultiplexer](https://stackexchange.github.io/StackExchange.Redis/Configuration).

3. Create an instance of the <xref:PostSharp.Patterns.Caching.Backends.Redis.RedisCachingBackend> class using the <xref:PostSharp.Patterns.Caching.Backends.Redis.RedisCachingBackend.Create*> factory method and assign the instance to <xref:PostSharp.Patterns.Caching.CachingServices.DefaultBackend>, passing a <xref:PostSharp.Patterns.Caching.Backends.Redis.RedisCachingBackendConfiguration> object.

4. Configure logging. See <xref:backends> to learn how to plug caching into your logging framework. <xref:PostSharp.Patterns.Caching.Backends.Redis.RedisCachingBackend> uses PostSharp Logging. While tuning performance, we recommend monitoring warnings.
> [!IMPORTANT]
> The caching backend must be set before any cached method is called for the first time.

> [!IMPORTANT]
> If one node enables the local cache, all nodes must enable events.

In-memory cache must be enabled globally for the whole back-end. It is not possible to enable it at the level of individual classes or methods.

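
Putting the steps above together, a minimal start-up sketch could look like the following. The exact `Create` overload and the available <xref:PostSharp.Patterns.Caching.Backends.Redis.RedisCachingBackendConfiguration> properties should be checked against the API reference; the connection string and the `Prefix` and `Database` values below are placeholders.

```csharp
using PostSharp.Patterns.Caching;
using PostSharp.Patterns.Caching.Backends.Redis;
using StackExchange.Redis;

internal static class CachingStartup
{
    // Call this once at application start-up, before any cached method runs.
    public static void ConfigureCaching()
    {
        // Step 2: connect to Redis (the connection string is a placeholder).
        ConnectionMultiplexer connection = ConnectionMultiplexer.Connect( "localhost:6379" );

        // Step 3: describe the backend. Prefix and Database are referenced elsewhere
        // in this article; the values used here are examples only.
        var configuration = new RedisCachingBackendConfiguration
        {
            Prefix = "myapp-cache",
            Database = 0
        };

        // Create the Redis backend and make it the default PostSharp caching backend.
        // A Create( connection, configuration ) overload is assumed here.
        CachingServices.DefaultBackend = RedisCachingBackend.Create( connection, configuration );

        // Step 4: configure PostSharp Logging here as well (see the logging documentation).
    }
}
```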
- The <xref:PostSharp.Patterns.Caching.Backends.Redis.RedisCacheDependencyGarbageCollector> component was temporarily disabled because of high system load.
- There was a race condition in setting a cache value (the version that loses becomes garbage).

## Resilience

As with any part of a distributed system, <xref:PostSharp.Patterns.Caching.Backends.Redis.RedisCachingBackend> is a complex component that must be tuned and monitored with care.
### Exception handling

Failures are highly likely if your system is overloaded. Here is how they are handled:

The following <xref:PostSharp.Patterns.Caching.Backends.Redis.RedisCacheDependencyGarbageCollectorOptions> properties must also be properly tuned:
- The <xref:PostSharp.Patterns.Caching.Backends.Redis.RedisCacheDependencyGarbageCollectorOptions.CacheCleanupDelay> property is the delay between the initialization of the component and the first cleanup, and then between two subsequent cache cleanups; it defaults to 4 hours. Cleaning up the database too frequently adds useless performance overhead, but doing it too late degrades performance even more. If the database contains too much garbage, Redis will start evicting _useful_ data, affecting your application's performance. However, it will never evict garbage. That is why you should increase the cache cleanup frequency if you see high memory usage or high levels of evictions.

- The <xref:PostSharp.Patterns.Caching.Backends.Redis.RedisCacheDependencyGarbageCollectorOptions.CacheCleanupOptions.CacheCleanupOptions> property affects the cleanup process. It is important to keep the cleanup slow enough to avoid impacting your application's performance, but fast enough to finish before Redis runs out of memory. The <xref:PostSharp.Patterns.Caching.Implementation.CacheCleanupOptions.WaitDelay> property is an artificial delay between processing each cache key, defaulting to 100 ms.

Note that you can run a manual cleanup by calling the <xref:PostSharp.Patterns.Caching.Implementation.CachingBackend.CleanUpAsync*?CachingBackend.CleanUpAsync> method. Do not run this method with the default options on a production database; these options are optimized for cleanup performance and may overload your server.
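
As a sketch, a throttled manual cleanup might look like this; the `CleanUpAsync` overload taking options and the exact type of <xref:PostSharp.Patterns.Caching.Implementation.CacheCleanupOptions.WaitDelay> are assumptions to verify against the API reference.

```csharp
using System;
using System.Threading.Tasks;
using PostSharp.Patterns.Caching;
using PostSharp.Patterns.Caching.Implementation;

internal static class CacheMaintenance
{
    public static async Task RunThrottledCleanupAsync()
    {
        // Slow the cleanup down so it does not compete with production traffic.
        // WaitDelay defaults to 100 ms; the TimeSpan type used here is an assumption.
        var options = new CacheCleanupOptions { WaitDelay = TimeSpan.FromMilliseconds( 500 ) };

        // Assumes an overload of CleanUpAsync that accepts CacheCleanupOptions.
        await CachingServices.DefaultBackend.CleanUpAsync( options );
    }
}
```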
### Monitoring

The following metrics are relevant for assessing the health of your caching setup:

- Number of cache evictions per second. A high number might indicate either insufficient caching memory or an ineffective caching strategy, i.e. caching things that are not worth it (too many cache misses).
- Number of cache expirations per second. A high number might indicate too-short expiration delays.
- Number of warnings per minute in the caching component. A high number indicates that your system is overloaded.
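
The eviction, expiration, and raw keyspace hit/miss counters can also be read directly from Redis, independently of PostSharp. They are cumulative since server start, so compute per-second rates by sampling them periodically, for example with `redis-cli`:

```text
redis-cli INFO stats | grep -E "evicted_keys|expired_keys|keyspace_hits|keyspace_misses"
```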
If you want to gather statistics about cache hits and misses, you can do so by implementing a <xref:PostSharp.Patterns.Caching.Implementation.CachingBackendEnhancer> that overrides the <xref:PostSharp.Patterns.Caching.Implementation.CachingBackend.GetItemCore*> and <xref:PostSharp.Patterns.Caching.Implementation.CachingBackend.GetItemAsyncCore*> methods (a `null` return value means a cache miss).
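
The following sketch shows what such an enhancer could look like. The overridden member signatures and the <xref:PostSharp.Patterns.Caching.Implementation.CachingBackendEnhancer> constructor shown here are assumptions based on the method names above; check the API reference before using this.

```csharp
using System.Threading;
using System.Threading.Tasks;
using PostSharp.Patterns.Caching.Implementation;

// Counts cache hits and misses by wrapping another backend.
// All member signatures below are illustrative assumptions.
public class StatisticsEnhancer : CachingBackendEnhancer
{
    private long hits;
    private long misses;

    public StatisticsEnhancer( CachingBackend underlyingBackend ) : base( underlyingBackend ) { }

    public long Hits => Interlocked.Read( ref this.hits );
    public long Misses => Interlocked.Read( ref this.misses );

    protected override CacheValue GetItemCore( string key, bool includeDependencies )
    {
        CacheValue value = base.GetItemCore( key, includeDependencies );
        this.Count( value );
        return value;
    }

    protected override async Task<CacheValue> GetItemAsyncCore( string key, bool includeDependencies, CancellationToken cancellationToken )
    {
        CacheValue value = await base.GetItemAsyncCore( key, includeDependencies, cancellationToken );
        this.Count( value );
        return value;
    }

    private void Count( CacheValue value )
    {
        // A null return value means a cache miss.
        if ( value == null )
        {
            Interlocked.Increment( ref this.misses );
        }
        else
        {
            Interlocked.Increment( ref this.hits );
        }
    }
}
```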
## Data schema and complexity analysis

### Data schema

When dependencies are enabled, <xref:PostSharp.Patterns.Caching.Backends.Redis.RedisCachingBackend> relies on these keys:

2. Item value: `<prefix>:<schema-version>:{<item-key>}:val:<item-version>`, a list with the values `[<item-value>, <item-sliding-expiration>, <tag0>, <tag1> ... <tagn>]`.
3. Forward dependencies: `<prefix>:<schema-version>:{<item-key>}:fdep:<item-version>`, a list of `<dependency-key>`.
4. Backward dependencies: `<prefix>:<schema-version>:{<dependency-key>}:bdep`, a hash set of `<item-version>:<item-key>`.

When dependencies are disabled, only the item value record is used.
- `<item-version>` is a randomly generated item version.
- `<dependency-key>` is either a dependency key or a cache item key, when the cache item is itself a dependency (recursive dependencies), where `{`, `}` and `:` have been escaped.
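
For illustration only, the keys of a single hypothetical cache item that depends on one dependency could look as follows, where `cache` stands for the prefix, `1` for the schema version, `GetProduct(42)` for the item key, `7f3a2c` for the random item version, and `product-42` for the dependency key (all of these values are made up):

```text
cache:1:{GetProduct(42)}:val:7f3a2c
cache:1:{GetProduct(42)}:fdep:7f3a2c
cache:1:{product-42}:bdep
```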
### Big O analysis

The following analysis uses these parameters:

- `Items` is the number of items.
- `Dependencies` is the number of dependencies and items (as items can act as dependencies if a cached method calls another cached method).
- `KeySize` is the average size of item or dependency keys (after compression, if enabled).
- `ValueSize` is the average size of item values (i.e. the serialized data).
- `Dependencies_Per_Item` is the average number of dependencies per item (first level, not recursive).
- `Items_Per_Dependency` is the average number of items that depend on a given dependency or item (first level, not recursive).
- `Recursive_Items_Per_Dependency` is the average number of items that depend on a given dependency or item, recursively.

Race conditions affecting performance can happen when several operations attempt to set the same key. In this case, Redis transactions are used to achieve consistency, and they might be repeated in case of a race. Adding items that share the same dependencies does not cause races and does not affect performance.
### Clearing the cache

To remove all cache keys, you can:

* Use the `FLUSHDB` Redis command to delete all keys in the selected database, even those that don’t match the prefix.
* Use the `SCAN <prefix>:*` command to identify all keys, then use `DEL` for each key.
* Use the <xref:PostSharp.Patterns.Caching.Implementation.CachingBackend.ClearAsync*> method, which does a `SCAN <prefix>:<schema-version>:*` command, then `UNLINK` for each key.
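
For example, the last option can be invoked from your own code against the configured backend; a parameterless `ClearAsync` overload is assumed here.

```csharp
using System.Threading.Tasks;
using PostSharp.Patterns.Caching;

internal static class CacheTools
{
    // Removes all keys under the configured prefix and schema version (SCAN + UNLINK).
    public static Task ClearAllAsync()
        => CachingServices.DefaultBackend.ClearAsync();
}
```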
## Troubleshooting

### Observing the cache

- Make sure that `LoggingServices.DefaultBackend` has been properly configured so that the logging messages generated by <xref:PostSharp.Patterns.Caching.Backends.Redis.RedisCachingBackend> are routed to a service where you can monitor them.

- If necessary, increase the logging verbosity to `Info` (suitable for production in troubleshooting situations) or `Debug` (extremely verbose and never suitable for production). If you don't see any message in `Debug` verbosity, it means that logging is not properly configured.
- Use the Redis `MONITOR` command in Redis CLI, or Redis Insight's profiler tool.

- Use the Redis `PSUBSCRIBE "__keyevent@0__:*"` command in Redis CLI, or use Redis Insight's Pub/Sub tool and subscribe to the pattern `__keyevent@0__:*`.
### Common issues

- **Problem: Cache accesses never hit, always miss.**

  Cause: <xref:PostSharp.Patterns.Caching.Backends.Redis.RedisCachingBackend> cannot connect to Redis and the `AbortOnConnectFail` option is set to `false`. You should see errors in your logs. If you don't, logging is not properly configured.

  Remedy: Check that you can connect to the Redis server, for instance using `connection.GetDatabase().Ping()`.
- **Problem: the collector component reports dozens of errors per minute.**

  Cause: the collector is overloaded because of excessive evictions and expirations.

  Remedies:

  - If expirations are excessive, increase the cache item expiry delay.
  - If evictions are excessive, remove caching from methods with a high miss ratio.
  - If the problem is intermittent, tune the <xref:PostSharp.Patterns.Caching.Backends.Redis.RedisCachingBackendConfiguration.BackgroundTasksMaxConcurrency> and <xref:PostSharp.Patterns.Caching.Backends.Redis.RedisCachingBackendConfiguration.BackgroundTasksOverloadedThreshold> properties.
- **Problem: InvalidCacheItemException or InvalidCastException are logged after an application upgrade.**

  Cause: the serialized forms of the new and old data classes are not compatible with each other, but the two versions of the application use the same <xref:PostSharp.Patterns.Caching.Backends.Redis.RedisCachingBackendConfiguration.Prefix> and <xref:PostSharp.Patterns.Caching.Backends.Redis.RedisCachingBackendConfiguration.Database> properties.

  Remedies:

  - Use a different value of the <xref:PostSharp.Patterns.Caching.Backends.Redis.RedisCachingBackendConfiguration.Prefix> or <xref:PostSharp.Patterns.Caching.Backends.Redis.RedisCachingBackendConfiguration.Database> property when you update cached classes.
  - Use a serializer that makes the data contract explicit, i.e. <xref:PostSharp.Patterns.Caching.Serializers.DataContractSerializer> or <xref:PostSharp.Patterns.Caching.Serializers.JsonCachingSerializer> instead of <xref:PostSharp.Patterns.Caching.Serializers.BinarySerializer> or <xref:PostSharp.Patterns.Caching.Serializers.PortableSerializer>, and maintain serialization compatibility when you update the classes.