Commit 8709ac6

Merge pull request #2054 from segmentio/fix-functions-cache-example
Fix functions cache example
2 parents: 45eb1c6 + a152fd0

File tree

1 file changed: +10 -4 lines changed

src/_includes/content/functions/runtime.md

Lines changed: 10 additions & 4 deletions
````diff
@@ -31,18 +31,24 @@ Only the [`crypto` Node.js module](https://nodejs.org/dist/latest-v10.x/docs/api
 
 ##### Caching
 
-Per-function global caching is available in the `cache` namespace. The following functions are available:
+Basic cache storage is available through the `cache` object, which has the following methods defined:
 
 - `cache.load(key: string, ttl: number, fn: async () => any): Promise<any>`
-  - Obtains a cached value for the provided `key`, invoking the callback if the value is missing or has expired. The `ttl` is the maximum duration in milliseconds the value can be cached. If omitted or set to `-1`, the value will have no expiry. There is no guarantee that a value will be retained in the cache for the provided duration, however. The cache space is limited, so efforts to minimize the cached value size will afford a higher cache hit ratio.
+  - Obtains a cached value for the provided `key`, invoking the callback if the value is missing or has expired. The `ttl` is the maximum duration in milliseconds the value can be cached. If omitted or set to `-1`, the value will have no expiry.
 - `cache.delete(key: string): void`
-  - Forcefully remove the value associated with the `key`.
+  - Immediately remove the value associated with the `key`.
 
+Some important notes about the cache:
+
+- When testing functions in the code editor, the cache will be empty because each test temporarily deploys a new instance of the function.
+- Values in the cache are not shared between concurrently-running function instances; they are process-local which means that high-volume functions will have many separate caches.
+- Values may be expunged at any time, even before the configured TTL is reached. This can happen due to memory pressure or normal scaling activity. Minimizing the size of cached values can improve your hit/miss ratio.
+- Functions that receive a low volume of traffic may be temporarily suspended, during which their caches will be emptied. In general, caches are best used for high-volume functions and with long TTLs.
 
 The following example gets a JSON value through the cache, only invoking the callback as needed:
 
 ```js
 const ttl = 5 * 60 * 1000 // 5 minutes
-const val = await cache.load("mycachekey", ttl, () => {
+const val = await cache.load("mycachekey", ttl, async () => {
   const res = await fetch("http://echo.jsontest.com/key/value/one/two")
   const data = await res.json()
   return data
````
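For context, here is a minimal, hypothetical sketch of how the corrected `cache.load` call might sit inside a complete Function handler, together with `cache.delete`. The `onTrack(event, settings)` signature, `cache`, and `fetch` come from the Functions runtime this file documents; the cache key, the endpoint, and the `forceRefresh` property are illustrative assumptions, not taken from the source.

```js
// Hypothetical sketch only: the cache key, endpoint, and forceRefresh flag
// are illustrative, not part of the documented runtime.
async function onTrack(event, settings) {
  const ttl = 5 * 60 * 1000 // cache entries live for at most 5 minutes

  // cache.load invokes the async callback only when "mycachekey" is missing
  // or has expired; otherwise the previously cached value is returned
  const val = await cache.load("mycachekey", ttl, async () => {
    const res = await fetch("http://echo.jsontest.com/key/value/one/two")
    return res.json()
  })
  console.log(val)

  // if an event signals that the upstream data changed, drop the entry so
  // the next cache.load re-fetches it
  if (event.properties && event.properties.forceRefresh) {
    cache.delete("mycachekey")
  }
}
```

Because the cache is process-local and entries can be evicted before the TTL expires, a handler like this should treat the cache strictly as an optimization: the callback must always be able to produce a correct value on its own.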
