title: Use the LangCache API and SDK
weight: 10
---

Use the [LangCache API]({{< relref "/develop/ai/langcache/api-reference" >}}) from your client app to store and retrieve LLM, RAG, or agent responses.

You can use any standard REST client or library to access the API. If your app is written in Python or JavaScript, you can also use the LangCache Software Development Kits (SDKs) to access the API:

- [LangCache SDK for Python](https://pypi.org/project/langcache/)
- [LangCache SDK for JavaScript](https://www.npmjs.com/package/@redis-ai/langcache)

To access the LangCache API, you need:

- LangCache API base URL
- LangCache service API key
- Cache ID

When you call the API, you need to pass the LangCache API key in the `Authorization` header as a Bearer token and the Cache ID as the `cacheId` path parameter.

For example, to search the cache using `cURL`:

```bash
curl -s -X POST "https://$HOST/v1/caches/$CACHE_ID/entries/search" \
  -H "accept: application/json" \
  -H "Authorization: Bearer $API_KEY" \
  -d '{ "prompt": "What is semantic caching" }'
```
The example expects several variables to be set in the shell:

- **$HOST** - the LangCache API base URL
- **$CACHE_ID** - the Cache ID of your cache
- **$API_KEY** - The LangCache API token
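You can make the same request from any HTTP library. For example, here is a minimal sketch of the same search call using Python's `requests` package, assuming the `HOST`, `CACHE_ID`, and `API_KEY` values above are available as environment variables:

```python
import os
import requests

# Same values as the cURL example above, read from environment variables.
host = os.environ["HOST"]          # LangCache API base URL
cache_id = os.environ["CACHE_ID"]  # Cache ID
api_key = os.environ["API_KEY"]    # LangCache service API key

# POST /v1/caches/{cacheId}/entries/search with the API key as a Bearer token.
resp = requests.post(
    f"https://{host}/v1/caches/{cache_id}/entries/search",
    headers={
        "accept": "application/json",
        "Authorization": f"Bearer {api_key}",
    },
    json={"prompt": "What is semantic caching"},
)
resp.raise_for_status()
print(resp.json())
```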
## Examples
### Search LangCache for similar responses
Use [`POST /v1/caches/{cacheId}/entries/search`]({{< relref "/develop/ai/langcache/api-reference#tag/Cache-Entries/operation/search" >}}) to search the cache for matching responses to a user prompt. Place this call in your client app right before you call your LLM's REST API. If LangCache returns a response, you can send that response back to the user instead of calling the LLM.

If LangCache does not return a response, you should call your LLM's REST API to generate a new response. After you get a response from the LLM, you can [store it in LangCache](#store-a-new-response-in-langcache) for future use.
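The sketch below shows this cache-aside flow in Python with `requests`. The `call_llm()` function, the `data` field in the search response, and the `POST /v1/caches/{cacheId}/entries` store call are assumptions for illustration; check the [API reference]({{< relref "/develop/ai/langcache/api-reference" >}}) for the exact operations and response shapes.

```python
import os
import requests

HOST = os.environ["HOST"]
CACHE_ID = os.environ["CACHE_ID"]
API_KEY = os.environ["API_KEY"]

BASE = f"https://{HOST}/v1/caches/{CACHE_ID}"
HEADERS = {"accept": "application/json", "Authorization": f"Bearer {API_KEY}"}


def call_llm(prompt: str) -> str:
    # Placeholder for your LLM's REST API call.
    raise NotImplementedError


def answer(prompt: str) -> str:
    # 1. Search LangCache for a semantically similar prompt.
    search = requests.post(f"{BASE}/entries/search", headers=HEADERS,
                           json={"prompt": prompt})
    search.raise_for_status()
    # Assumption: matching entries come back as a list of objects with a
    # "response" field; see the API reference for the exact response shape.
    matches = search.json().get("data", [])
    if matches:
        return matches[0]["response"]

    # 2. Cache miss: call the LLM, then store the new response for future use.
    response = call_llm(prompt)
    # Assumption: new entries are stored with POST /v1/caches/{cacheId}/entries.
    store = requests.post(f"{BASE}/entries", headers=HEADERS,
                          json={"prompt": prompt, "response": response})
    store.raise_for_status()
    return response
```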
You can also scope the responses returned from LangCache by adding an `attributes` object to the request. LangCache will only return responses that match the attributes you specify.
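For example, a scoped search request might look like the following sketch; the `language` attribute is only a placeholder, since the attributes you can use depend on how your cache is configured:

```python
import os
import requests

HOST = os.environ["HOST"]
CACHE_ID = os.environ["CACHE_ID"]
API_KEY = os.environ["API_KEY"]

# Search the cache, but only consider entries whose attributes match.
# The "language" attribute below is hypothetical; use the attributes
# defined for your own cache.
resp = requests.post(
    f"https://{HOST}/v1/caches/{CACHE_ID}/entries/search",
    headers={"accept": "application/json", "Authorization": f"Bearer {API_KEY}"},
    json={
        "prompt": "What is semantic caching",
        "attributes": {"language": "en"},
    },
)
resp.raise_for_status()
print(resp.json())
```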