
Commit 823a7c9

LangCache: Add LangCache API reference with apireference layout
1 parent 94ba80e commit 823a7c9

File tree

3 files changed: +637 −128 lines
Lines changed: 121 additions & 0 deletions
@@ -0,0 +1,121 @@
---
alwaysopen: false
categories:
- docs
- develop
- ai
description: Learn to use the Redis LangCache API for semantic caching.
hideListLinks: true
linktitle: API and SDK examples
title: Use the LangCache API and SDK
weight: 10
---

You can use the LangCache API from your client app to store and retrieve LLM, RAG, or agent responses.

To access the LangCache API, you need:

- LangCache API base URL
- LangCache service API key
- Cache ID

When you call the API, pass the LangCache API key in the `Authorization` header as a Bearer token and the Cache ID as the `cacheId` path parameter.

For example, to check the health of the cache using `cURL`:

```bash
curl -s -X GET "https://$HOST/v1/caches/$CACHE_ID/health" \
  -H "accept: application/json" \
  -H "Authorization: Bearer $API_KEY"
```

This example expects several variables to be set in the shell:

- **$HOST** - the LangCache API base URL
- **$CACHE_ID** - the Cache ID of your cache
- **$API_KEY** - the LangCache service API key

{{% info %}}
This example uses `cURL` and Linux shell scripts to demonstrate the API; you can use any standard REST client or library.
{{% /info %}}

You can also use the [LangCache SDKs](#langcache-sdk) for JavaScript and Python to access the API.

## API examples

### Search LangCache for similar responses

Use `POST /v1/caches/{cacheId}/entries/search` to search the cache for matching responses to a user prompt.

```sh
POST https://[host]/v1/caches/{cacheId}/entries/search
{
    "prompt": "User prompt text"
}
```

Place this call in your client app right before you call your LLM's REST API. If LangCache returns a response, you can send that response back to the user instead of calling the LLM.

If LangCache does not return a response, you should call your LLM's REST API to generate a new response. After you get a response from the LLM, you can [store it in LangCache](#store-a-new-response-in-langcache) for future use.

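
For example, here is a minimal cURL sketch of the search call, assuming the `$HOST`, `$CACHE_ID`, and `$API_KEY` shell variables from the health check example above and a standard JSON request body:

```bash
# Minimal sketch: search the cache for a response that matches the prompt.
# Assumes $HOST, $CACHE_ID, and $API_KEY are set as in the health check example.
curl -s -X POST "https://$HOST/v1/caches/$CACHE_ID/entries/search" \
  -H "accept: application/json" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $API_KEY" \
  -d '{
    "prompt": "User prompt text"
  }'
```
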
You can also scope the responses returned from LangCache by adding an `attributes` object to the request. LangCache will only return responses that match the attributes you specify.

```sh
POST https://[host]/v1/caches/{cacheId}/entries/search
{
    "prompt": "User prompt text",
    "attributes": {
        "customAttributeName": "customAttributeValue"
    }
}
```

### Store a new response in LangCache

Use `POST /v1/caches/{cacheId}/entries` to store a new response in the cache.

```sh
POST https://[host]/v1/caches/{cacheId}/entries
{
    "prompt": "User prompt text",
    "response": "LLM response text"
}
```

Place this call in your client app after you get a response from the LLM. This will store the response in the cache for future use.

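
The equivalent cURL sketch, under the same assumptions as the search example above:

```bash
# Minimal sketch: store a prompt/response pair in the cache.
# Assumes $HOST, $CACHE_ID, and $API_KEY are set as in the earlier examples.
curl -s -X POST "https://$HOST/v1/caches/$CACHE_ID/entries" \
  -H "accept: application/json" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $API_KEY" \
  -d '{
    "prompt": "User prompt text",
    "response": "LLM response text"
  }'
```
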
You can also store responses with custom attributes by adding an `attributes` object to the request.

```sh
POST https://[host]/v1/caches/{cacheId}/entries
{
    "prompt": "User prompt text",
    "response": "LLM response text",
    "attributes": {
        "customAttributeName": "customAttributeValue"
    }
}
```

### Delete cached responses

Use `DELETE /v1/caches/{cacheId}/entries/{entryId}` to delete a single cached response.

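
As a cURL sketch, again reusing the shell variables from earlier (here `$ENTRY_ID` is a hypothetical variable holding the ID of an existing cache entry):

```bash
# Minimal sketch: delete one cached entry by ID.
# $ENTRY_ID is a hypothetical variable holding an existing entry's ID.
curl -s -X DELETE "https://$HOST/v1/caches/$CACHE_ID/entries/$ENTRY_ID" \
  -H "accept: application/json" \
  -H "Authorization: Bearer $API_KEY"
```
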
You can also use `DELETE /v1/caches/{cacheId}/entries` to delete multiple cached responses at once. If you provide an `attributes` object, LangCache will delete all responses that match the attributes you specify.

```sh
DELETE https://[host]/v1/caches/{cacheId}/entries
{
    "attributes": {
        "customAttributeName": "customAttributeValue"
    }
}
```

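
Because this request sends a JSON body with the `DELETE` method, a cURL sketch passes the body with `-d` (same shell variables and assumptions as above):

```bash
# Minimal sketch: delete every cached entry that matches the given attributes.
curl -s -X DELETE "https://$HOST/v1/caches/$CACHE_ID/entries" \
  -H "accept: application/json" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $API_KEY" \
  -d '{
    "attributes": {
      "customAttributeName": "customAttributeValue"
    }
  }'
```
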
## LangCache SDK

If your app is written in JavaScript or Python, you can also use the LangCache Software Development Kits (SDKs) to access the API.

To learn how to use the LangCache SDKs, see:

- [LangCache SDK for JavaScript](https://www.npmjs.com/package/@redis-ai/langcache)
- [LangCache SDK for Python](https://pypi.org/project/langcache/)
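
Both SDKs are published under the package names shown in the links above, so installation should follow the usual pattern for each ecosystem (listed here as an assumption based on those registry pages):

```bash
# JavaScript: install the npm package linked above (assumed standard install).
npm install @redis-ai/langcache

# Python: install the PyPI package linked above (assumed standard install).
pip install langcache
```
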
Lines changed: 7 additions & 128 deletions
@@ -1,129 +1,8 @@
---
Title: LangCache REST API
linkTitle: API reference
layout: apireference
type: page
params:
  sourcefile: ./api.yaml
---
