Commit 685e31e

Merge pull request #1841 from redis/DOC-5475: LangCache: Add API YAML for API reference

2 parents 94ba80e + 033a11e

File tree: 4 files changed, +647 -129 lines

Lines changed: 122 additions & 0 deletions
@@ -0,0 +1,122 @@
---
alwaysopen: false
categories:
- docs
- develop
- ai
description: Learn to use the Redis LangCache API for semantic caching.
hideListLinks: true
linktitle: API and SDK examples
title: Use the LangCache API and SDK
weight: 10
---

Use the LangCache API from your client app to store and retrieve LLM, RAG, or agent responses.

To access the LangCache API, you need:

- LangCache API base URL
- LangCache service API key
- Cache ID

When you call the API, pass the LangCache API key in the `Authorization` header as a Bearer token and the Cache ID as the `cacheId` path parameter.

For example, to search the cache using `cURL`:

```bash
curl -s -X POST "https://$HOST/v1/caches/$CACHE_ID/entries/search" \
    -H "accept: application/json" \
    -H "Content-Type: application/json" \
    -H "Authorization: Bearer $API_KEY" \
    -d '{ "prompt": "What is semantic caching?" }'
```

This example expects several variables to be set in the shell:

- **$HOST** - the LangCache API base URL
- **$CACHE_ID** - the Cache ID of your cache
- **$API_KEY** - the LangCache service API key

{{% info %}}
This example uses `cURL` and Linux shell scripts to demonstrate the API; you can use any standard REST client or library.
{{% /info %}}
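
The same call can be made from any HTTP library. The sketch below builds the equivalent request with Python's standard library; the host, cache ID, and key values are placeholders that you would replace with your own.

```python
import json
import urllib.request

# Placeholder values -- substitute your real base URL, Cache ID, and API key.
HOST = "api.example.langcache.redis.io"
CACHE_ID = "my-cache-id"
API_KEY = "my-api-key"

def build_search_request(prompt: str) -> urllib.request.Request:
    """Build the POST request for the cache search endpoint."""
    url = f"https://{HOST}/v1/caches/{CACHE_ID}/entries/search"
    body = json.dumps({"prompt": prompt}).encode("utf-8")
    return urllib.request.Request(
        url,
        data=body,
        method="POST",
        headers={
            "Accept": "application/json",
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
    )

req = build_search_request("What is semantic caching?")
# Send it with urllib.request.urlopen(req) once the placeholders
# point at a real cache.
```

Building the request separately from sending it keeps the example runnable without network access and makes the URL and header layout easy to inspect.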

You can also use the [LangCache SDKs](#langcache-sdk) for JavaScript and Python to access the API.

## API examples

### Search LangCache for similar responses

Use `POST /v1/caches/{cacheId}/entries/search` to search the cache for matching responses to a user prompt.

```sh
POST https://[host]/v1/caches/{cacheId}/entries/search
{
    "prompt": "User prompt text"
}
```

Place this call in your client app right before you call your LLM's REST API. If LangCache returns a response, you can send that response back to the user instead of calling the LLM.

If LangCache does not return a response, call your LLM's REST API to generate a new response. After you get a response from the LLM, you can [store it in LangCache](#store-a-new-response-in-langcache) for future use.
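
This cache-aside flow can be sketched as follows. The three callables are hypothetical stand-ins for your real HTTP calls to LangCache and your LLM, which keeps the control flow visible without any network dependency.

```python
from typing import Callable, Optional

def get_response(
    prompt: str,
    search_cache: Callable[[str], Optional[str]],   # POST .../entries/search
    call_llm: Callable[[str], str],                 # your LLM's REST API
    store_in_cache: Callable[[str, str], None],     # POST .../entries
) -> str:
    """Cache-aside flow: try LangCache first, fall back to the LLM."""
    cached = search_cache(prompt)
    if cached is not None:
        return cached                  # cache hit: skip the LLM call
    response = call_llm(prompt)        # cache miss: generate a new response
    store_in_cache(prompt, response)   # save it for future prompts
    return response

# Demonstration with an in-memory dict standing in for the cache:
cache = {"hello": "cached greeting"}
hit = get_response("hello", cache.get, lambda p: "fresh answer", cache.__setitem__)
miss = get_response("new prompt", cache.get, lambda p: "fresh answer", cache.__setitem__)
```

After the miss, the new response is stored, so repeating the same prompt would be served from the cache.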

You can also scope the responses returned from LangCache by adding an `attributes` object to the request. LangCache only returns responses that match the attributes you specify.

```sh
POST https://[host]/v1/caches/{cacheId}/entries/search
{
    "prompt": "User prompt text",
    "attributes": {
        "customAttributeName": "customAttributeValue"
    }
}
```

### Store a new response in LangCache

Use `POST /v1/caches/{cacheId}/entries` to store a new response in the cache.

```sh
POST https://[host]/v1/caches/{cacheId}/entries
{
    "prompt": "User prompt text",
    "response": "LLM response text"
}
```

Place this call in your client app after you get a response from the LLM. This stores the response in the cache for future use.

You can also store responses with custom attributes by adding an `attributes` object to the request.

```sh
POST https://[host]/v1/caches/{cacheId}/entries
{
    "prompt": "User prompt text",
    "response": "LLM response text",
    "attributes": {
        "customAttributeName": "customAttributeValue"
    }
}
```
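
Since `attributes` is optional, a small helper can build both payload variants. This is a hedged sketch; the attribute name shown is an arbitrary example, not a reserved key.

```python
import json
from typing import Optional

def build_store_payload(prompt: str, response: str,
                        attributes: Optional[dict] = None) -> str:
    """Build the JSON body for POST /v1/caches/{cacheId}/entries."""
    payload = {"prompt": prompt, "response": response}
    if attributes:
        payload["attributes"] = attributes  # optional scoping metadata
    return json.dumps(payload)

plain = build_store_payload("What is semantic caching?", "Semantic caching is...")
scoped = build_store_payload("What is semantic caching?", "Semantic caching is...",
                             attributes={"tenantId": "acme"})  # hypothetical attribute
```

The same attributes you store here are what a scoped search or bulk delete matches against later.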

### Delete cached responses

Use `DELETE /v1/caches/{cacheId}/entries/{entryId}` to delete a single cached response from the cache.

You can also use `DELETE /v1/caches/{cacheId}/entries` to delete multiple cached responses at once. If you provide an `attributes` object, LangCache deletes all responses that match the attributes you specify.

```sh
DELETE https://[host]/v1/caches/{cacheId}/entries
{
    "attributes": {
        "customAttributeName": "customAttributeValue"
    }
}
```
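
The two delete forms differ only in path and body. A minimal sketch of choosing between them (the attribute name is again an arbitrary example):

```python
from typing import Optional, Tuple

def delete_request_target(cache_id: str,
                          entry_id: Optional[str] = None) -> Tuple[str, Optional[dict]]:
    """Return the DELETE path and optional JSON body.

    With an entry ID, target one cached response; without one,
    target the bulk endpoint scoped by attributes.
    """
    base = f"/v1/caches/{cache_id}/entries"
    if entry_id is not None:
        return f"{base}/{entry_id}", None  # single-entry delete, no body
    # Bulk delete scoped by attributes (hypothetical attribute name):
    return base, {"attributes": {"customAttributeName": "customAttributeValue"}}

single_path, single_body = delete_request_target("my-cache-id", "entry-123")
bulk_path, bulk_body = delete_request_target("my-cache-id")
```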

## LangCache SDK

If your app is written in JavaScript or Python, you can also use the LangCache Software Development Kits (SDKs) to access the API.

To learn how to use the LangCache SDKs:

- [LangCache SDK for JavaScript](https://www.npmjs.com/package/@redis-ai/langcache)
- [LangCache SDK for Python](https://pypi.org/project/langcache/)
Lines changed: 8 additions & 128 deletions
@@ -1,129 +1,9 @@
 ---
-alwaysopen: false
-categories:
-- docs
-- develop
-- ai
-description: Learn to use the Redis LangCache API for semantic caching.
-hideListLinks: true
-linktitle: API and SDK reference
-title: LangCache API and SDK reference
-weight: 10
----
-
-You can use the LangCache API from your client app to store and retrieve LLM, RAG, or agent responses.
-
-To access the LangCache API, you need:
-
-- LangCache API base URL
-- LangCache service API key
-- Cache ID
-
-When you call the API, you need to pass the LangCache API key in the `Authorization` header as a Bearer token and the Cache ID as the `cacheId` path parameter.
-
-For example, to check the health of the cache using `cURL`:
-
-```bash
-curl -s -X GET "https://$HOST/v1/caches/$CACHE_ID/health" \
-    -H "accept: application/json" \
-    -H "Authorization: Bearer $API_KEY"
-```
-
-- The example expects several variables to be set in the shell:
-
-- **$HOST** - the LangCache API base URL
-- **$CACHE_ID** - the Cache ID of your cache
-- **$API_KEY** - The LangCache API token
-
-{{% info %}}
-This example uses `cURL` and Linux shell scripts to demonstrate the API; you can use any standard REST client or library.
-{{% /info %}}
-
-You can also use the [LangCache SDKs](#langcache-sdk) for Javascript and Python to access the API.
-
-## API examples
-
-### Check cache health
-
-Use `GET /v1/caches/{cacheId}/health` to check the health of the cache.
-
-```sh
-GET https://[host]/v1/caches/{cacheId}/health
-```
-
-### Search LangCache for similar responses
-
-Use `POST /v1/caches/{cacheId}/entries/search` to search the cache for matching responses to a user prompt.
-
-```sh
-POST https://[host]/v1/caches/{cacheId}/entries/search
-{
-    "prompt": "User prompt text"
-}
-```
-
-Place this call in your client app right before you call your LLM's REST API. If LangCache returns a response, you can send that response back to the user instead of calling the LLM.
-
-If LangCache does not return a response, you should call your LLM's REST API to generate a new response. After you get a response from the LLM, you can [store it in LangCache](#store-a-new-response-in-langcache) for future use.
-
-You can also scope the responses returned from LangCache by adding an `attributes` object to the request. LangCache will only return responses that match the attributes you specify.
-
-```sh
-POST https://[host]/v1/caches/{cacheId}/entries/search
-{
-    "prompt": "User prompt text",
-    "attributes": {
-        "customAttributeName": "customAttributeValue"
-    }
-}
-```
-
-### Store a new response in LangCache
-
-Use `POST /v1/caches/{cacheId}/entries` to store a new response in the cache.
-
-```sh
-POST https://[host]/v1/caches/{cacheId}/entries
-{
-    "prompt": "User prompt text",
-    "response": "LLM response text"
-}
-```
-
-Place this call in your client app after you get a response from the LLM. This will store the response in the cache for future use.
-
-You can also store the responses with custom attributes by adding an `attributes` object to the request.
-
-```sh
-POST https://[host]/v1/caches/{cacheId}/entries
-{
-    "prompt": "User prompt text",
-    "response": "LLM response text",
-    "attributes": {
-        "customAttributeName": "customAttributeValue"
-    }
-}
-```
-
-### Delete cached responses
-
-Use `DELETE /v1/caches/{cacheId}/entries/{entryId}` to delete a cached response from the cache.
-
-You can also use `DELETE /v1/caches/{cacheId}/entries` to delete multiple cached responses at once. If you provide an `attributes` object, LangCache will delete all responses that match the attributes you specify.
-
-```sh
-DELETE https://[host]/v1/caches/{cacheId}/entries
-{
-    "attributes": {
-        "customAttributeName": "customAttributeValue"
-    }
-}
-```
-## LangCache SDK
-
-If your app is written in Javascript or Python, you can also use the LangCache Software Development Kits (SDKs) to access the API.
-
-To learn how to use the LangCache SDKs:
-
-- [LangCache SDK for Javascript](https://www.npmjs.com/package/@redis-ai/langcache)
-- [LangCache SDK for Python](https://pypi.org/project/langcache/)
+Title: LangCache REST API
+linkTitle: API reference
+layout: apireference
+type: page
+params:
+  sourcefile: ./api.yaml
+  sortOperationsAlphabetically: false
+---
