content/operate/rc/langcache/use-langcache.md
2 additions & 17 deletions
@@ -11,7 +11,7 @@ title: Use the LangCache API with your GenAI app
weight: 10
---

-You can use the LangCache API from your client app to store and retrieve LLM responses.
+You can use the LangCache API from your client app to store and retrieve LLM, RAG, or agent responses.

To access the LangCache API, you need:
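
A minimal shell sketch of the cache-first lookup this page describes, assuming bearer-token authentication; the `HOST`, `CACHE_ID`, and `API_KEY` placeholders and the Authorization header format are illustrative assumptions, not taken from the diff:

```sh
# Hypothetical lookup: check LangCache for a matching cached response
# before calling the LLM. The Authorization header format is an assumption.
curl -s -X POST "https://$HOST/v1/caches/$CACHE_ID/search" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"prompt": "User prompt text"}'
```

If this call returns no match, fall back to your LLM and store the new response, as the changed sections below describe.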
@@ -64,19 +64,14 @@ Place this call in your client app right before you call your LLM's REST API. If

If LangCache does not return a response, you should call your LLM's REST API to generate a new response. After you get a response from the LLM, you can [store it in LangCache](#store-a-new-response-in-langcache) for future use.

-You can also limit the responses returned from LangCache by adding an `attributes` object or `scope` object to the request. LangCache will only return responses that match the attributes you specify.
+You can also scope the responses returned from LangCache by adding an `attributes` object to the request. LangCache will only return responses that match the attributes you specify.

```sh
POST https://[host]/v1/caches/{cacheId}/search
{
    "prompt": "User prompt text",
    "attributes": {
        "customAttributeName": "customAttributeValue"
-    },
-    "scope": {
-        "applicationId": "applicationId",
-        "userId": "userId",
-        "sessionId": "sessionId"
    }
}
```
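
On a cache miss, the new response can be written back through the `POST https://[host]/v1/caches/{cacheId}/entries` endpoint referenced in the next hunk. A minimal sketch, again assuming bearer-token authentication; the `response` field name and the overall body shape are assumptions for illustration, not taken from the diff:

```sh
# Hypothetical write-back after a cache miss. Field names other than
# "prompt" and "attributes" are assumed for illustration.
curl -s -X POST "https://$HOST/v1/caches/$CACHE_ID/entries" \
  -H "Authorization: Bearer $API_KEY" \
  -H "Content-Type: application/json" \
  -d '{
        "prompt": "User prompt text",
        "response": "LLM response text",
        "attributes": {
          "customAttributeName": "customAttributeValue"
        }
      }'
```

Passing the same `attributes` at search time restricts lookups to matching entries, as the paragraph above describes.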
@@ -104,11 +99,6 @@ POST https://[host]/v1/caches/{cacheId}/entries