docs/pages/product/apis-integrations/ai-api.mdx

#### 3. Continue wait
When using `"runQuery": true`, you might sometimes receive a query result containing `{ "error": "Continue wait" }`. If this happens, you should use `/load` ([described above](#2-load)) instead of `runQuery` to run the query, and handle retries as described in the [REST API documentation](/product/apis-integrations/rest-api#continue-wait).
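As an illustrative sketch of that retry flow (not an official client; `run_load` is a hypothetical stand-in for whatever HTTP call your application makes to `/load`):

```python
import time

def load_with_retries(run_load, max_attempts=5, delay_seconds=1.0):
    """Call `run_load` until the result is no longer a "Continue wait" placeholder.

    `run_load` is a hypothetical callable standing in for your HTTP request
    to the REST API's /load endpoint; it should return the parsed JSON body.
    """
    for attempt in range(max_attempts):
        result = run_load()
        # A "Continue wait" body means the query is still being processed.
        if result.get("error") != "Continue wait":
            return result
        time.sleep(delay_seconds)  # give the query time to finish before retrying
    raise TimeoutError("Query did not complete after retries")
```

In a real client you would typically also back off between attempts; a fixed delay is used here only to keep the sketch short.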

## Advanced features

### Custom prompts
You can prompt the AI API with custom instructions. For example, you may want it to always
respond in a particular language, or to refer to itself by a name matching your brand.
Custom prompts also allow you to give the model more context on your company and data model,
for example, if it should usually prefer a particular view.

To use a custom prompt, set the `CUBE_CLOUD_AI_API_PROMPT` environment variable in your deployment.
<InfoBox>
Custom prompts add to, rather than overwrite, the AI API's existing prompting, so you
do not need to rewrite instructions around how to generate the query itself.

</InfoBox>
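As a sketch, a custom prompt might be configured like this (the prompt text and names here are hypothetical; adjust them to your brand and data model):

```shell
# Hypothetical example prompt; adjust to your brand and data model.
CUBE_CLOUD_AI_API_PROMPT="You are 'Acme Analyst'. Always answer in English and prefer the orders_view view when possible."
```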
### Meta tags
The AI API can read [meta tags](/reference/data-model/view#meta) on your dimensions, measures,
segments, and views.

Use the `ai` meta tag to give context that is specific to AI and goes beyond what is
included in the description. This can have any keys that you want. For example, you can use it
to give the AI context on possible values in a categorical dimension:

```yaml
- name: status
  sql: status
  type: string
  meta:
    ai:
      values:
        - shipped
        - processing
        - completed
```
### Other LLM providers
If desired, you may "bring your own" LLM model by providing a model and API credentials
for a supported model provider. Do this by setting environment variables in your Cube
deployment. See below for required variables by provider (required unless noted):