break: improving TypeScript support and refactoring the API
- Improving TypeScript support for dynamic suggestions based on the
  selected `Session` type.
- Breaking: LLM models must now be defined inside the `options`
  argument. This allows better TypeScript checking and makes the API
  easier to extend (see the usage sketch below).
- There is no longer a need to check whether the `inferenceHost` env var
  is defined, since we can now switch between different LLM providers.
  Instead, LLM support is enabled when the given type is an allowed
  provider.
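A minimal sketch of a call site under the new API, assuming the `Session` constructor takes the provider type first and the options object second; the `"llama2"` model name and the local URL are placeholders, not values from this commit:

```ts
// Hypothetical call site under the refactored API (shape assumed from
// this commit; not a confirmed upstream signature).
const session = new Session("ollama", {
  model: "llama2",                    // breaking change: model now lives in `options`
  baseURL: "http://localhost:11434",  // optional; falls back to AI_INFERENCE_API_HOST
});
```

The excerpt below, from the commit, shows the corresponding constructor checks: `model` is required for LLM provider types, and `baseURL` falls back to the `AI_INFERENCE_API_HOST` environment variable.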
```ts
        `missing required parameter 'model' for type: '${type}'`,
      );
    }

    this.options.baseURL ??= core.ops.op_get_env(
      "AI_INFERENCE_API_HOST",
    ) as string;

    if (!this.options.baseURL) {
      throw new Error(
        `missing required parameter 'baseURL' for type: '${type}'`,
      );
    }
  }
}

// /** @param {string | object} prompt Either a String (ollama) or an OpenAI chat completion body object (openaicompatible): https://platform.openai.com/docs/api-reference/chat/create */
```
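One way to model the dynamic per-provider suggestions mentioned in the commit message is a discriminated options map. This is an illustrative sketch with hypothetical type names (`BaseOptions`, `SessionOptionsMap`), not the repository's actual definitions; the provider names come from the commented JSDoc above:

```ts
// Illustrative sketch only: these names are hypothetical, not the
// repository's actual type definitions.
interface BaseOptions {
  model: string;    // required for every LLM provider, per the checks above
  baseURL?: string; // optional; falls back to the AI_INFERENCE_API_HOST env var
}

// Mapping each allowed provider to its options lets TypeScript suggest
// and check the right fields once the session type is known.
type SessionOptionsMap = {
  ollama: BaseOptions;
  openaicompatible: BaseOptions & { apiKey?: string }; // apiKey is an assumption
};

declare class Session<T extends keyof SessionOptionsMap> {
  constructor(type: T, options: SessionOptionsMap[T]);
}
```

With a shape like this, `new Session("openaicompatible", ...)` can surface provider-specific fields in editor completions that `new Session("ollama", ...)` does not.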