```python
        self.auxiliary_models = ...  # OpenAI clients auto-derived from ModelWrapper

    @abstractmethod
    def run(self) -> List[Experience]:
```
During initialization, `Workflow` receives the following parameters:

- `task` ({class}`trinity.common.workflows.Task`): A single data item from the task dataset.
- `model` ({class}`trinity.common.models.model.ModelWrapper`): The model being trained. It provides an OpenAI-like interface that accepts a list of conversation messages and returns the content generated by the LLM, including the reply text (`response_text`), the token IDs of the full sequence (`tokens`), the token length of the prompt part (`prompt_length`), and the log probabilities of the output tokens (`logprobs`).
- `auxiliary_models` (`List[ModelWrapper]`): A list of auxiliary model wrappers not involved in training. OpenAI clients can be accessed via `self.auxiliary_models`; they are auto-derived from the wrappers based on the workflow's `is_async` setting. A minimal usage sketch follows this list.

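The sketch below puts these parameters together in a custom workflow. It is illustrative, not the library's reference implementation: the class name, the `raw_task["question"]` field, the `chat()` call, and the assumption that the base class stores its init arguments as `self.task`, `self.model`, and `self.auxiliary_models` are all hypothetical; only `Workflow`, `Task`, `ModelWrapper`, and `Experience` come from the documentation above.

```python
from typing import List

from trinity.common.workflows import Workflow  # Task arrives via __init__


class ExampleWorkflow(Workflow):  # hypothetical subclass for illustration
    def run(self) -> List["Experience"]:
        # Build a conversation from the task; the raw_task field name is assumed.
        messages = [{"role": "user", "content": self.task.raw_task["question"]}]
        # Auxiliary models (if configured) are exposed as OpenAI clients, e.g.:
        # judge_client = self.auxiliary_models[0]
        # The trained model accepts a message list and returns experiences carrying
        # response_text, tokens, prompt_length, and logprobs, as described above.
        experiences = self.model.chat(messages)  # assumed convenience method
        return experiences
```
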
```{tip}
You can switch to using the OpenAI API by setting `explorer.rollout_model.enable_openai_api` to `true` in your config file and calling `model.get_openai_client()` to get an `openai.OpenAI` instance in your workflow.
```
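As a sketch of that pattern inside a workflow's `run()`: only `enable_openai_api` and `get_openai_client()` come from the tip itself; the prompt is illustrative, and discovering the served model name via `models.list()` is an assumption.

```python
# Assumes explorer.rollout_model.enable_openai_api is set to true in the config.
client = self.model.get_openai_client()  # returns an openai.OpenAI instance
model_id = client.models.list().data[0].id  # assumed: query the server for its model name
response = client.chat.completions.create(
    model=model_id,
    messages=[{"role": "user", "content": "Hello"}],  # illustrative prompt
)
reply_text = response.choices[0].message.content
```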