# Summary
- The `click` CLI interface is covered by unit tests
- Validation is added for the `main` function.
- `OpenAIBackend` initializer parameters are optimized
- Usage of the `target`, `host`, and `port` parameters is simplified
- The `openai.NotFound` error when fetching **_available models_** is handled
- `SerializableFileType` renamed to `SerializableFileExtension`
- `SerializableFileExtension` now inherits `str` to simplify usage, since this Enum class is mostly used to work with strings (a short sketch of the pattern follows this list)
- `rate_type_to_load_gen_mode` renamed to
`RATE_TYPE_TO_LOAD_GEN_MODE_MAPPER`
- `rate_type_to_profile_mode` renamed to
`RATE_TYPE_TO_PROFILE_MODE_MAPPER`
- CLI parameters are renamed:
- `--num-seconds` -> `--max-seconds`
- `--num-requests` -> `--max-requests`
- `path` removed from CLI arguments since it is not used
- The `GUIDELLM` prefix for `.env` variables is fixed
- Unused comments, settings, and code are removed
- The default logger unit test now uses the injected logging settings object
- In the `backend.openai` module, `_base_url` is renamed to `base_url`
- In `OpenAIBackend.make_request`, the `GenerativeResponse` always
counts `output_tokens` with `self._token_count`
- `SerializableFileExtensions` is replaced with pure Python strings
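
For context on the `SerializableFileExtension` item above, here is a minimal sketch of the `str`-mixin Enum pattern it refers to; the member names and values are illustrative assumptions, not the actual guidellm definitions:

```py
from enum import Enum


class SerializableFileExtension(str, Enum):
    # Illustrative members only; the real guidellm values may differ.
    YAML = "yaml"
    JSON = "json"


# Because members also inherit `str`, they behave like plain strings:
assert SerializableFileExtension.JSON == "json"
assert "report." + SerializableFileExtension.YAML == "report.yaml"
assert "report.json".endswith(SerializableFileExtension.JSON)
assert isinstance(SerializableFileExtension.JSON, str)
```

Inheriting `str` means members pass `isinstance(..., str)` checks and can be handed directly to APIs that expect extensions as plain strings.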
---------
Co-authored-by: Dmytro Parfeniuk <[email protected]>
Co-authored-by: Mark Kurtz <[email protected]>
`DEVELOPING.md`: 31 lines changed (31 additions & 0 deletions)
|`GUIDELLM__OPENAI__BASE_URL`|`http://localhost:8080`| The address of the **OpenAI-compatible** server.<br><br>The live OpenAI base URL is `https://api.openai.com/v1`.|
|`GUIDELLM__OPENAI__API_KEY`|`invalid`| Corresponds to the **OpenAI-compatible** server API key.<br><br>If you are looking for a live key, check [this link](https://platform.openai.com/api-keys). |
<br>
## Project configuration
The project configuration is powered by _[`🔗 pydantic-settings`](https://docs.pydantic.dev/latest/concepts/pydantic_settings/)_.
The project configuration entrypoint is the lazy-loaded `settings` singleton object (`src/config/__init__`).
The project is fully configurable with environment variables. All the default values are defined in the settings classes:
```py
from pydantic import BaseModel
from pydantic_settings import BaseSettings


class NestedIntoLogging(BaseModel):
    nested: str = "default value"


class LoggingSettings(BaseModel):
    # ...
    disabled: bool = False


class Settings(BaseSettings):
    """The entrypoint to settings."""

    # ...
    logging: LoggingSettings = LoggingSettings()


settings = Settings()
```
With that configuration set, you can load parameters into `LoggingSettings()` by using environment variables. Just run `export GUIDELLM__LOGGING__DISABLED=true` or `export GUIDELLM__LOGGING__NESTED=another_value` respectively. The nesting delimiter is `__`.
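
To make the prefix and nesting rules concrete, here is a minimal, self-contained sketch; the `SettingsConfigDict` wiring is an assumption about what the elided `# ...` parts of the real `Settings` class contain:

```py
import os

from pydantic import BaseModel
from pydantic_settings import BaseSettings, SettingsConfigDict


class LoggingSettings(BaseModel):
    disabled: bool = False


class Settings(BaseSettings):
    # Assumption: the real Settings class wires the prefix and nesting
    # delimiter roughly like this.
    model_config = SettingsConfigDict(
        env_prefix="GUIDELLM__", env_nested_delimiter="__"
    )

    logging: LoggingSettings = LoggingSettings()


# Environment variables override the defaults when Settings is instantiated.
os.environ["GUIDELLM__LOGGING__DISABLED"] = "true"
assert Settings().logging.disabled is True
```

The same mechanism applies to the shell `export` commands above, since pydantic-settings reads the process environment when `Settings()` is instantiated.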
## Contact and Support
If you need help or have any questions, please open an issue on GitHub or contact us at [email protected].