Conversation

@ncoghlan (Collaborator)

  • Breaking change: gpuOffload -> gpu in model loading config
  • gpuStrictVramCap is expected to be a temporary spelling

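For reference, a minimal sketch of what the rename looks like for a dict-format model load config. Only the field names (`gpuOffload`, `gpu`, `gpuStrictVramCap`) come from this PR; the `lms.llm(...)` call shape, the model key, and the example values are assumptions for illustration.

```python
# Hedged sketch of the load-config rename. Only the field names come from
# this PR; the call shape, model key, and values are illustrative guesses.
import lmstudio as lms

# Before this change (old spelling):
#   model = lms.llm("some-model-key", config={"gpuOffload": {"ratio": 0.5}})

# After this change (new spelling):
model = lms.llm(
    "some-model-key",
    config={
        "gpu": {"ratio": 0.5},      # renamed from "gpuOffload"
        "gpuStrictVramCap": False,  # provisional name (and guessed value type)
    },
)
```

Since the old spelling is replaced rather than aliased, callers still passing `gpuOffload` would need to update their configs when picking up this release.
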
github-actions bot added the CLA signed label on Mar 20, 2025
@ncoghlan (Collaborator, Author)

This is needed as part of adding GBNF structured response schema support and support for client-side preset config requests.

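As context for the structured response work this sync feeds into, here is a hedged sketch of schema-constrained responses via the Python SDK. The `response_format` parameter and `.parsed` attribute reflect my understanding of the lmstudio-python structured-output interface; the schema, prompt, and the GBNF translation details are assumptions and not part of this PR.

```python
# Hedged sketch: schema-constrained responses with the Python SDK, which the
# server-side GBNF work is intended to back. The schema and prompt are made
# up for illustration; treat the exact SDK surface here as an assumption.
from pydantic import BaseModel

import lmstudio as lms


class BookSummary(BaseModel):
    title: str
    author: str
    year: int


# Assumes a local LM Studio instance with a default model available.
model = lms.llm()
result = model.respond(
    "Summarise The Hobbit as structured data.",
    response_format=BookSummary,  # schema the response must conform to
)
print(result.parsed)  # parsed data matching the BookSummary schema
```
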
Draft for now pending further consideration of:

  • gpuOffload -> gpu config field name change (the previous name was never considered final, but it also wasn't explicitly flagged as experimental at launch)
  • gpuStrictVramCap (this isn't expected to be the final spelling of this field name)

ncoghlan marked this pull request as ready for review on March 21, 2025, 03:29
@ncoghlan (Collaborator, Author)

The interim approach to handling the configuration APIs is to exclude them from any backwards compatibility guarantees (unfortunately, they're not yet stable enough to commit to maintaining their current form for all fields).

ncoghlan merged commit 796fcfa into main on Mar 21, 2025 (10 checks passed)
ncoghlan deleted the sync-lmstudio-js-20250320 branch on March 21, 2025, 03:36
github-actions bot locked and limited conversation to collaborators on Mar 21, 2025