Releases: letta-ai/letta
v0.16.6
Highlights
- Expanded Conversations API support for default conversation / agent-direct mode.
- New conversations now initialize with a compiled system message at creation time.
- Fixed `model_settings.max_output_tokens` default behavior so it does not silently override an existing `max_tokens` unless explicitly set.
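The `max_output_tokens` fix in the last highlight amounts to "an unset default must not clobber an existing value." A minimal sketch of that resolution logic, where the class, function, and default value are illustrative assumptions rather than Letta's actual internals:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical stand-in for model_settings; None means "not explicitly set".
@dataclass
class ModelSettings:
    max_output_tokens: Optional[int] = None

# Illustrative fallback default, not Letta's real value.
DEFAULT_MAX_OUTPUT_TOKENS = 4096

def resolve_max_tokens(settings: ModelSettings,
                       existing_max_tokens: Optional[int]) -> int:
    """Prefer an explicitly set max_output_tokens; otherwise keep any
    existing max_tokens rather than silently overriding it with the default."""
    if settings.max_output_tokens is not None:
        return settings.max_output_tokens
    if existing_max_tokens is not None:
        return existing_max_tokens
    return DEFAULT_MAX_OUTPUT_TOKENS
```

With this shape, an existing `max_tokens` of 2048 survives an unset default, while an explicit `max_output_tokens=1024` still wins.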
Conversations API updates
- Added support for `conversation_id="default"` + `agent_id` across conversation endpoints (send/list/cancel/compact/stream/retrieve).
- Kept backwards compatibility for `conversation_id=agent-*` (deprecated path).
- Added lock-key handling in agent-direct flows to avoid concurrent execution conflicts.
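The routing described above can be sketched as a small resolver. Everything here is a hypothetical illustration of the three accepted forms and the per-agent lock key, not Letta's actual implementation:

```python
from typing import Optional

def resolve_conversation(conversation_id: str,
                         agent_id: Optional[str] = None) -> dict:
    """Illustrative sketch: resolve the three accepted conversation_id forms."""
    if conversation_id == "default":
        # Agent-direct mode: requires agent_id; execution is serialized on a
        # per-agent lock key to avoid concurrent execution conflicts.
        if agent_id is None:
            raise ValueError('conversation_id="default" requires agent_id')
        return {"agent_id": agent_id, "lock_key": f"agent:{agent_id}"}
    if conversation_id.startswith("agent-"):
        # Deprecated backwards-compatible path: an agent id passed directly
        # as the conversation id.
        return {"agent_id": conversation_id,
                "lock_key": f"agent:{conversation_id}"}
    # Ordinary conversation id.
    return {"conversation_id": conversation_id,
            "lock_key": f"conv:{conversation_id}"}
```

Note how both agent-direct forms land on the same lock key, which is what prevents two concurrent runs against the same agent.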
Conversation/system-message behavior
- Conversation creation now compiles and persists a system message immediately.
- This captures current memory state at conversation start and removes first-message timing edge cases.
Model/config updates
- Added model support for `gpt-5.3-codex` and `gpt-5.3-chat-latest`.
- Updated defaults:
  - context window default: 32k → 128k
  - `CORE_MEMORY_BLOCK_CHAR_LIMIT`: 20k → 100k
- Anthropic model settings now allow `effort="max"` where supported.
- Gemini request timeout default increased to 600s.
Memory / memfs updates
- Git-backed memory frontmatter no longer emits `limit` (legacy `limit` keys are removed on merge).
- Skills sync now maps only `skills/{name}/SKILL.md` to `skills/{name}` block labels.
- Other markdown under `skills/` is intentionally ignored for block sync.
- Memory filesystem rendering now includes descriptions for non-`system/` files and condenses skill display.
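The skills-sync rule above is a strict path match: only `skills/{name}/SKILL.md` produces a block label, and any other markdown under `skills/` is skipped. A minimal sketch of that mapping (the function name and regex are assumptions, not Letta's code):

```python
import re
from typing import Optional

# Only skills/{name}/SKILL.md maps to a block label; nested paths and
# other markdown files under skills/ are intentionally ignored.
_SKILL_MD = re.compile(r"^skills/([^/]+)/SKILL\.md$")

def skill_block_label(path: str) -> Optional[str]:
    """Illustrative sketch: return the skills/{name} block label for a
    repo path, or None if the path does not participate in block sync."""
    m = _SKILL_MD.match(path)
    return f"skills/{m.group(1)}" if m else None
```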
Reliability and compatibility fixes
- Added explicit `LLMEmptyResponseError` handling for empty Anthropic streaming responses.
- Improved Fireworks compatibility by stripping unsupported reasoning fields.
- Improved Z.ai compatibility by mapping `max_completion_tokens` to `max_tokens`.
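The Z.ai fix is a request-payload shim: rename the OpenAI-style `max_completion_tokens` field to the `max_tokens` field the provider expects. A hedged sketch of that kind of adapter (the function name and the "keep an explicit `max_tokens`" tie-break are assumptions, not Letta's actual code):

```python
def adapt_for_zai(payload: dict) -> dict:
    """Illustrative compatibility shim: map max_completion_tokens to
    max_tokens without mutating the caller's payload."""
    out = dict(payload)
    if "max_completion_tokens" in out:
        value = out.pop("max_completion_tokens")
        # Assumed tie-break: keep an explicit max_tokens if already present.
        out.setdefault("max_tokens", value)
    return out
```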
Full Changelog: 0.16.5...0.16.6
v0.16.5
v0.16.4
What's Changed
- fix: update gh templates by @cpacker in #3155
- chore: release 0.16.3 by @sarahwooders in #3158
- chore: bump v0.16.4 by @carenthomas in #3168
Full Changelog: 0.16.2...0.16.4
v0.16.2
What's Changed
- docs: update README.md by @cpacker in #3110
- Update contributing.md with corrected local setup steps by @neversettle17-101 in #3123
- chore: bump version 0.16.2 by @carenthomas in #3140
New Contributors
- @neversettle17-101 made their first contribution in #3123
Full Changelog: 0.16.1...0.16.2
v0.16.1
What's Changed
- Correct provider name for openai-proxy in LLMConfig by @SootyOwl in #3097
- chore: bump v0.16.1 by @carenthomas in #3107
Full Changelog: 0.16.0...0.16.1
v0.16.0
What's Changed
- Updated readme with actual argument by @Godofnothing in #3083
- fix: Implement architecture-specific OTEL installation logic by @SootyOwl in #3061
- chore: bump v0.16.0 by @carenthomas in #3095
New Contributors
- @Godofnothing made their first contribution in #3083
- @SootyOwl made their first contribution in #3061
Full Changelog: 0.15.1...0.16.0
v0.15.1
v0.15.0
What's Changed
- Add context windows for grok-4 models by @runtimeBob in #3043
- chore: bump version 0.15.0 by @carenthomas in #3077
New Contributors
- @runtimeBob made their first contribution in #3043
Full Changelog: 0.14.0...0.15.0
v0.14.0
v0.13.0
What's Changed
- chore: clean up docs by @carenthomas in #3031
- feat: add haiku 4.5 as reasoning model by @AriWebb in #3038
- chore: bump version 0.13.0 by @carenthomas in #3050
Full Changelog: 0.12.1...0.13.0