
Commit 447f65f

GeneAIclaude and Claude committed
feat: Add configurable max_tokens support to Summarizer
## Configuration Support

Implemented max_tokens configuration support to allow users to control Claude API response length via configuration files.

### Changes

**memdocs/summarize.py**:
- Added `max_tokens` parameter to `Summarizer.__init__()` with default=DEFAULT_MAX_TOKENS (4096)
- Updated `Summarizer.summarize()` to use `self.max_tokens` instead of the hardcoded constant
- Enhanced docstrings to document the new parameter
- Stored max_tokens as an instance variable for flexible configuration

**memdocs/cli_modules/commands/review_cmd.py**:
- Updated Summarizer instantiation to pass `doc_config.ai.max_tokens`
- Ensures the CLI respects configuration file settings

### Configuration Flow

```
.memdocs.yml (max_tokens: 8192)
        ↓
AIConfig schema (default: 4096)
        ↓
Summarizer class (uses configured value)
        ↓
Claude API request
```

### Backward Compatibility

✅ Default parameter value maintains existing behavior (4096 tokens)
✅ All existing code that calls `Summarizer()` without parameters works unchanged
✅ All tests pass (304 passed, 9 skipped)
✅ No breaking changes to API

### Configuration Hierarchy

1. **DEFAULT_MAX_TOKENS = 4096** - Constant used as parameter default
2. **AIConfig.max_tokens = 4096** - Schema default when not specified in config
3. **.memdocs.yml: max_tokens: 8192** - User configuration (overrides schema default)

### Benefits

- Users can now control response length via .memdocs.yml
- Flexibility for different use cases (short summaries vs detailed docs)
- Maintains safe defaults while allowing customization
- Aligns code behavior with documented configuration options

### Example Usage

```python
# Use default (4096 tokens)
summarizer = Summarizer()

# Use custom value
summarizer = Summarizer(max_tokens=8192)

# Via configuration file (.memdocs.yml sets max_tokens: 8192)
# CLI automatically respects config value
```

🤖 Generated with [Claude Code](https://claude.com/claude-code)

Co-Authored-By: Claude <[email protected]>
1 parent d3793d1 commit 447f65f
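The configuration flow described in the commit message starts from the project config file. A minimal sketch of the relevant `.memdocs.yml` fragment is shown below; the `ai:` key nesting is inferred from the `doc_config.ai.max_tokens` attribute path in the diff, and the exact schema may differ:

```yaml
# Hypothetical .memdocs.yml fragment; nesting inferred from
# doc_config.ai.max_tokens, not taken from the real schema.
ai:
  model: claude-sonnet-4-5-20250929
  max_tokens: 8192  # overrides the AIConfig schema default of 4096
```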

File tree

2 files changed: +16 −4 lines changed


memdocs/cli_modules/commands/review_cmd.py

Lines changed: 1 addition & 1 deletion

```diff
@@ -169,7 +169,7 @@ def review(
 
     # Summarize with AI
     with out.spinner("Generating documentation with Claude Sonnet 4.5"):
-        summarizer = Summarizer(model=doc_config.ai.model)
+        summarizer = Summarizer(model=doc_config.ai.model, max_tokens=doc_config.ai.max_tokens)
         doc_index, markdown_summary = summarizer.summarize(context, scope)
 
     out.success("Documentation generated")
```

memdocs/summarize.py

Lines changed: 15 additions & 3 deletions

```diff
@@ -25,7 +25,15 @@
 
 
 class Summarizer:
-    """AI-powered documentation summarizer."""
+    """AI-powered documentation summarizer.
+
+    Attributes:
+        api_key: Anthropic API key
+        model: Claude model name
+        max_tokens: Maximum tokens for API responses
+        client: Anthropic API client
+        rate_limiter: Rate limiter for API calls
+    """
 
     PROMPT_TEMPLATE = """You are a technical documentation AI generating machine-readable docs.
@@ -94,12 +102,13 @@ class Summarizer:
 
 Generate the YAML now:"""
 
-    def __init__(self, api_key: str | None = None, model: str = "claude-sonnet-4-5-20250929"):
+    def __init__(self, api_key: str | None = None, model: str = "claude-sonnet-4-5-20250929", max_tokens: int = DEFAULT_MAX_TOKENS):
         """Initialize summarizer.
 
         Args:
             api_key: Anthropic API key (defaults to ANTHROPIC_API_KEY env var)
             model: Claude model to use
+            max_tokens: Maximum tokens for Claude API response (default: 4096)
         """
         self.api_key = api_key or os.environ.get("ANTHROPIC_API_KEY")
 
@@ -115,6 +124,9 @@ def __init__(self, api_key: str | None = None, model: str = "claude-sonnet-4-5-2
         except Exception as e:
             raise ValueError(str(e)) from e
 
+        # Store max_tokens
+        self.max_tokens = max_tokens
+
         self.client = anthropic.Anthropic(api_key=self.api_key)
 
         # Initialize rate limiter (50 calls per minute)
@@ -144,7 +156,7 @@ def summarize(self, context: ExtractedContext, scope: ScopeInfo) -> tuple[Docume
         # Call Claude
         response = self.client.messages.create(
             model=self.model,
-            max_tokens=DEFAULT_MAX_TOKENS,
+            max_tokens=self.max_tokens,
             messages=[
                 {
                     "role": "user",
```
0 commit comments