Commit 8f3dbaf
Fix AI narration truncation: reduce chunk size to 8, increase maxTokens to 8192
Chunk 1 returned 9 of 10 expected items: the response was truncated at
4096 tokens. Reducing the chunk size from 10 to 8 and doubling maxTokens to
8192 gives adequate headroom for the conversational prompt style, which
produces longer narratives per discovery.
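The trade-off above can be sketched as follows. This is an illustrative reconstruction, not the actual diff (which was not captured on this page): the names `CHUNK_SIZE`, `MAX_TOKENS`, and `chunk` are hypothetical, and the per-item token estimate is an assumption.

```typescript
// Hypothetical sketch of the change described in the commit message.
const CHUNK_SIZE = 8;    // was 10: fewer discoveries per request
const MAX_TOKENS = 8192; // was 4096: doubled output budget

// Split a list of discoveries into batches of at most `size` items.
function chunk<T>(items: T[], size: number): T[][] {
  const out: T[][] = [];
  for (let i = 0; i < items.length; i += size) {
    out.push(items.slice(i, i + size));
  }
  return out;
}

// If a conversational narrative runs ~400 tokens per discovery, 10 items
// already brush against a 4096-token cap (the observed truncation);
// 8 items against 8192 tokens leaves roughly 2x headroom.
const discoveries = Array.from({ length: 20 }, (_, i) => `discovery-${i}`);
const batches = chunk(discoveries, CHUNK_SIZE);
console.log(batches.map(b => b.length)); // batch sizes: 8, 8, 4
```

Shrinking the chunk alone would have helped, but doubling maxTokens as well means a single unusually long narrative cannot push the batch back over the cap.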
Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>

1 parent fc7ebac
1 file changed: +2 −2 lines changed
[Diff content not rendered in this capture: two one-line changes, at lines 1040 and 1072 of the file.]