@@ -172,11 +172,11 @@ const OAI = new OpenAI({
// LLM_MODEL=llama3.2:3b # For local testing
```
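
The commented `LLM_MODEL` line suggests the same client can be pointed at a local, OpenAI-compatible server such as Ollama for testing. A minimal sketch of that configuration, assuming `LLM_BASE_URL` and `LLM_API_KEY` variable names (only `LLM_MODEL` is shown in the post):

```typescript
import OpenAI from 'openai';

// Sketch only: LLM_BASE_URL and LLM_API_KEY are assumed names, not from the post.
// Ollama exposes an OpenAI-compatible endpoint at http://localhost:11434/v1.
const OAI = new OpenAI({
  baseURL: process.env.LLM_BASE_URL ?? 'http://localhost:11434/v1',
  apiKey: process.env.LLM_API_KEY ?? 'ollama', // Ollama ignores the key, but it must be non-empty
});
```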

- ### Prompt Templates
+ ### 3. Prompt Templates

The system uses three types of LLM calls, each with its own prompt template:

- 1. **Query Generation** (`queryPrompt`): Generates search queries based on the research topic
+ #### 1. **Query Generation** (`queryPrompt`): Generates search queries based on the research topic

```typescript title:prompts/query.ts
export default (date: string, topic: string) => `
@@ -209,7 +209,7 @@ Provide your response only in JSON format:
`
```
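
A hedged sketch of how `queryPrompt` might be wired to the chat completion call and its JSON-only response parsed (the wrapper function and its name are assumptions, not the post's exact code):

```typescript
import queryPrompt from './prompts/query';

// Sketch only: assumes the OAI client and LLM_MODEL configured earlier.
const generateQuery = async (topic: string) => {
  const completion = await OAI.chat.completions.create({
    model: process.env.LLM_MODEL!,
    messages: [{ role: 'user', content: queryPrompt(new Date().toISOString(), topic) }],
    response_format: { type: 'json_object' }, // the template asks for JSON-only output
  });
  // The prompt instructs the model to answer only in JSON, so parse the message body.
  return JSON.parse(completion.choices[0].message.content ?? '{}');
};
```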

- 2. **Content Summarization** (`summarizerPrompt`): Summarizes content in the context of research topics
+ #### 2. **Content Summarization** (`summarizerPrompt`): Summarizes content in the context of research topics

```typescript title:prompts/summarizer.ts
export default (topics: string[]) => `<GOAL>
@@ -237,12 +237,12 @@ When EXTENDING an existing summary:
- Start directly with the updated summary, without preamble or titles. Do not use XML tags in the output.
</FORMATTING>

- <Task>
+ <TASK>
Think carefully about the provided Context first. Then generate a summary of the context to address the User Input.
- </Task>`
+ </TASK>`
```

- 3. **Research Reflection** (`reflectionPrompt`): Analyzes current research to identify knowledge gaps
+ #### 3. **Research Reflection** (`reflectionPrompt`): Analyzes current research to identify knowledge gaps

```typescript title:prompts/reflect.ts
export default (topics: string[]) => `
@@ -273,7 +273,7 @@ These prompts work together to create a research system that:
- Attempts to identify knowledge gaps
- Continues research if required
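
Conceptually, the three prompts drive an iterative loop: generate a query, search, fold the results into a running summary, then reflect and either stop or continue with a follow-up query. The sketch below is illustrative only; the helper names, signatures, and the reflection output shape are assumptions, and in the actual system these steps are decoupled and driven by topic messages, as described in the next section.

```typescript
// Illustrative sketch only: names and the Reflection shape are assumed.
type Reflection = { knowledgeGap?: string; followUpQuery?: string };

async function researchLoop(
  topic: string,
  maxLoops: number,
  deps: {
    generateQuery: (topic: string) => Promise<string>;
    search: (query: string) => Promise<string>;
    summarize: (topic: string, content: string, existing: string) => Promise<string>;
    reflect: (topic: string, summary: string) => Promise<Reflection>;
  },
): Promise<string> {
  let summary = '';
  let query = await deps.generateQuery(topic);
  for (let i = 0; i < maxLoops; i++) {
    const results = await deps.search(query);
    summary = await deps.summarize(topic, results, summary);
    const reflection = await deps.reflect(topic, summary);
    if (!reflection.followUpQuery) break; // no remaining knowledge gap: stop researching
    query = reflection.followUpQuery;     // otherwise continue with the follow-up query
  }
  return summary;
}
```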

- ### 3. Topic-based Research Pipeline
+ ### 4. Topic-based Research Pipeline

The system uses a topic-based approach to conduct research, with different message types for each stage of the process:
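
Only `ReflectTopicMessage extends ResearchTopicMessage<'reflect'>` is visible in the hunk below, which implies a discriminated union keyed by stage. A sketch of what the base shape could look like; apart from the `'reflect'` type parameter, the stage names and fields here are assumptions:

```typescript
// Sketch only: stage names and payload fields are inferred, not taken from the post.
type ResearchStage = 'query' | 'search' | 'summarize' | 'reflect';

interface ResearchTopicMessage<S extends ResearchStage> {
  stage: S;      // discriminant used to route messages published to the research topic
  topic: string; // the research topic being investigated
}

interface ReflectTopicMessage extends ResearchTopicMessage<'reflect'> {
  summary: string; // running summary to analyze for knowledge gaps (assumed field)
}
```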
@@ -312,7 +312,7 @@ interface ReflectTopicMessage extends ResearchTopicMessage<'reflect'> {
}
```

- ### 4. Content Processing
+ ### 5. Content Processing

The system includes HTML cleaning and markdown conversion:
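
The `cleanHtml` body is elided by this hunk. One plausible shape for the cleaning and markdown conversion, assuming `cheerio` to strip noisy elements and `turndown` for the conversion (the post's actual implementation may differ):

```typescript
import * as cheerio from 'cheerio';
import TurndownService from 'turndown';

// Sketch only: library choices and selectors are assumptions.
function htmlToMarkdown(html: string): string {
  const $ = cheerio.load(html);
  // Remove elements that add noise rather than content before summarization.
  $('script, style, nav, footer, iframe, noscript').remove();
  const cleaned = $('body').html() ?? '';
  // Convert what remains to markdown so the LLM gets compact, readable text.
  return new TurndownService().turndown(cleaned);
}
```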
@@ -355,7 +355,7 @@ function cleanHtml(html: string): string {
}
```

- ### 5. The Nitric API Implementation
+ ### 6. The Nitric API Implementation

The main API implementation ties everything together:
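
The implementation itself follows in the post. As a rough sketch of the Nitric pieces involved, an HTTP endpoint accepts a research request and publishes the first stage message to a topic; the service, topic, and route names below are assumptions:

```typescript
// services/api.ts — sketch only: 'main', 'research', and '/research' are assumed names.
import { api, topic } from '@nitric/sdk';

const publisher = topic('research').allow('publish');
const mainApi = api('main');

// HTTP entry point: kick off research by publishing the first pipeline stage.
mainApi.post('/research', async (ctx) => {
  const { topic: subject } = ctx.req.json();
  await publisher.publish({ stage: 'query', topic: subject });
  ctx.res.body = 'research started';
  return ctx;
});
```

A separate service can then subscribe to the same topic and process each stage message:

```typescript
// services/research.ts — sketch only: the subscriber handles each stage message.
import { topic } from '@nitric/sdk';

topic('research').subscribe(async (ctx) => {
  const message = ctx.req.json();
  // ...dispatch on message.stage: generate query, search, summarize, reflect...
});
```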