44 changes: 44 additions & 0 deletions docs/cody/faq.mdx
@@ -136,3 +136,47 @@ cody chat --model '$name_of_the_model' -m 'Hi Cody!'
```

For example, to use Claude 3.5 Sonnet, you'd run the following command in your terminal: `cody chat --model 'claude-3.5-sonnet' -m 'Hi Cody!'`

## OpenAI o1

### What are OpenAI o1 best practices?

#### Context Management

- Provide focused, relevant context
- Use file references strategically
- Keep initial requests concise

#### Model Selection

- **o1-preview**: Best for complex reasoning and planning
- **o1-mini**: More reliable for straightforward tasks
- **Sonnet 3.5**: Better for tasks requiring longer outputs
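The model choices above can be applied with the `cody chat --model` invocation shown earlier in this FAQ. A brief sketch follows; note that the exact model identifiers (`o1-preview`, `o1-mini`) are assumptions and may differ from the IDs your Cody deployment exposes:

```shell
# Complex reasoning or planning: o1-preview (model ID is an assumption)
cody chat --model 'o1-preview' -m 'Outline a step-by-step plan to split this service into two.'

# Straightforward, faster tasks: o1-mini (model ID is an assumption)
cody chat --model 'o1-mini' -m 'Write a one-line summary of this function.'
```

Check the model list available to your instance before relying on these identifiers.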

#### Prompting Strategy

- Be specific and direct
- Include complete requirements upfront
- Request brief responses when possible
- Use step-by-step approach for complex tasks

#### Response Time

- Start with smaller contexts
- Use o1-mini for faster responses
- Consider breaking complex tasks into stages

#### Quality

- Provide clear acceptance criteria
- Include relevant context but avoid excess
- Use specific examples when possible

### What are the known limitations?

#### Technical Constraints

- 45k input token limit
- 4k output token limit
- No streaming responses for the o1 series
- Limited context window
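Given the 45k input / 4k output token limits above, a client-side pre-flight check can help keep requests inside budget. The sketch below is illustrative only: it uses a rough 4-characters-per-token heuristic rather than the model's real tokenizer, and the helper names are hypothetical; only the limits come from this FAQ.

```python
# Rough token budgeting for o1 requests.
# The 45k input limit comes from this FAQ; the ~4 chars/token ratio is a
# crude heuristic, not the model's actual tokenizer.

O1_INPUT_TOKEN_LIMIT = 45_000
CHARS_PER_TOKEN = 4  # heuristic, good enough for a pre-flight check


def estimate_tokens(text: str) -> int:
    """Approximate token count from character length."""
    return max(1, len(text) // CHARS_PER_TOKEN)


def fits_input_budget(prompt: str, context_files: list[str]) -> bool:
    """Return True if prompt plus context likely fits the 45k input limit."""
    total = estimate_tokens(prompt) + sum(estimate_tokens(f) for f in context_files)
    return total <= O1_INPUT_TOKEN_LIMIT


def trim_context(prompt: str, context_files: list[str]) -> list[str]:
    """Drop trailing context files until the request fits the budget."""
    kept = list(context_files)
    while kept and not fits_input_budget(prompt, kept):
        kept.pop()  # drop the last (assumed least important) file first
    return kept
```

This mirrors the "provide focused, relevant context" advice above: trimming context before sending avoids hitting the hard input limit mid-request.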