This repository was archived by the owner on May 20, 2025. It is now read-only.

Commit 13923cd

fix install instructions.
1 parent 632608a commit 13923cd

1 file changed (+10, −8 lines)


docs/guides/deno/byo-deep-research.mdx

Lines changed: 10 additions & 8 deletions
````diff
@@ -34,14 +34,14 @@ Before we start implementing our research system, let's set up the project and i
 
 1. **Create a new Nitric project**:
 
-   ```bash
-   # Install Nitric CLI if you haven't already
-   npm install -g @nitric/cli
-
-   # Create a new Nitric project
-   nitric new deep-research
-   cd deep-research
-   ```
+   If you haven't already, install the Nitric CLI by following the [official installation guide](https://nitric-docs-git-docs-byo-deep-research-nitrictech.vercel.app/docs/get-started/installation).
+
+   Then create a new Nitric project with:
+
+   ```bash
+   nitric new deep-research
+   cd deep-research
+   ```
 
 2. **Configure dependencies in deno.json**:
 
@@ -646,7 +646,7 @@ Production deployment is as simple as updating these environment variables:
 
 To test the system locally:
 
-1. **Install and Start Ollama**:
+1. **Install and Start Ollama (optional)**:
 
 First, [install Ollama](https://ollama.ai/) for your operating system.
 
@@ -657,6 +657,8 @@ To test the system locally:
 ollama serve
 ```
 
+> You can skip this step if you want to use OpenAI or another hosted solution as your LLM provider.
+
 2. **Configure Environment**:
 
 Create a `.env` file with local testing configuration:
````

0 commit comments