docs/NODE_GUIDE.md

Running a Dria Compute Node is pretty straightforward.

### Software

You only need **Docker** to run the node! You can check if you have it by printing its version:

```sh
docker -v
```

> [!CAUTION]
>
> On **Windows** machines, Docker Desktop is required to be running with **WSL2**. You can check the Docker Desktop Windows installation guide from [here](https://docs.docker.com/desktop/install/windows-install/).
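
If you are unsure whether WSL2 is active on your Windows machine, one quick check (assuming WSL is already installed) is to list your distributions together with the WSL version each one uses:

```sh
# from PowerShell or cmd: lists installed distros and whether they run under WSL 1 or 2
wsl -l -v
```
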
### Hardware
**To learn about hardware specifications such as required CPU and RAM, please refer to [node specifications](./NODE_SPECS.md).**
To be able to run a node, we need to make a few simple preparations. Follow the steps below one by one.

We have a [cross-platform node launcher](https://github.com/firstbatchxyz/dkn-compute-launcher) to easily set up the environment and run the compute node. We will install that first.

Download the appropriate ZIP file for your system using the commands below or from your [browser](https://github.com/firstbatchxyz/dkn-compute-launcher/releases/tag/v0.0.1). Make sure to replace the URL with the correct version for your operating system and architecture.
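
For instance, on a Linux machine with an amd64 CPU, downloading and extracting the launcher could look roughly like the sketch below; the asset file name here is only illustrative, so copy the exact URL for your platform and version from the releases page:

```sh
# download the launcher release ZIP (illustrative URL; pick the right asset for your OS/arch)
curl -LO https://github.com/firstbatchxyz/dkn-compute-launcher/releases/download/v0.0.1/dkn-compute-launcher-linux-amd64.zip

# extract it into the current directory
unzip dkn-compute-launcher-linux-amd64.zip
```
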
### 2. Prepare Environment Variables

With our launcher, setting up the environment variables happens on the fly: just run the `dkn-compute-launcher` CLI application directly, and it will ask you to enter the required environment variables if you don't have them! This way, you won't have to copy and create the environment variables manually; the CLI does it for you.

If you prefer this method, you can move directly on to the [Usage](#usage) section. If you would like to do this part manually, you can continue reading this section.
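
For reference, speed-running the setup this way is just a matter of starting the launcher; a minimal sketch, assuming the extracted binary is named `dkn-compute-launcher` (the actual file name may include your platform) and sits in the current directory:

```sh
# the launcher will prompt for any environment variables it cannot find
./dkn-compute-launcher
```
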

#### Create `.env` File

Dria Compute Node makes use of several environment variables. Let's create a `.env` file from the given example first:

```sh
cp .env.example .env
```

We will fill out the missing parts within the `.env` file in a moment.

> [!NOTE]
>
### 3. Prepare Ethereum Wallet

Dria makes use of the same Ethereum wallet, which is the recipient of your hard-earned rewards! Place your private key at `DKN_WALLET_SECRET_KEY` in `.env` without the `0x` prefix. It should look something like:
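
For illustration only, the relevant line in `.env` has the shape below; the value shown is a meaningless placeholder, not a real key:

```sh
# 64 hexadecimal characters, with no 0x prefix (placeholder value!)
DKN_WALLET_SECRET_KEY=0123456789abcdef0123456789abcdef0123456789abcdef0123456789abcdef
```
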

> Always make sure your private key is within the .gitignore'd `.env` file, nowhere else! To be even safer, you can use a throw-away wallet; you can always transfer your claimed rewards to a main wallet afterwards.

### 4. Setup LLM Provider
#### For Ollama

First, you have to install Ollama, if you haven't already! See their [download page](https://ollama.com/download) and follow the instructions there. The models that we want to use have to be pulled into Ollama before we can use them.

> [!TIP]
>
> The compute node is set to download any missing model automatically at the start by default. This is enabled via `OLLAMA_AUTO_PULL=true` in `.env`. If you would like to disable this feature, set `OLLAMA_AUTO_PULL=false` and then continue reading this section; otherwise, you can skip to [optional services](#optional-services).
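
In `.env` terms, this tip boils down to a single flag; shown here with auto-pull disabled, which is the manual route described in the rest of this section:

```sh
# set to true (the default) to let the compute node pull missing models for you
OLLAMA_AUTO_PULL=false
```
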

First, you must **pull a small embedding model that is used internally**:
```sh
ollama pull hellord/mxbai-embed-large-v1:f16
```

For the models that you choose (see the list of models just below, [here](#1-choose-models)), you can download them with the same command. Note that if the model is large, pulling it may take a while. For example:
```sh
# example: pull one of the models you have chosen
ollama pull llama3.1:latest
```
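
Once the pulls finish, you can confirm which models are available locally; `ollama list` prints every model that has been pulled along with its size:

```sh
# show all locally available Ollama models
ollama list
```
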
#### Optional Services