@@ -206,6 +206,10 @@ Options:
 ronment variable                                   [string] [default: "latest"]
   -a, --arch         The architecture to compile llama.cpp for  [string]
   -t, --nodeTarget   The Node.js version to compile llama.cpp for. Example: v18.0.0  [string]
+      --metal        Compile llama.cpp with Metal support. Can also be set via the NODE_LLAMA_CPP_METAL environment variable  [boolean] [default: false]
+      --cuda         Compile llama.cpp with CUDA support. Can also be set via the NODE_LLAMA_CPP_CUDA environment variable  [boolean] [default: false]
       --skipBuild, --sb  Skip building llama.cpp after downloading it  [boolean] [default: false]
   -v, --version      Show version number  [boolean]
 ```
@@ -220,6 +224,10 @@ Options:
   -h, --help        Show help  [boolean]
   -a, --arch        The architecture to compile llama.cpp for  [string]
   -t, --nodeTarget  The Node.js version to compile llama.cpp for. Example: v18.0.0  [string]
+      --metal       Compile llama.cpp with Metal support. Can also be set via the NODE_LLAMA_CPP_METAL environment variable  [boolean] [default: false]
+      --cuda        Compile llama.cpp with CUDA support. Can also be set via the NODE_LLAMA_CPP_CUDA environment variable  [boolean] [default: false]
   -v, --version     Show version number  [boolean]
 ```
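Per the help text being added above, the new flags (or their corresponding environment variables) would be passed to the CLI roughly as follows — a sketch assuming the `download` and `build` commands shown in the surrounding docs accept them:

```shell
# Enable CUDA support via the new flag (hedged: exact command per the docs above)
npx --no node-llama-cpp download --cuda

# Equivalent via the environment variable named in the help text
NODE_LLAMA_CPP_METAL=true npx --no node-llama-cpp build
```

Note the flag and the environment variable are alternatives; setting either enables the corresponding backend at compile time.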