console.log("AI: " + a2);
console.log(JSON.parse(a2));
```

### Metal and CUDA support
To load a version of `llama.cpp` that was compiled to use Metal or CUDA, you have to build it from source with the `--metal` or `--cuda` flag before running your code that imports `node-llama-cpp`.

To do this, run this command inside of your project directory:
```bash
# For Metal support on macOS
npx node-llama-cpp download --metal

# For CUDA support
npx node-llama-cpp download --cuda
```

> In order for `node-llama-cpp` to be able to build `llama.cpp` from source, make sure you have the required dependencies of `node-gyp` installed.
>
> More info is available [here](https://github.com/nodejs/node-gyp#on-unix) (you don't have to install `node-gyp` itself, just the dependencies).

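On Debian/Ubuntu-based systems, the `node-gyp` dependencies mentioned above can typically be installed as sketched below (an assumption for `apt-get`-based distributions; package names differ on other platforms, so check the node-gyp docs linked above):

```bash
# node-gyp needs Python 3, make and a C/C++ compiler toolchain
# (example for Debian/Ubuntu; adjust for your distribution)
sudo apt-get update
sudo apt-get install -y python3 make g++
```

On macOS, installing the Xcode Command Line Tools (`xcode-select --install`) provides the equivalent toolchain.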
### CLI
```
Usage: node-llama-cpp <command> [options]