**Windows**

#### Git Bash MINGW64

Download and install [`Git-SCM`](https://git-scm.com/downloads/win) with the default settings.

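If you do not already have a local copy of `llama.cpp`, you can clone it now from Git Bash (the URL below is the upstream repository; adjust it if you are working from a fork):

```
git clone https://github.com/ggml-org/llama.cpp
cd llama.cpp
```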
Download and install [`Visual Studio Community Edition`](https://visualstudio.microsoft.com/) and make sure you select the `C++` workload during installation.

Download and install [`CMake`](https://cmake.org/download/) with the default settings.

Download and install the [`Vulkan SDK`](https://vulkan.lunarg.com/sdk/home#windows) with the default settings.

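Optionally, you can confirm that the Vulkan loader and your GPU driver are working before you build. The SDK ships a `vulkaninfo` tool, and the installer normally adds it to your `PATH`; if your GPU shows up in the output, Vulkan is usable:

```
vulkaninfo --summary
```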
Go into your `llama.cpp` directory, right-click, select `Open Git Bash Here`, and then run the following commands:

```
cmake -B build -DGGML_VULKAN=ON
```
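The `-DGGML_VULKAN=ON` option enables the Vulkan backend. If CMake reports that it cannot find Vulkan, note that its `FindVulkan` module uses the `VULKAN_SDK` environment variable as a hint; the SDK installer normally sets it system-wide, but you can also export it for the current shell (the path below is a placeholder for your actual install location):

```
export VULKAN_SDK="[PATH TO VULKAN SDK]"
cmake -B build -DGGML_VULKAN=ON
```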
```
cmake --build build --config Release
```
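The Release build can take a while. If you have several CPU cores, `cmake --build` accepts a `-j <jobs>` option (CMake 3.12 and later) that is passed through to the underlying build tool; adjust the number to your machine:

```
cmake --build build --config Release -j 8
```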
Now you can load the model in conversation mode using `Vulkan`:

```
build/bin/release/llama-cli -m "[PATH TO MODEL]" -ngl 100 -c 16384 -t 10 -n -2 -cnv
```
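Here `-ngl 100` offloads up to 100 layers to the GPU, `-c 16384` sets the context size, `-t 10` sets the number of CPU threads, `-n -2` keeps generating until the context is full, and `-cnv` starts conversation mode. The same build also produces `llama-server`, so if you would rather expose the model over an HTTP API than chat in the terminal, something like the following works (same placeholder for the model path; the port is up to you):

```
build/bin/release/llama-server -m "[PATH TO MODEL]" -ngl 100 -c 16384 --port 8080
```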
#### w64devkit

Download and extract [w64devkit](https://github.com/skeeto/w64devkit/releases).