vulkan install instructions git bash mingw64

FirstTimeEZ 2024-11-15 19:47:58 +13:00 committed by GitHub
parent 4802ad350b
commit 4714ec3fa8

@@ -280,6 +280,46 @@ The following compilation options are also available to tweak performance (yes,
**Windows**
#### Git Bash MINGW64
[![](https://i.imgur.com/M8dnyk5.png)](https://git-scm.com/downloads/win)
Download and install [`Git-SCM`](https://git-scm.com/downloads/win) with the default settings.
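To confirm the install worked, you can open a new Git Bash window and check the version (a quick sanity check; the exact version string will vary):
```
git --version
```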
[![](https://i.imgur.com/hiAMq2S.png)](https://visualstudio.microsoft.com/)
Download and install [`Visual Studio Community Edition`](https://visualstudio.microsoft.com/) and make sure you select the `Desktop development with C++` workload.
[![](https://i.imgur.com/piiceV6.png)](https://cmake.org/download/)
Download and install [`CMake`](https://cmake.org/download/) with the default settings.
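If you chose to add CMake to your `PATH` during install, a version check from Git Bash confirms it is reachable:
```
cmake --version
```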
[![](https://i.imgur.com/cN0LDwJ.png)](https://vulkan.lunarg.com/sdk/home#windows)
Download and install the [`Vulkan SDK`](https://vulkan.lunarg.com/sdk/home#windows) with the default settings.
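To confirm the SDK is visible to the build tools, you can check the `VULKAN_SDK` environment variable (the LunarG installer sets it by default) and run the bundled `vulkaninfo` tool from a fresh Git Bash window; flags and output vary between SDK releases:
```
# VULKAN_SDK is set by the LunarG installer; vulkaninfo ships with the SDK
echo $VULKAN_SDK
vulkaninfo --summary
```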
![](https://i.imgur.com/qTl7jbF.png)
Go into your `llama.cpp` directory, right-click and select `Open Git Bash Here`, then run the following commands:
![](https://i.imgur.com/7UQ7Rf1.png)
```
cmake -B build -DGGML_VULKAN=ON
```
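If CMake does not detect the Visual Studio toolchain on its own, you can name the generator explicitly; this variant assumes VS 2022 is the edition you installed (adjust the generator name otherwise):
```
# optional: only needed if the default generator is not picked up; assumes VS 2022
cmake -B build -DGGML_VULKAN=ON -G "Visual Studio 17 2022"
```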
![](https://i.imgur.com/GlJnitl.png)
```
cmake --build build --config Release
```
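The Release build can take a while; `cmake --build` also accepts a parallel jobs flag if you want to use more cores (pick a count that suits your CPU):
```
# optional: build with 8 parallel jobs
cmake --build build --config Release -j 8
```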
Now you can load the model in conversation mode using `Vulkan`:
```
build/bin/release/llama-cli -m "[PATH TO MODEL]" -ngl 100 -c 16384 -t 10 -n -2 -cnv
```
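For reference, `-ngl 100` offloads up to 100 layers to the GPU, `-c 16384` sets the context size, `-t 10` sets the CPU thread count, `-n -2` keeps generating until the context is full, and `-cnv` enables conversation mode. A filled-in example, with a purely illustrative model path, would look like this:
```
# hypothetical model path; substitute your own .gguf file
build/bin/release/llama-cli -m "C:/models/llama-3-8b-instruct.Q4_K_M.gguf" -ngl 100 -c 16384 -t 10 -n -2 -cnv
```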
#### w64devkit
Download and extract [w64devkit](https://github.com/skeeto/w64devkit/releases).