From 4714ec3fa8144c3d2e7c5f77a2983c5b95b9b193 Mon Sep 17 00:00:00 2001
From: FirstTimeEZ <179362031+FirstTimeEZ@users.noreply.github.com>
Date: Fri, 15 Nov 2024 19:47:58 +1300
Subject: [PATCH] vulkan install instructions git bash mingw64

---
 docs/build.md | 40 ++++++++++++++++++++++++++++++++++++++++
 1 file changed, 40 insertions(+)

diff --git a/docs/build.md b/docs/build.md
index 95512415a..f93b60f0f 100644
--- a/docs/build.md
+++ b/docs/build.md
@@ -280,6 +280,46 @@ The following compilation options are also available to tweak performance (yes,
 
 **Windows**
 
+#### Git Bash MINGW64
+
+[![](https://i.imgur.com/M8dnyk5.png)](https://git-scm.com/downloads/win)
+
+Download and install [`Git-SCM`](https://git-scm.com/downloads/win) with the default settings.
+
+[![](https://i.imgur.com/hiAMq2S.png)](https://visualstudio.microsoft.com/)
+
+Download and install [`Visual Studio Community Edition`](https://visualstudio.microsoft.com/) and make sure you select the `C++` workload.
+
+[![](https://i.imgur.com/piiceV6.png)](https://cmake.org/download/)
+
+Download and install [`CMake`](https://cmake.org/download/) with the default settings.
+
+[![](https://i.imgur.com/cN0LDwJ.png)](https://vulkan.lunarg.com/sdk/home#windows)
+
+Download and install the [`Vulkan SDK`](https://vulkan.lunarg.com/sdk/home#windows) with the default settings.
+
+![](https://i.imgur.com/qTl7jbF.png)
+
+Go into your `llama.cpp` directory, right-click, select `Open Git Bash Here`, and then run the following commands.
+
+![](https://i.imgur.com/7UQ7Rf1.png)
+
+```
+cmake -B build -DGGML_VULKAN=ON
+```
+
+![](https://i.imgur.com/GlJnitl.png)
+
+```
+cmake --build build --config Release
+```
+
+Now you can load the model in conversation mode using `Vulkan`:
+
+```
+build/bin/Release/llama-cli -m "[PATH TO MODEL]" -ngl 100 -c 16384 -t 10 -n -2 -cnv
+```
+
 #### w64devkit
 
 Download and extract [w64devkit](https://github.com/skeeto/w64devkit/releases).
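Note (not part of the patch): before running the `cmake` configure step added above, it can help to confirm that Git Bash can see the Vulkan SDK and the build tools. A minimal sanity check, assuming the SDK installer has set the `VULKAN_SDK` environment variable (its default behaviour) and that the SDK's `vulkaninfo` tool is on your `PATH`:

```
# Print the SDK install location; empty output usually means you need a fresh Git Bash window
echo "$VULKAN_SDK"

# Confirm CMake is reachable from Git Bash
cmake --version

# Summarize the Vulkan devices and drivers visible to the SDK
vulkaninfo --summary
```

If `VULKAN_SDK` prints nothing, open a new Git Bash window so the environment variables written by the installer are picked up before configuring.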
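For reference, also outside the patch itself, the flags on the final `llama-cli` invocation correspond to the following llama.cpp options; the `[PATH TO MODEL]` placeholder stays as written in the patch:

```
# -m    path to the GGUF model file
# -ngl  number of model layers to offload to the GPU (100 offloads as many as possible)
# -c    context size in tokens
# -t    number of CPU threads to use
# -n    number of tokens to generate (-2 = generate until the context is filled)
# -cnv  run in conversation (chat) mode
build/bin/Release/llama-cli -m "[PATH TO MODEL]" -ngl 100 -c 16384 -t 10 -n -2 -cnv
```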