llama.cpp/ggml
Latest commit: 432df2d5f9 by Johannes Gäßler, 2025-01-15 12:51:37 +01:00

RoPE: fix back, CUDA support for back + noncont. (#11240)

* RoPE: fix back, CUDA support for back + noncont.
* fix comments reg. non-cont. RoPE support [no-ci]
include         RoPE: fix back, CUDA support for back + noncont. (#11240)    2025-01-15 12:51:37 +01:00
src             RoPE: fix back, CUDA support for back + noncont. (#11240)    2025-01-15 12:51:37 +01:00
.gitignore      vulkan : cmake integration (#8119)                           2024-07-13 18:12:39 +02:00
CMakeLists.txt  GGUF: C++ refactor, backend support, misc fixes (#11030)     2025-01-07 18:01:58 +01:00