llama.cpp/requirements
requirements-convert_hf_to_gguf.txt         | py : use cpu-only torch in requirements.txt | 2024-07-06 11:18:03 -04:00
requirements-convert_hf_to_gguf_update.txt  | py : use cpu-only torch in requirements.txt | 2024-07-06 11:18:03 -04:00
requirements-convert_legacy_llama.txt       | py : switch to snake_case (#8305)           | 2024-07-05 07:53:33 +03:00
requirements-convert_llama_ggml_to_gguf.txt | py : switch to snake_case (#8305)           | 2024-07-05 07:53:33 +03:00