From da4a773cbcb84e7a9181b475e2239b83eda46cab Mon Sep 17 00:00:00 2001
From: Georgi Gerganov
Date: Tue, 18 Jul 2023 13:30:26 +0300
Subject: [PATCH] ci : add README.md

---
 ci/README.md | 19 +++++++++++++++++++
 1 file changed, 19 insertions(+)
 create mode 100644 ci/README.md

diff --git a/ci/README.md b/ci/README.md
new file mode 100644
index 000000000..9ede632ec
--- /dev/null
+++ b/ci/README.md
@@ -0,0 +1,19 @@
+# CI
+
+In addition to [GitHub Actions](https://github.com/ggerganov/llama.cpp/actions), `llama.cpp` uses a custom CI framework:
+
+https://github.com/ggml-org/ci
+
+It monitors the `master` branch for new commits and runs the [ci/run.sh](run.sh) script on dedicated cloud instances. This
+allows us to execute heavier workloads than GitHub Actions permits. Over time, the cloud instances will
+be scaled to cover various hardware architectures, including GPU and Apple Silicon instances.
+
+Collaborators can optionally trigger a CI run by adding the `ggml-ci` keyword to their commit message.
+Only the branches of this repo are monitored for this keyword.
+
+It is good practice, before publishing changes, to execute the full CI locally on your machine:
+
+```bash
+mkdir tmp
+bash ./ci/run.sh ./tmp/results ./tmp/mnt
+```
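
Note on the `ggml-ci` trigger described in the README above: the sketch below illustrates, under the assumption that the framework simply matches the keyword as a substring of the commit message, how such a check could look. It is not the framework's actual implementation; the commit message text is hypothetical.

```shell
#!/bin/sh
# Hypothetical commit message; only the "ggml-ci" keyword matters.
msg="ggml : fix aarch64 build (ggml-ci)"

# Assumed trigger logic: a plain substring match on the keyword.
if printf '%s' "$msg" | grep -q 'ggml-ci'; then
  echo "trigger CI"
else
  echo "skip CI"
fi
```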