From 009563d1d7e59476b43553fd024b8e80d5694aaf Mon Sep 17 00:00:00 2001 From: HanishKVC Date: Thu, 30 May 2024 00:54:38 +0530 Subject: [PATCH] Readme: Add an entry for simplechat in the http server section --- README.md | 2 ++ 1 file changed, 2 insertions(+) diff --git a/README.md b/README.md index 4791f84af..975eced17 100644 --- a/README.md +++ b/README.md @@ -150,6 +150,8 @@ Typically finetunes of the base models below are supported as well. [llama.cpp web server](./examples/server) is a lightweight [OpenAI API](https://github.com/openai/openai-openapi) compatible HTTP server that can be used to serve local models and easily connect them to existing clients. +[simplechat](./examples/server/public_simplechat) is a simple chat client that runs in a local web browser and can be used to chat with a model served by the above web server. + **Bindings:** - Python: [abetlen/llama-cpp-python](https://github.com/abetlen/llama-cpp-python)