Readme: Add an entry for simplechat in the http server section
parent: 48f02e0b5c
commit: 009563d1d7
1 changed file with 2 additions and 0 deletions
```diff
@@ -150,6 +150,8 @@ Typically finetunes of the base models below are supported as well.
 
 [llama.cpp web server](./examples/server) is a lightweight [OpenAI API](https://github.com/openai/openai-openapi) compatible HTTP server that can be used to serve local models and easily connect them to existing clients.
 
+[simplechat](./examples/server/public_simplechat) is a simple chat client, which can be used to chat with the model exposed using above web server, from a local web browser.
+
 **Bindings:**
 
 - Python: [abetlen/llama-cpp-python](https://github.com/abetlen/llama-cpp-python)
```
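Since the diff describes an OpenAI API compatible HTTP server, a chat client like simplechat talks to it with an ordinary OpenAI-style chat completions request. Below is a minimal sketch of such a request using only the Python standard library; the host, port, and route (`/v1/chat/completions`) are assumptions about a locally running server instance, not taken from this commit.

```python
import json
from urllib import request

# Assumed local endpoint for the llama.cpp web server's
# OpenAI-compatible chat completions route (host/port are illustrative).
URL = "http://127.0.0.1:8080/v1/chat/completions"

# Build an OpenAI-style chat request for the locally served model.
payload = {
    "messages": [
        {"role": "user", "content": "Hello from a local client!"}
    ],
    "temperature": 0.7,
}
body = json.dumps(payload).encode("utf-8")

req = request.Request(
    URL,
    data=body,
    headers={"Content-Type": "application/json"},
    method="POST",
)

# Uncomment once a server is actually running at URL:
# with request.urlopen(req) as resp:
#     reply = json.load(resp)
#     print(reply["choices"][0]["message"]["content"])

print(req.get_method(), req.get_full_url())
```

A browser-based client such as simplechat sends the same JSON shape via `fetch`; only the transport differs.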