Remove duplicate OAI instructions

Author: Jesse Johnson, 2023-07-05 16:00:22 +00:00
parent 1dd61d2aa3
commit 775a299a4b


@@ -208,22 +208,6 @@ openai.api_base = "http://<Your api-server IP>:port"
 Then you can utilize llama.cpp as an OpenAI's **chat.completion** or **text_completion** API
-### API like OAI
-API example using Python Flask: [api_like_OAI.py](api_like_OAI.py)
-This example must be used with server.cpp
-```sh
-python api_like_OAI.py
-```
-After running the API server, you can use it in Python by setting the API base URL.
-```python
-openai.api_base = "http://<Your api-server IP>:port"
-```
-Then you can utilize llama.cpp as an OpenAI's **chat.completion** or **text_completion** API
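For context, the flow these (now de-duplicated) instructions describe is: point an OpenAI-style client at the local api_like_OAI.py shim and send a chat.completion request. A minimal standard-library sketch of what such a request looks like, assuming a placeholder host/port and an OpenAI-convention endpoint path (neither is taken from api_like_OAI.py itself):

```python
import json

# Assumed address of a running api_like_OAI.py instance (placeholder values);
# this is what the removed text called "http://<Your api-server IP>:port".
api_base = "http://127.0.0.1:8081"

# A request body in the OpenAI chat.completion wire format, which the
# Flask shim forwards to server.cpp.
payload = {
    "model": "local",  # model choice is made by server.cpp, not this field
    "messages": [{"role": "user", "content": "Hello, llama.cpp!"}],
}

# Endpoint path assumed to follow the OpenAI convention.
url = api_base + "/v1/chat/completions"
body = json.dumps(payload)
print(url)
```

Sending `body` to `url` with any HTTP client (or via the `openai` Python package after setting `openai.api_base`) exercises the same path the removed duplicate documented.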
### Extending the Web Front End
The default location for the static files is `examples/server/public`. You can extend the front end by running the server binary with `--path` set to `./your-directory` and importing `/completion.js` to get access to the llamaComplete() method. A simple example is below: