Remove duplicate OAI instructions
This commit is contained in:
parent
1dd61d2aa3
commit
775a299a4b
1 changed files with 0 additions and 16 deletions
@@ -208,22 +208,6 @@ openai.api_base = "http://<Your api-server IP>:port"

### API like OAI

API example using Python Flask: [api_like_OAI.py](api_like_OAI.py)
This example must be used with server.cpp

```sh
python api_like_OAI.py
```

After running the API server, you can use it in Python by setting the API base URL.

```python
openai.api_base = "http://<Your api-server IP>:port"
```

Then you can use llama.cpp as a drop-in replacement for OpenAI's **chat.completion** or **text_completion** APIs.
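To make the shape of such a request concrete, here is a minimal sketch of the JSON body a **chat.completion** call carries. The helper `build_chat_request` and the model name `"local-model"` are hypothetical illustrations, not part of api_like_OAI.py:

```python
import json

def build_chat_request(messages, model="local-model", temperature=0.7):
    """Build the JSON body of an OpenAI-style chat.completion request.

    This helper is illustrative only; the openai client library builds
    an equivalent body for you behind the scenes.
    """
    return {
        "model": model,
        "messages": messages,
        "temperature": temperature,
    }

# A one-turn conversation, serialized the way it would go over the wire.
payload = build_chat_request([{"role": "user", "content": "Hello!"}])
body = json.dumps(payload)

# With the legacy openai Python package (pre-1.0) pointed at the local
# server, the equivalent request would be issued as:
#
#   import openai
#   openai.api_base = "http://<Your api-server IP>:port"
#   openai.api_key = "none"  # the local server does not check the key
#   reply = openai.ChatCompletion.create(
#       model="local-model",
#       messages=[{"role": "user", "content": "Hello!"}],
#   )
```

The commented-out call shows the same request issued through the client library; only `api_base` has to change for existing OpenAI client code to talk to the local server.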

### Extending the Web Front End

The default location for the static files is `examples/server/public`. You can extend the front end by running the server binary with `--path` set to `./your-directory` and importing `/completion.js` to get access to the `llamaComplete()` method. A simple example is below: