server: tests: add infinite loop scenario
parent 68574c6f98
commit b0b6d83c76

1 changed file with 2 additions and 2 deletions
@@ -42,7 +42,7 @@ Feature: llama.cpp server
       """
       Write another very long music lyrics.
       """
-    And 256 max tokens to predict
+    And 32 max tokens to predict
     Given concurrent completion requests
     Then the server is busy
     And all slots are busy
@@ -62,7 +62,7 @@ Feature: llama.cpp server
       """
       Write another very long music lyrics.
       """
-    And 256 max tokens to predict
+    And 32 max tokens to predict
     And streaming is enabled
     Given concurrent OAI completions requests
     Then the server is busy
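Both hunks lower the prediction cap from 256 to 32 tokens in the concurrent-request scenarios. As a minimal sketch of what a step like "And 32 max tokens to predict" translates into on the wire (assuming the llama.cpp server's `/completion` endpoint caps generation via an `n_predict` field, as documented in the server README; the helper name here is hypothetical):

```python
import json

# Hypothetical helper: serialize a completion request with a hard cap on
# the number of tokens to predict. Assumption: the llama.cpp server reads
# this cap from an "n_predict" field in the JSON body.
def build_completion_payload(prompt: str, n_predict: int) -> str:
    return json.dumps({"prompt": prompt, "n_predict": n_predict})

payload = build_completion_payload("Write another very long music lyrics.", 32)
print(payload)
```

With a long prompt and a small `n_predict`, each concurrent slot finishes quickly, which is what lets the scenario assert on busy slots without generating for long.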