From 367b59a15cab84e6f25c2e65a26833f7019511c7 Mon Sep 17 00:00:00 2001
From: Pierrick HYMBERT
Date: Tue, 20 Feb 2024 22:45:30 +0100
Subject: [PATCH] server: tests: check for infinite loops

---
 examples/server/tests/features/server.feature | 11 ++++++++---
 1 file changed, 8 insertions(+), 3 deletions(-)

diff --git a/examples/server/tests/features/server.feature b/examples/server/tests/features/server.feature
index 5a580e5f8..c4d821d74 100644
--- a/examples/server/tests/features/server.feature
+++ b/examples/server/tests/features/server.feature
@@ -54,6 +54,10 @@ Feature: llama.cpp server
       It was her greeting to Prince Vassily, a man high in rank and office, who was the first to arrive at her soirée.
       """
+    And a prompt:
+      """
+      Write another very long music lyrics.
+      """
     Given concurrent completion requests
     Then the server is busy
     And all slots are busy
@@ -65,7 +69,7 @@ Feature: llama.cpp server
   Scenario: Multi users OAI Compatibility
     Given a system prompt "You are an AI assistant."
     And a model tinyllama-2
-    And 1024 max tokens to predict
+    And 512 max tokens to predict
     And streaming is enabled
     Given a prompt:
       """
@@ -77,11 +81,12 @@ Feature: llama.cpp server
       """
     And a prompt:
      """
-      Write yet another very long music lyrics.
+      I believe the meaning of life is
       """
     Given concurrent OAI completions requests
     Then the server is busy
     And all slots are busy
     Then the server is idle
     And all slots are idle
-    Then all prompts are predicted
\ No newline at end of file
+    Then all prompts are predicted
+
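
Note: the "Then all prompts are predicted" assertion exercised above lives in the Python step
definitions, which are not part of this patch. A minimal, hypothetical behave step sketching
what that check could look like follows; the attribute name context.completions and the
response field accessed below are assumptions for illustration, not taken from the repository.

    # Hypothetical sketch of the "all prompts are predicted" step; the real
    # steps.py under examples/server/tests may differ.
    from behave import then


    @then(u'all prompts are predicted')
    def step_all_prompts_are_predicted(context):
        # Assumed attribute: completions collected by the concurrent-request steps.
        completions = getattr(context, 'completions', [])
        assert completions, "no completions were collected"
        for completion in completions:
            # A non-empty completion that stopped on its own is taken as
            # evidence the generation did not loop until the token cap.
            assert completion.get('content'), "empty completion content"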