From 1ecda0d13eb44793b8926360ad2e29814bdbb86b Mon Sep 17 00:00:00 2001 From: Pierrick HYMBERT Date: Tue, 20 Feb 2024 23:35:44 +0100 Subject: [PATCH] server: tests: disable issue 3969 scenario --- examples/server/tests/features/server.feature | 9 +++++++-- examples/server/tests/tests.sh | 2 +- 2 files changed, 8 insertions(+), 3 deletions(-) diff --git a/examples/server/tests/features/server.feature b/examples/server/tests/features/server.feature index 77e8b9088..df376b0f2 100644 --- a/examples/server/tests/features/server.feature +++ b/examples/server/tests/features/server.feature @@ -5,11 +5,13 @@ Feature: llama.cpp server Then the server is starting Then the server is healthy + @llama.cpp Scenario: Health When the server is healthy Then the server is ready And all slots are idle + @llama.cpp Scenario Outline: Completion Given a completion request with maximum tokens Then tokens are predicted @@ -19,6 +21,7 @@ Feature: llama.cpp server | I believe the meaning of life is | 128 | 128 | | Write a joke about AI | 512 | 512 | + @llama.cpp Scenario Outline: OAI Compatibility Given a system prompt And a user prompt @@ -33,6 +36,7 @@ Feature: llama.cpp server | llama-2 | You are ChatGPT. | Say hello. | 64 | false | 64 | | codellama70b | You are a coding assistant. | Write the fibonacci function in c++. | 512 | true | 512 | + @llama.cpp Scenario: Multi users Given a prompt: """ @@ -50,7 +54,7 @@ Feature: llama.cpp server And all slots are idle Then all prompts are predicted - + @llama.cpp Scenario: Multi users OAI Compatibility Given a system prompt "You are an AI assistant." And a model tinyllama-2 @@ -71,7 +75,8 @@ Feature: llama.cpp server And all slots are idle Then all prompts are predicted - # FIXME: infinite loop on the CI, not locally, if n_prompt * n_predict > kv_size + # FIXME: #3969 infinite loop on the CI, not locally, if n_prompt * n_predict > kv_size + @bug Scenario: Multi users with total number of tokens to predict exceeds the KV Cache size Given a prompt: """ diff --git a/examples/server/tests/tests.sh b/examples/server/tests/tests.sh index 230ee45ad..52908b839 100755 --- a/examples/server/tests/tests.sh +++ b/examples/server/tests/tests.sh @@ -32,4 +32,4 @@ set -eu "$@" & # Start tests -behave --summary --stop +behave --summary --stop --tags llama.cpp