From c5a0d57ee5013a194ea6c4ebced14895cac8bb11 Mon Sep 17 00:00:00 2001
From: ochafik
Date: Sun, 29 Sep 2024 19:37:23 +0100
Subject: [PATCH] Update cancel.feature

---
 examples/server/tests/features/cancel.feature | 2 --
 1 file changed, 2 deletions(-)

diff --git a/examples/server/tests/features/cancel.feature b/examples/server/tests/features/cancel.feature
index 241507024..e7753b5dd 100644
--- a/examples/server/tests/features/cancel.feature
+++ b/examples/server/tests/features/cancel.feature
@@ -18,10 +18,8 @@ Feature: Cancellation of llama.cpp server requests
     And  64 server max tokens to predict
     And  prometheus compatible metrics exposed
     And  300 milliseconds delay in sampler for testing
-    And  no warmup
     Then the server is starting
     Then the server is healthy
-    # Then the server is healthy with timeout 10 seconds
 
   Scenario Outline: Cancelling an OAI chat completion request frees up slot (streaming )