server : Add tokenize with pieces tests to server.feature

Mathijs Henquet 2024-08-22 00:04:21 +02:00
parent a2d4d1913c
commit 198daa4e34
2 changed files with 10 additions and 2 deletions


@@ -104,7 +104,15 @@ Feature: llama.cpp server
     Then tokens begin with BOS
     Given first token is removed
     Then tokens can be detokenized
+
+  Scenario: Tokenize with pieces
+    When tokenizing with pieces:
+      """
+        What is the capital of Germany?
+      """
+    Then tokens are given with pieces
+
   Scenario: Models available
     Given available models
     Then 1 models are supported

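For context, a minimal sketch (not part of this commit) of the request the "tokenizing with pieces" step presumably sends; the endpoint path, port, and field names (content, with_pieces) are assumptions about the server API rather than code taken from this diff.

    # Hypothetical request against a locally running llama.cpp server.
    import requests

    resp = requests.post(
        "http://localhost:8080/tokenize",
        json={"content": "What is the capital of Germany?", "with_pieces": True},
    )
    resp.raise_for_status()
    # Assumed response shape: {"tokens": [{"id": <int>, "piece": <str>}, ...]}
    for token in resp.json()["tokens"]:
        print(token["id"], repr(token["piece"]))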

@@ -702,7 +702,7 @@ async def step_tokenize_with_pieces(context):
     context.tokens_with_pieces = tokenize_json["tokens"]
 
-@step("tokens with pieces are complete")
+@step("tokens are given with pieces")
 @async_run_until_complete
 async def step_tokenize_with_pieces(context):
     # Verify that the response contains both token IDs and pieces