Update issues.feature
parent cbbd1efa06
commit a402d3cf74
1 changed file with 13 additions and 1 deletion
@@ -1,4 +1,16 @@
 # List of ongoing issues
 @bug
-Feature: Issues
+Feature: Issues llama.cpp server
 # No confirmed issue at the moment
+  Background: Server startup
+    Given a server listening on localhost:8080
+    And a model with n_embed=4096
+    And n_ctx=32768
+    And 8 slots
+    And embeddings extraction
+    Then the server is starting
+    Then the server is healthy
+
+  Scenario: Embedding
+    When 8 identical inputs (1000 tokens) are computed simultaneously.
+    Then embeddings are generated, but they are different
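The scenario's failing condition is that identical inputs produce non-identical embedding vectors. The comparison a step definition would need can be sketched as a small standalone helper (a sketch only; `all_close` and its tolerance are hypothetical, not part of the llama.cpp test suite):

```python
import math

def all_close(embeddings, tol=1e-6):
    """Return True if every embedding vector matches the first one
    element-wise within the given absolute tolerance."""
    first = embeddings[0]
    return all(
        len(e) == len(first)
        and all(math.isclose(a, b, abs_tol=tol) for a, b in zip(first, e))
        for e in embeddings[1:]
    )

# 8 identical inputs should yield 8 identical embeddings.
same = [[0.1, 0.2, 0.3]] * 8
assert all_close(same)

# The bug this scenario tracks: identical inputs, diverging vectors.
diverged = [[0.1, 0.2, 0.3]] * 7 + [[0.1, 0.2, 0.31]]
assert not all_close(diverged)
```

A `Then embeddings are generated, but they are different` step would pass exactly when `all_close` returns `False` on the vectors collected from the 8 concurrent requests.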