Server: Add prompt processing progress endpoint? #6586

Open
@stduhpf

Description

Feature Description

It would be nice to have an endpoint on the server example to fetch information about the progress of an ongoing prompt processing. It could return something like this:

{
    "processing": [true|false],
    "prompt_length": [number of uncached tokens of the last prompt],
    "remaining": [number of tokens yet to be processed]
}
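A client could derive a completion percentage from these fields while polling. A minimal sketch in Python, assuming the endpoint path (`/progress` here) and the response shape above, neither of which exists in the server yet:

```python
import json
import urllib.request


def progress_percent(status: dict) -> float:
    """Progress of prompt processing as a percentage (0-100).

    `status` is the proposed JSON body; `prompt_length` and `remaining`
    count uncached prompt tokens.
    """
    if not status.get("processing"):
        return 100.0
    total = status["prompt_length"]
    if total == 0:
        return 100.0
    done = total - status["remaining"]
    return 100.0 * done / total


def poll_progress(base_url: str) -> float:
    """Fetch the hypothetical /progress endpoint and compute progress."""
    with urllib.request.urlopen(f"{base_url}/progress") as resp:
        status = json.load(resp)
    return progress_percent(status)


# Example with a sample response body:
sample = {"processing": True, "prompt_length": 2048, "remaining": 512}
print(f"{progress_percent(sample):.1f}%")  # 75.0%
```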

Motivation

For longer prompts, or when processing speed is very slow, it would be nice to get an idea of how far along prompt processing is. This could also be useful for other projects, not just the server.

Possible Implementation

I haven't yet looked too deeply into the current server implementation, so I can't really tell how this would work, but I imagine it would require some deeper changes in the backend too.
I added a similar feature to a very old project based on an ancient version of llama.cpp about a year ago: stduhpf/fastLLaMa@1ebd5ba. That code is now very much outdated, but the feature was nice to have.
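At its core, the handler would mainly need some shared bookkeeping that is updated as batches of uncached prompt tokens are decoded. A rough sketch of that state in Python (class and method names are illustrative, not llama.cpp internals):

```python
class PromptProgress:
    """Tracks progress over the current prompt's uncached tokens."""

    def __init__(self) -> None:
        self.processing = False
        self.prompt_length = 0
        self.remaining = 0

    def start(self, uncached_tokens: int) -> None:
        """Called when prompt processing begins."""
        self.processing = True
        self.prompt_length = uncached_tokens
        self.remaining = uncached_tokens

    def advance(self, batch_size: int) -> None:
        """Called after each decoded batch of prompt tokens."""
        self.remaining = max(0, self.remaining - batch_size)
        if self.remaining == 0:
            self.processing = False

    def to_json(self) -> dict:
        """Response body for the proposed endpoint."""
        return {
            "processing": self.processing,
            "prompt_length": self.prompt_length,
            "remaining": self.remaining,
        }


# Simulated flow: 1000 uncached tokens, decoded in batches of 512.
p = PromptProgress()
p.start(1000)
p.advance(512)
print(p.to_json())  # {'processing': True, 'prompt_length': 1000, 'remaining': 488}
p.advance(512)
print(p.to_json())  # {'processing': False, 'prompt_length': 1000, 'remaining': 0}
```

In the actual C++ server this would presumably be an atomic or mutex-guarded counter per slot, updated in the decode loop and read by the HTTP handler.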
