Labels
enhancement (New feature or request), roadmap (Part of a roadmap project)
Description
Name and Version
.\llama-server --version
...
version: 5902 (4a4f426)
built with clang version 19.1.5 for x86_64-pc-windows-msvc
Operating systems
Windows
Which llama.cpp modules do you know to be affected?
llama-server
Command line
.\llama-server -m Llama-3.2-3B-Instruct-Q6_K_L.gguf -ngl 33 --port 8081 --host 0.0.0.0
Problem description & steps to reproduce
When the OpenAI-compatible API is used and the client calls v1/responses, I get a 404 error.
Possibly not yet supported?
ref:
https://platform.openai.com/docs/api-reference/responses
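Until the endpoint exists server-side, a client can work around the 404 by translating a Responses-style request into the /v1/chat/completions shape that llama-server does serve. The sketch below is illustrative only and not part of llama.cpp; it assumes the simple form of the Responses API where "input" is either a string or a list of role/content messages, and it maps "max_output_tokens" to the chat API's "max_tokens".

```python
# Hypothetical client-side shim (not part of llama.cpp): map a minimal
# OpenAI Responses API payload onto a /v1/chat/completions payload.
def responses_to_chat_completions(payload: dict) -> dict:
    """Translate a simple Responses-style request into a
    chat/completions-style request body."""
    messages = []

    # The Responses API's "instructions" field plays the role of a
    # system message in the chat API.
    instructions = payload.get("instructions")
    if instructions:
        messages.append({"role": "system", "content": instructions})

    inp = payload["input"]
    if isinstance(inp, str):
        # The Responses API allows a bare string as input.
        messages.append({"role": "user", "content": inp})
    else:
        # Otherwise assume a list of {"role": ..., "content": ...} items.
        messages.extend(inp)

    chat_payload = {"model": payload.get("model", ""), "messages": messages}

    # Pass through common sampling options; the chat API names the
    # output-length cap "max_tokens" rather than "max_output_tokens".
    for key in ("temperature", "top_p", "max_output_tokens"):
        if key in payload:
            target = "max_tokens" if key == "max_output_tokens" else key
            chat_payload[target] = payload[key]
    return chat_payload


if __name__ == "__main__":
    req = {
        "model": "Llama-3.2-3B-Instruct",
        "instructions": "You are a helpful assistant.",
        "input": "Say hello.",
        "max_output_tokens": 32,
    }
    print(responses_to_chat_completions(req))
```

The translated body can then be POSTed to http://host:8081/v1/chat/completions instead of /v1/responses. Streaming and tool-use fields are deliberately out of scope here.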
First Bad Commit
Not sure
Relevant log output
Client
`POST "http://192.168.x.x:8081/v1/responses": 404 Not Found {"code":404,"message":"File Not Found","type":"not_found_error"}`
Server
main: server is listening on http://0.0.0.0:8081 - starting the main loop
srv update_slots: all slots are idle
srv log_server_r: request: POST /v1/responses 192.168.x.x 404