### Function Calling

`llama-cpp-python` supports structured function calling based on a JSON schema. Function calling is fully compatible with the OpenAI function calling API and can be used by connecting with the official OpenAI Python client.

You'll first need to download one of the available function calling models in GGUF format:

- [functionary-7b-v1](https://huggingface.co/abetlen/functionary-7b-v1-GGUF)

Then, when you run the server, you'll also need to specify the `functionary` chat_format:
```bash
python3 -m llama_cpp.server --model <model_path> --chat_format functionary
```
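The server exposes an OpenAI-compatible chat completions endpoint, so a function-calling request is just a standard chat completion with a `functions` list. Below is a minimal sketch of such a request body; the `get_weather` schema is a hypothetical example for illustration, not part of llama-cpp-python:

```python
import json

# Hypothetical request body for the server's OpenAI-compatible
# /v1/chat/completions endpoint. The "get_weather" function schema
# is an illustrative example, not something the library ships with.
request_body = {
    "model": "functionary",
    "messages": [
        {"role": "user", "content": "What is the weather in Paris today?"}
    ],
    "functions": [
        {
            "name": "get_weather",
            "description": "Get the current weather for a city",
            "parameters": {
                # Standard JSON Schema describing the function arguments
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        }
    ],
}

# Serialize the payload as it would be sent over HTTP
payload = json.dumps(request_body)
print(payload)
```

The same payload can be sent through the official OpenAI Python client by pointing its `base_url` at the local server (typically `http://localhost:8000/v1` when using the command above).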