
[SUPPORT] function call usage #1301

Open
FanZhang91 opened this issue Jul 22, 2024 · 1 comment
Labels
support Questions about how to do something

Comments

@FanZhang91

environment: Ubuntu 20.04 + transformers 4.42.4 + openai 1.30.5 + vllm 0.5.2

I use vLLM as the server and the openai package as the client, following code similar to the cookbook example at https://github.com/openai/openai-cookbook/blob/main/examples/How_to_call_functions_with_chat_models.ipynb

But the response contains no function call information. How can I fix this problem?

---------------------------------------------------- client code ----------------------------------------------------

```python
from openai import OpenAI
import json

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "format": {
                        "type": "string",
                        "enum": ["celsius", "fahrenheit"],
                        "description": "The temperature unit to use. Infer this from the users location.",
                    },
                },
                "required": ["location", "format"],
            },
        },
    },
    {
        "type": "function",
        "function": {
            "name": "get_n_day_weather_forecast",
            "description": "Get an N-day weather forecast",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "format": {
                        "type": "string",
                        "enum": ["celsius", "fahrenheit"],
                        "description": "The temperature unit to use. Infer this from the users location.",
                    },
                    "num_days": {
                        "type": "integer",
                        "description": "The number of days to forecast",
                    },
                },
                "required": ["location", "format", "num_days"],
            },
        },
    },
]

openai_api_key = "xxx"
openai_api_base = "http://localhost:8000/v1/"

client = OpenAI(
    api_key=openai_api_key,
    base_url=openai_api_base,
)

models = client.models.list()
model = models.data[0].id

messages = []
messages.append({"role": "system", "content": "Don't make assumptions about what values to plug into functions. Ask for clarification if a user request is ambiguous."})
messages.append({"role": "user", "content": "What's the weather like today"})
response = client.chat.completions.create(
    model=model,
    messages=messages,
    tools=tools,
)
assistant_message = response.choices[0].message
messages.append(assistant_message)
print("response: ", assistant_message)
```
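For reference, once the server does emit tool calls, the client-side dispatch loop can be sketched as below. This is a minimal sketch, not part of the original report: `get_current_weather` here is a hypothetical local stub, and tool calls are represented as plain dicts for simplicity (the real OpenAI SDK returns objects with attribute access, e.g. `call.function.name`).

```python
import json

# Hypothetical local implementation backing the declared tool schema.
def get_current_weather(location, format):
    # A real implementation would query a weather service.
    return {"location": location, "temperature": 22, "unit": format}

TOOL_REGISTRY = {"get_current_weather": get_current_weather}

def dispatch_tool_calls(tool_calls):
    """Execute each tool call and return tool-role messages to append
    to the conversation before the follow-up chat.completions.create call."""
    results = []
    for call in tool_calls:
        fn = TOOL_REGISTRY[call["function"]["name"]]
        # Arguments arrive as a JSON-encoded string.
        args = json.loads(call["function"]["arguments"])
        results.append({
            "role": "tool",
            "tool_call_id": call["id"],
            "content": json.dumps(fn(**args)),
        })
    return results
```

The returned messages are then appended to `messages` so the model can see the tool output on the next turn.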

---------------------------------------------- server script ----------------------------------------------

```shell
python entrypoints/openai/api_server.py --model="xxxx/Qwen2-1.5B-Instruct" --trust-remote-code --host "localhost" --port 8000 --dtype auto
```
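Worth noting: vLLM 0.5.2 did not generate `tool_calls` automatically for open models; later vLLM releases added an OpenAI-compatible tool-call parser that must be enabled explicitly at server start. A sketch of that newer invocation, assuming a vLLM version that supports these flags (`hermes` is the parser vLLM documents for Qwen2-family models):

```shell
# Assumes a newer vLLM release -- these flags do not exist in 0.5.2.
python -m vllm.entrypoints.openai.api_server \
    --model "xxxx/Qwen2-1.5B-Instruct" \
    --trust-remote-code \
    --host localhost --port 8000 --dtype auto \
    --enable-auto-tool-choice \
    --tool-call-parser hermes
```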

--------------------------------------------- print info ---------------------------------------------

```
response:  ChatCompletionMessage(content='Get out and check.', role='assistant', function_call=None, tool_calls=[])
```

@FanZhang91 FanZhang91 added the support Questions about how to do something label Jul 22, 2024
@FanZhang91 FanZhang91 changed the title [SUPPORT] funciton call usage [SUPPORT] function call usage Jul 22, 2024
@nelsonauner
Contributor

@FanZhang91 Please format your code with triple backticks.
What GPT model are you using? Your invocation includes --model="xxxx/Qwen2-1.5B-Instruct" which of course isn't an OpenAI model at all.

Please provide a simpler code sample that doesn't use an API server.
