
Welcome back!
In this concise second article, we explore the efficient world of parallel function execution, as highlighted at OpenAI DevDay. Let’s quickly dive into what makes it exciting!
Feature overview
Function calling has been updated: you can now call multiple functions from a single message. (To be precise, the response instructs the client to call multiple functions.)
It is available with gpt-4-1106-preview and gpt-3.5-turbo-1106.
The accuracy of function calling itself also appears to have improved, making it more likely to return the correct function parameters.
The official guide provides an example of parallel function calling: retrieving the weather in three places.
Let’s try this out.
Setup:
- The Python `openai` package installed
- An `OPENAI_API_KEY` obtained.
- For how to obtain it, refer to: https://kbilel.com/2023/11/19/for-beginner-i-tried-the-quickstart-tutorial-of-openai-api/
Tool definition
First, define the function you want to call as a tool.
import json

def get_current_weather(location, unit="celsius"):
    """Get the current weather in a given location"""
    if "paris" in location.lower():
        return json.dumps({"location": location, "temperature": "10", "unit": "celsius"})
    elif "new york" in location.lower():
        return json.dumps({"location": location, "temperature": "72", "unit": "fahrenheit"})
    else:
        return json.dumps({"location": location, "temperature": "22", "unit": "celsius"})
Next, define `tools` that include this function. (The data type is List[ChatCompletionToolParam].)
from openai.types.chat import ChatCompletionToolParam

tools = [
    ChatCompletionToolParam({
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather for the specified location",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "City name (e.g. Paris, New York)",
                    },
                    "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
                },
                "required": ["location"],
            },
        },
    })
]
Note: The structure differs when using `functions` instead of `tools`: with `tools`, there is an additional `"type": "function"` key and one more level of nesting.
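To make that difference concrete, here is a small sketch, using plain dicts and a hypothetical helper of my own naming, that wraps a legacy `functions` entry into the newer `tools` shape:

```python
def functions_to_tools(functions):
    """Wrap a legacy `functions` list into the newer `tools` structure:
    add the "type" key and one extra level of nesting."""
    return [{"type": "function", "function": f} for f in functions]

# A legacy-style function definition (schema trimmed for brevity)
legacy = [{
    "name": "get_current_weather",
    "description": "Get the current weather for the specified location",
    "parameters": {"type": "object", "properties": {}, "required": []},
}]

tools_shaped = functions_to_tools(legacy)
```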
Chat with Specified Tools:
Call the chat API with `tools` as an argument.
from openai import OpenAI

client = OpenAI(
    api_key="Enter your OpenAI API key here"
)
response = client.chat.completions.create(
    model="gpt-3.5-turbo-1106",
    temperature=0.0,
    messages=[
        {"role": "user", "content": "What's the weather like in New York, Paris, and Tunis?"},
    ],
    tools=tools,
    tool_choice="auto",  # auto is the default, but we'll be explicit
)
# The model does not answer in text yet; instead, the response asks us to call the tools.
New arguments, `tools` and `tool_choice`, are used here (previously they were `functions` and `function_call`).
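As a side note, `tool_choice` accepts more than `"auto"`: `"none"` suppresses tool calls entirely, and an object form forces one specific function. A minimal sketch of the three values, as I understand the current API:

```python
# The three forms tool_choice can take:
choice_auto = "auto"    # the model decides whether (and which) tools to call
choice_none = "none"    # the model must answer in plain text, with no tool calls
choice_forced = {       # the model must call exactly this function
    "type": "function",
    "function": {"name": "get_current_weather"},
}
```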
Check the Result:
Let’s see the result.
print(response.model_dump_json(indent=2))
# {
#   "id": "chatcmpl-GaObVx2IeVjIDHZEqfL58IDRfTkpu",
#   "choices": [
#     {
#       "finish_reason": "tool_calls",
#       "index": 0,
#       "message": {
#         "content": null,
#         "role": "assistant",
#         "function_call": null,
#         "tool_calls": [
#           {
#             "id": "call_uV9txPyhBN0txkkmAZrJnqkf",
#             "function": {
#               "arguments": "{\"location\": \"New York\"}",
#               "name": "get_current_weather"
#             },
#             "type": "function"
#           },
#           {
#             "id": "call_58NEZoZydCJaoRrEg9RjYDmu",
#             "function": {
#               "arguments": "{\"location\": \"Paris\"}",
#               "name": "get_current_weather"
#             },
#             "type": "function"
#           },
#           {
#             "id": "call_ffRtm29S95FtREGI3W8ZPFvW",
#             "function": {
#               "arguments": "{\"location\": \"Tunis\"}",
#               "name": "get_current_weather"
#             },
#             "type": "function"
#           }
#         ]
#       }
#     }
#   ],
#   "created": 1701549510,
#   "model": "gpt-3.5-turbo-1106",
#   "object": "chat.completion",
#   "system_fingerprint": "fp_ffee171130a",
#   "usage": {
#     "completion_tokens": 63,
#     "prompt_tokens": 114,
#     "total_tokens": 177
#   }
# }
The "finish_reason" is "tool_calls", and the message includes tool_calls.
From tool_calls, you can see the list of tools that should be called.
This allows the developer to call multiple tools (functions, in this case) from a single response.
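The extraction of these planned calls can be sketched offline against a trimmed-down copy of the response above as a plain dict (the `call_1`/`call_2` ids are shortened placeholders of my own; no API call is needed):

```python
import json

# A trimmed-down copy of the response shown above, as a plain dict.
sample_response = {
    "choices": [{
        "finish_reason": "tool_calls",
        "message": {
            "tool_calls": [
                {"id": "call_1", "type": "function",
                 "function": {"name": "get_current_weather",
                              "arguments": "{\"location\": \"New York\"}"}},
                {"id": "call_2", "type": "function",
                 "function": {"name": "get_current_weather",
                              "arguments": "{\"location\": \"Paris\"}"}},
            ],
        },
    }],
}

# Collect (name, parsed-arguments) pairs for every requested call.
planned_calls = [
    (c["function"]["name"], json.loads(c["function"]["arguments"]))
    for c in sample_response["choices"][0]["message"]["tool_calls"]
]
# planned_calls -> [("get_current_weather", {"location": "New York"}), ("get_current_weather", {"location": "Paris"})]
```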
Developer-Side Tool Calling:
You can continue the chat by storing and passing each tool’s result in the following format:
{
    "tool_call_id": tool_call.id,
    "role": "tool",
    "name": function_name,
    "content": function_response,
}
First, we get the results of the function calls.
import json

from openai.types.chat import ChatCompletionToolMessageParam, ChatCompletionUserMessageParam

tool_response_messages = []
if response.choices[0].message.tool_calls is not None:
    tool_calls = response.choices[0].message.tool_calls
    for tool_call in tool_calls:
        if tool_call.type == "function":
            function_call = tool_call.function
            function_name = function_call.name
            # Map tool names to the local callables the model is allowed to invoke
            available_functions = {"get_current_weather": get_current_weather}
            if function_name in available_functions:
                # Parse the JSON arguments instead of eval-ing model output
                function_arguments = json.loads(function_call.arguments)
                function_response = available_functions[function_name](**function_arguments)
            else:
                raise ValueError(f"Unknown tool requested: {function_name}")
            tool_response_messages.append(
                ChatCompletionToolMessageParam({
                    "tool_call_id": tool_call.id,
                    "role": "tool",
                    "content": function_response,
                })
            )  # extend the conversation with the function responses
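The lookup-and-call step can also be factored into a small, independently testable helper. This is only a sketch: the `run_tool` name and the dispatch table are my own, and the stub weather function is simplified from the one defined earlier.

```python
import json

def get_current_weather(location, unit="celsius"):
    """Stub weather function, simplified from the one defined earlier."""
    return json.dumps({"location": location, "temperature": "10", "unit": unit})

# Explicit mapping from tool name to local callable.
DISPATCH = {"get_current_weather": get_current_weather}

def run_tool(name, arguments_json):
    """Parse the model's JSON arguments and call the matching local function."""
    if name not in DISPATCH:
        raise ValueError(f"Unknown tool: {name}")
    kwargs = json.loads(arguments_json)  # never eval() model output
    return DISPATCH[name](**kwargs)

result = run_tool("get_current_weather", '{"location": "Paris"}')
```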
Combine these tool results with the initial request and response into a single list.
history_messages = [
    ChatCompletionUserMessageParam({"role": "user", "content": "What's the weather like in New York, Paris, and Tunis?"}),
    response.choices[0].message,
    *tool_response_messages,
]
By passing this to the next request, you can get the continuation.
Getting the Continuation of the Chat:
Execute the following request to get the continuation.
response = client.chat.completions.create(
    model="gpt-3.5-turbo-1106",
    temperature=0.0,
    messages=[
        *history_messages,
    ],
)
print(response.choices[0].message.content)
# The weather in New York is 22 degrees Celsius and sunny, the weather in Paris is 10 degrees Celsius and cloudy, and the weather in Tunis is 22 degrees Celsius and sunny.
It worked correctly.
Conclusion:
How was it?
Usage is almost the same as traditional function calling, but be careful: the structure and the argument names have changed.
I hope this article was helpful. 😀
