Streaming for Parallel tool execution #922

Open
MayankShah1 opened this issue Aug 13, 2024 · 0 comments
MayankShah1 commented Aug 13, 2024

Is your feature request related to a problem? Please describe.
I am interested in streaming responses for two parallel tool calls.

Describe the solution you'd like
I am trying to develop a comparison tool that summarizes information across two entities. The data for the two entities of interest is retrieved in parallel, and each retrieval should be streamed; a final comparison step over the retrieved information should then be streamed as well. Streaming is needed to keep UI latency to a minimum.

Describe alternatives you've considered
I have tried leveraging the example code for parallel tools:

```python
from __future__ import annotations

from typing import Iterable, Literal

import instructor
from openai import AzureOpenAI
from pydantic import BaseModel


class Weather(BaseModel):
    location: str
    units: Literal["imperial", "metric"]


class GoogleSearch(BaseModel):
    query: str


client = AzureOpenAI(
    azure_endpoint=<API_ENDPOINT>,
    api_key=<API_KEY>,
    api_version=<API_VERSION>,
)

client = instructor.from_openai(
    client, mode=instructor.Mode.PARALLEL_TOOLS
)

function_calls = client.chat.completions.create(
    model=<DEPLOYMENT_NAME>,
    messages=[
        {"role": "system", "content": "You must always use tools"},
        {
            "role": "user",
            "content": "What is the weather in toronto and dallas and who won the super bowl?",
        },
    ],
    response_model=Iterable[Weather | GoogleSearch],
    stream=True,
)

for fc in function_calls:
    print(fc)
    #> location='Toronto' units='metric'
    #> location='Dallas' units='imperial'
    #> query='super bowl winner'
```

Additional context
Error: `AssertionError: stream=True is not supported when using PARALLEL_TOOLS mode`
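As a minimal sketch of the kind of behavior this request is after: since `stream=True` is rejected in `PARALLEL_TOOLS` mode, one conceivable workaround is to issue the two retrieval calls concurrently and merge their streamed chunks as they arrive. The generators below are hypothetical stand-ins for the real per-entity streaming completion calls; only the merging pattern itself is shown.

```python
import queue
import threading
from typing import Iterator


def stream_entity_summary(entity: str) -> Iterator[str]:
    # Hypothetical placeholder for a real streaming call, e.g. a per-entity
    # client.chat.completions.create(..., stream=True) in non-parallel mode.
    for i in range(3):
        yield f"{entity}-chunk-{i}"


def merge_streams(*streams: Iterator[str]) -> Iterator[str]:
    # Pump each stream into a shared queue from its own thread, then drain
    # the queue until every producer has signaled completion.
    q: "queue.Queue[str | None]" = queue.Queue()

    def pump(s: Iterator[str]) -> None:
        for chunk in s:
            q.put(chunk)
        q.put(None)  # sentinel: this producer is done

    threads = [threading.Thread(target=pump, args=(s,)) for s in streams]
    for t in threads:
        t.start()

    finished = 0
    while finished < len(streams):
        item = q.get()
        if item is None:
            finished += 1
        else:
            yield item  # chunks surface as soon as either stream produces one

    for t in threads:
        t.join()


chunks = list(merge_streams(stream_entity_summary("toronto"),
                            stream_entity_summary("dallas")))
print(sorted(chunks))
```

The arrival order of chunks is interleaved nondeterministically, which is exactly the low-latency property the UI needs: neither entity's stream blocks the other.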
