Description

- Package Name: azure-ai-projects 2.0.0b2, azure-core 1.36.0
- Operating System: Windows 11
- Python Version: tried both 3.10 and 3.12.10
Describe the bug

Hi!

I'm trying to migrate to the new Foundry agents API (the docs are not clear on this), using the example given in this repo as a starting point. responses.create(...) works fine, but when I try to stream responses I can't get it to work.
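For reference, the non-streaming call that works looks roughly like this (same conversation and agent reference as in the streaming code below; agent_name comes from my own config):

```python
# Non-streaming variant -- this one succeeds.
response = openai_client.responses.create(
    conversation=conversation.id,
    extra_body={"agent": {"name": agent_name, "type": "agent_reference"}},
    input="hi!",
)
print(response.output_text)
```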
This is my streaming code:
```python
openai_client = project_client.get_openai_client()  # project_client is the AIProjectClient created earlier
agent_model = get_agent_model(agent_name)  # returns the model configured for the agent in Foundry
print(agent_model)

# Optional step: create a conversation to use with the agent
conversation = openai_client.conversations.create()
print(f"Created conversation (id: {conversation.id})")

# Chat with the agent and stream the answer
with openai_client.responses.stream(
    conversation=conversation.id,
    # model=agent_model,
    extra_body={"agent": {"name": agent_name, "type": "agent_reference"}},
    input="hi!",
) as stream:
    for event in stream:
        if event.type == "response.output_text.delta":
            print(event.delta, end="", flush=True)
```
Versions:

- azure-ai-agents 1.1.0
- azure-ai-projects 2.0.0b2
- azure-core 1.35.1
Console error:

```text
Created conversation (id: conv_23ee7a807b7307ce00nGBsefUEqRE448CvhIyABlMEqNJOPqoQ)
Traceback (most recent call last):
  File "c:\Users\Alejandro Piccardo\Desktop\cambios quantis\agente\BackendFastapi-Agent\dev\scripts\NewApi.py", line 36, in <module>
    with openai_client.responses.stream(
  File "C:\Users\Alejandro Piccardo\Desktop\cambios quantis\agente\BackendFastapi-Agent\env\lib\site-packages\openai\resources\responses\responses.py", line 1005, in stream
    raise ValueError("model must be provided when creating a new response")
ValueError: model must be provided when creating a new response
```
So the first thing I tried was uncommenting the model line:
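(Same call as above, just with the model argument passed explicitly; agent_model is the value printed by my helper.)

```python
with openai_client.responses.stream(
    conversation=conversation.id,
    model=agent_model,
    extra_body={"agent": {"name": agent_name, "type": "agent_reference"}},
    input="hi!",
) as stream:
    ...
```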
Console error:
```text
Created conversation (id: conv_24f10f51dc2a61b900s2HxndrRWcRPtFYEZOmp3iyODsSbiSgE)
Traceback (most recent call last):
  File "c:\Users\Alejandro Piccardo\Desktop\cambios quantis\agente\BackendFastapi-Agent\dev\scripts\NewApi.py", line 36, in <module>
    with openai_client.responses.stream(
  File "C:\Users\Alejandro Piccardo\Desktop\cambios quantis\agente\BackendFastapi-Agent\env\lib\site-packages\openai\lib\streaming\responses\_responses.py", line 111, in __enter__
    raw_stream = self.__api_request()
  File "C:\Users\Alejandro Piccardo\Desktop\cambios quantis\agente\BackendFastapi-Agent\env\lib\site-packages\openai\resources\responses\responses.py", line 828, in create
    return self._post(
  File "C:\Users\Alejandro Piccardo\Desktop\cambios quantis\agente\BackendFastapi-Agent\env\lib\site-packages\openai\_base_client.py", line 1259, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "C:\Users\Alejandro Piccardo\Desktop\cambios quantis\agente\BackendFastapi-Agent\env\lib\site-packages\openai\_base_client.py", line 1047, in request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'error': {'code': 'invalid_payload', 'message': 'Not allowed when agent is specified.', 'param': 'model', 'type': 'invalid_request_error', 'details': [], 'additionalInfo': {'request_id': '7153967e656a11df001f063de6275551'}}}
```
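Taken together, the two errors suggest the service rejects model whenever an agent reference is present, while the responses.stream() helper insists client-side that model is set, so the two checks conflict. As an untested guess on my side, responses.create(..., stream=True) might sidestep the helper's client-side check, since create() does not require model:

```python
# Untested sketch: stream via create(stream=True) instead of the stream() helper,
# which may avoid the client-side "model must be provided" ValueError.
stream = openai_client.responses.create(
    conversation=conversation.id,
    extra_body={"agent": {"name": agent_name, "type": "agent_reference"}},
    input="hi!",
    stream=True,
)
for event in stream:
    if event.type == "response.output_text.delta":
        print(event.delta, end="", flush=True)
```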
What am I doing wrong? Or is streaming not implemented correctly for agents? Could you provide a working example of responses.stream(...) with an agent reference?

Thanks!