azure-ai-inference : LangChain AzureAIChatCompletionsModel Does Not Support OpenAI-Style JSON Schema Format #44201
Open
Labels
- AI Model Inference: Issues related to the client library for Azure AI Model Inference (`\sdk\ai\azure-ai-inference`)
- Service Attention: This issue is the responsibility of the Azure service team.
- customer-reported: Issues reported by GitHub users external to the Azure organization.
- needs-team-attention: This issue needs attention from the Azure service team or SDK team.
- question: The issue doesn't require a change to the product in order to be resolved.
Description
Issue description
When using `langchain_azure_ai.AzureAIChatCompletionsModel` with structured output (e.g., `create_agent(..., response_format=ProviderStrategy(...))`), the integration fails with:

```
ValueError: Unsupported response_format {'type': 'json_schema', 'json_schema': {...}}
```

This occurs in `azure/ai/inference/_patch.py`.
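For reference, this is the shape of the OpenAI-style payload that triggers the error. The field names below mirror the error message and the `ContactInfo` model from the repro; the exact dict LangChain builds internally is an assumption:

```python
# OpenAI-style structured-output payload (json_schema response format).
# azure-ai-inference's _patch.py validation does not recognise this dict
# form, so passing it through raises the ValueError quoted above.
contact_schema = {
    "type": "object",
    "properties": {
        "name": {"type": "string", "description": "The name of the person"},
        "email": {"type": "string", "description": "The email of the person"},
        "phone": {"type": "string", "description": "The phone number of the person"},
    },
    "required": ["name", "email", "phone"],
}

response_format = {
    "type": "json_schema",
    "json_schema": {"name": "ContactInfo", "schema": contact_schema, "strict": True},
}
```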
Steps to Reproduce
```python
import os

# Import paths for create_agent/ProviderStrategy assume LangChain 1.x.
from langchain.agents import create_agent
from langchain.agents.structured_output import ProviderStrategy
from langchain_azure_ai.chat_models import AzureAIChatCompletionsModel
from langchain_core.messages import HumanMessage
from pydantic import BaseModel, Field

llm = AzureAIChatCompletionsModel(
    endpoint=os.getenv("GITHUB_INFERENCE_ENDPOINT"),
    credential=os.getenv("GITHUB_TOKEN"),
    model="gpt-4o",
)

class ContactInfo(BaseModel):
    name: str = Field(description="The name of the person")
    email: str = Field(description="The email of the person")
    phone: str = Field(description="The phone number of the person")

agent = create_agent(
    model=llm,
    tools=[search_tool],  # search_tool defined elsewhere
    response_format=ProviderStrategy(ContactInfo),
)

agent.invoke({"messages": [HumanMessage(content="Extract contact info from: John Doe, john@example.com, (555) 123-4567")]})
```
Alternatives considered
Use `langchain-openai` instead, which treats Azure-compatible endpoints as OpenAI endpoints.