[Feature] First-class Azure AI Foundry support in Anthropic provider #9893
ClemCreator started this conversation in Feature Requests
Problem
Right now, the Anthropic provider in Roo Code only supports Anthropic's first-party endpoint with a static list of Anthropic model ids.
Docs: https://docs.roocode.com/providers/anthropic
With Azure AI Foundry / Microsoft Foundry, Anthropic's Claude models (Sonnet 4.5, Opus 4.5, Haiku 4.5, etc.) can be deployed behind an Anthropic-compatible endpoint, for example:

- Base URL: `https://<resource-name>.services.ai.azure.com/anthropic`
- Messages endpoint: `https://<resource-name>.services.ai.azure.com/anthropic/v1/messages`
- Auth: `x-api-key` header or Microsoft Entra ID

(See the Microsoft docs for "Deploy and use Claude models in Microsoft Foundry".)
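To make the shape concrete, here is a minimal sketch of building such a request. The resource name, key, and deployment name are placeholders; the header set follows the Anthropic Messages API (with the commonly used `anthropic-version: 2023-06-01`), and nothing here is existing Roo Code code:

```typescript
// Sketch: constructing a request for the Foundry Anthropic passthrough.
// All concrete values (resource, key, deployment) are placeholders.
interface FoundryRequest {
  url: string;
  headers: Record<string, string>;
  body: string;
}

function buildFoundryMessagesRequest(
  resourceName: string,
  apiKey: string,
  deployment: string, // the user-defined Foundry deployment name
  userText: string,
): FoundryRequest {
  return {
    // Base URL + Anthropic Messages path; no models/chat/completions suffix.
    url: `https://${resourceName}.services.ai.azure.com/anthropic/v1/messages`,
    headers: {
      "x-api-key": apiKey, // key-based auth; Entra ID would use Authorization: Bearer <token>
      "anthropic-version": "2023-06-01", // standard Anthropic API version header
      "content-type": "application/json",
    },
    body: JSON.stringify({
      model: deployment, // must equal the deployment name, not a dated Anthropic id
      max_tokens: 1024,
      messages: [{ role: "user", content: userText }],
    }),
  };
}

// Usage (the request would actually be sent with fetch):
// const req = buildFoundryMessagesRequest("myresource", "KEY", "claude-sonnet-4-5", "Hello");
// await fetch(req.url, { method: "POST", headers: req.headers, body: req.body });
```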
When I configure Roo Code like this:

- Provider: `Anthropic`
- Base URL: `https://<resource-name>.services.ai.azure.com/anthropic`

…I can get something working with a Claude 4.5 Sonnet deployment.
However, I cannot get a Claude 4.5 Opus deployment to work because of the model name: Roo Code sends a hardcoded Anthropic model id, but the `model` field must match my deployment name, e.g. `claude-opus-4-5`. So I'm blocked from using Claude 4.5 Opus hosted on Azure AI Foundry via the Anthropic provider, even though the endpoint is Anthropic-compatible.
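Deployment names are user-defined, so there is no guaranteed mapping from Anthropic's dated ids; at most, a hypothetical helper could prefill a default by stripping the date suffix (illustrative only, not Roo Code's actual behavior):

```typescript
// Hypothetical helper: derive a *default* Foundry deployment name from a
// dated Anthropic model id. Real deployment names are user-defined, so this
// is only a guess to prefill the model field, not a guaranteed mapping.
function defaultDeploymentName(anthropicModelId: string): string {
  // Strip a trailing "-YYYYMMDD" date suffix if present.
  return anthropicModelId.replace(/-\d{8}$/, "");
}

// e.g. defaultDeploymentName("claude-opus-4-5-20251101") === "claude-opus-4-5"
```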
There is a related bug when trying to hit Azure Anthropic via the OpenAI Compatible provider: Roo Code detects the endpoint as Azure AI Inference and appends `models/chat/completions`, and also uses `Authorization: Bearer` instead of `x-api-key`, which yields 401 errors (see issue #9467).
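One possible fix is a detection step that recognizes these endpoints before the Azure AI Inference heuristic runs. The helper below is hypothetical (it does not exist in Roo Code); it is a sketch of what "this URL is an Anthropic-compatible Azure endpoint" could look like:

```typescript
// Hypothetical detection helper: treat URLs on an Azure AI services host whose
// path ends in /anthropic as Anthropic-compatible, so the provider would keep
// the /v1/messages path and x-api-key auth instead of applying the Azure AI
// Inference heuristic (models/chat/completions + Authorization: Bearer).
function isAzureAnthropicEndpoint(baseUrl: string): boolean {
  try {
    const u = new URL(baseUrl);
    return (
      u.hostname.endsWith(".services.ai.azure.com") &&
      u.pathname.replace(/\/+$/, "").endsWith("/anthropic")
    );
  } catch {
    return false; // not a parseable URL
  }
}

// Request construction could then branch:
// if (isAzureAnthropicEndpoint(baseUrl))  -> POST `${baseUrl}/v1/messages` with x-api-key
// else if (/* existing Azure AI Inference check */) -> current chat/completions path
```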
Feature proposal
In the Anthropic provider settings, add dedicated support for Azure AI Foundry / Microsoft Foundry, something like an "Azure AI Foundry" checkbox. When this checkbox is enabled:
Connection fields

- Base URL: `https://<resource-name>.services.ai.azure.com/anthropic`
- API key (sent as `x-api-key`), and/or an Entra ID token (sent as `Authorization: Bearer <token>`)

Model list sourced from Foundry
Instead of the static Anthropic model id list, Roo Code would query the Foundry resource for available Claude deployments and show those deployment names in the Model dropdown. Conceptually, this is the list of deployments the user created for:

- `claude-sonnet-4-5`
- `claude-opus-4-5`
- `claude-haiku-4-5`
- `claude-opus-4-1`

The selected item would then be used as the `model` field in the Anthropic Messages API request.

Request wiring
- POST to `/anthropic/v1/messages` (no `models/chat/completions` suffix).
- Use `x-api-key` for key-based auth when targeting Azure Anthropic.
- Send the standard Anthropic headers (`anthropic-version`, etc.) as in the Microsoft Learn examples.
- Do not apply the `_isAzureAiInference` OpenAI-style heuristic to these endpoints (see [BUG] Unable to use Anthropic via Foundry #9467).

Model name behavior
The `model` field should be exactly the Foundry deployment name (user-defined), not a hardcoded Anthropic model id with a date suffix.

Why this matters
Azure AI Foundry / Microsoft Foundry is now one of the main ways to access Anthropic’s Claude models (including Claude Opus 4.5 and Sonnet 4.5) in enterprise environments, with quotas, governance, etc. managed in Azure.
Many enterprise users will have Claude on Foundry as their only allowed path to Anthropic models.
Having first-class Azure AI Foundry support in the Anthropic provider lets those users keep using the `Anthropic` provider directly. Without it, we have to rely on workarounds (the OpenAI Compatible provider with special-cased logic, or trying to guess model ids), which break easily when model names or endpoints differ.
Willing to test
I’m happy to help test this with real Azure AI Foundry deployments (Claude Sonnet 4.5 and Claude Opus 4.5) and share logs / configs if needed.