o3-mini Model Error

Rizki Susanto 0 Reputation points
2025-03-10T11:05:42.0133333+00:00

I'm trying to test a request with the o3-mini model but am encountering the following error:

"error": 
{ 
"code": "InternalServerError", "error": 
{ 
"code": "InternalServerError", 
"message": "Gateway cannot authenticate upstream services. Please contact Microsoft for help." 
},

Other models like GPT-4o and DeepSeek R1 are working fine. The same issue occurs in the Azure AI Foundry playground:

InternalServerError: Gateway cannot authenticate upstream services. Please contact Microsoft for help. | Apim-request-id: 0eb8749f-35fd-4d98-b3ee-6693999e25bf
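
For reference, the request is an ordinary chat completions call against the o3-mini deployment, roughly like the sketch below (the endpoint, API key, deployment name, and API version are placeholders, not the actual values used):

import requests

# Placeholder values; replace with your own resource endpoint, key, and deployment name.
endpoint = "https://<your-resource>.openai.azure.com"
deployment = "o3-mini"
api_version = "2024-12-01-preview"  # assumed; any API version that supports o3-mini

url = f"{endpoint}/openai/deployments/{deployment}/chat/completions?api-version={api_version}"
headers = {"api-key": "<your-api-key>", "Content-Type": "application/json"}
payload = {"messages": [{"role": "user", "content": "Hello"}]}

response = requests.post(url, headers=headers, json=payload)
print(response.status_code)
print(response.json())

With the o3-mini deployment this returns the InternalServerError body shown above, while the same call against the GPT-4o and DeepSeek R1 deployments succeeds.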
1 answer

  1. JAYA SHANKAR G S 1,450 Reputation points Microsoft External Staff
    2025-03-13T05:02:48.7333333+00:00

    Hi @Rizki Susanto ,

    That's good to hear; you are now able to get results from o3-mini.

    The reason a new resource is created during deployment is that the model is not available in the region of your current AI services resource.

    If you look at this document, for Global Standard deployment o3-mini and o1 are not available in eastus, but they are available in eastus2.

    So, whenever you try to deploy the model and there is no AI services resource in a supported region, a new one is created for you.
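
    Once the deployment lands in a supported region (for example eastus2), you can verify it by pointing the client at that new resource. Here is a minimal sketch with the openai Python package; the endpoint, key, deployment name, and API version are placeholders you will need to adjust:

    from openai import AzureOpenAI

    # Placeholder values; use the endpoint and key of the resource created in the supported region.
    client = AzureOpenAI(
        azure_endpoint="https://<your-eastus2-resource>.openai.azure.com/",
        api_key="<your-api-key>",
        api_version="2024-12-01-preview",  # assumed; use a version that supports o3-mini
    )

    response = client.chat.completions.create(
        model="o3-mini",  # your deployment name (may differ from the base model name)
        messages=[{"role": "user", "content": "Hello from o3-mini"}],
        max_completion_tokens=100,  # o-series models use max_completion_tokens rather than max_tokens
    )
    print(response.choices[0].message.content)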

    If the above solution helps, please accept it and give feedback by clicking Yes.

    Thank you

