Alibek Jakupov

Using OpenAI with Azure Managed Identities



Azure OpenAI Service offers REST API access to OpenAI's powerful language models such as GPT-4, GPT-4 Turbo with Vision, GPT-3.5-Turbo, and the Embeddings model series. Moreover, the GPT-4 and GPT-3.5-Turbo model series are now generally available. These models can easily be adapted to your specific task, such as content generation, summarization, image understanding, semantic search, and natural language to code translation. Users can access the service through REST APIs, the Python SDK, or the web-based interface in Azure OpenAI Studio.


The usual steps for using the Python SDK are as follows.  

  • Get the key and endpoint.

  • Set up persistent environment variables for your key and endpoint.

  • Use the following code to create and use the Azure OpenAI client.

import os

from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.getenv("AZURE_OPENAI_ENDPOINT"),
    api_key=os.getenv("AZURE_OPENAI_API_KEY"),
    api_version="2024-02-01",
)

  • Pick your usage scenario (chat, completions, assistants, etc.); a minimal chat call is sketched below.
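
For example, for the chat scenario, a minimal call might look like the following sketch (gpt-35-turbo is an assumed deployment name; replace it with the name of your own deployment):


response = client.chat.completions.create(
    model="gpt-35-turbo",  # the deployment name, assumed for this sketch
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Summarize what Azure managed identities are."},
    ],
)
print(response.choices[0].message.content)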


When you build a function app or a web app, you usually store these values in local.settings.json while testing your web service on localhost, or put your keys in the app configuration settings when you publish your web service to the cloud. In more advanced scenarios, you may store all your keys in Azure Key Vault and access them directly from the application.
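
For local testing, local.settings.json might look like the following sketch (the variable names match the snippet above; the endpoint and key values are placeholders):


{
  "IsEncrypted": false,
  "Values": {
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "AZURE_OPENAI_ENDPOINT": "https://<your-resource>.openai.azure.com/",
    "AZURE_OPENAI_API_KEY": "<your-key>"
  }
}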


However, no matter how well you follow the best practices, some risks remain. For example, someone may hardcode secrets and keys in the source code or configuration files, which makes them visible to anyone who can access the code repository, the deployment environment, or the application itself. Someone on your team may store secrets and keys in plain text or in other insecure locations. Finally, for technical reasons, you may not be rotating or revoking secrets and keys regularly, which leaves them valid for longer periods of time and increases the chances of exploitation or exposure.


All of these factors make it hard for developers to manage the secrets, credentials, certificates, and keys that are used to secure communication between services. Azure managed identities eliminate the need for developers to handle these credentials in the first place, and with them most of these risks.


In this tutorial, we will build an Azure Function that connects to the OpenAI service by using managed identities. Up we go.


 

Start by creating a simple Azure Function and deploying it to Azure (a CLI sketch follows below). Refer to this documentation for more details.
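
If you prefer the command line, one way to scaffold and publish the function with Azure Functions Core Tools is sketched below (the project, function, and app names are placeholders):


func init image-function --python
cd image-function
func new --name process_image --template "HTTP trigger"
func azure functionapp publish <your-function-app-name>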


Next, you have to set up an Azure OpenAI resource and deployment. This is a simple process; just follow the steps in this reference. For the deployment, it’s a good idea to use the same name as the model name. After they are created, note down the deployment name and your Azure OpenAI resource name. For this tutorial we are using GPT-4 with Vision, but any text model works too.
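
If you prefer scripting this step, a sketch with the Azure CLI could look like the following (the resource group, resource name, region, and the vision-preview model version are assumptions; check the reference above for the values available to you):


az cognitiveservices account create \
    --name my-openai-resource \
    --resource-group my-rg \
    --kind OpenAI \
    --sku S0 \
    --location swedencentral

az cognitiveservices account deployment create \
    --name my-openai-resource \
    --resource-group my-rg \
    --deployment-name gpt-4 \
    --model-name gpt-4 \
    --model-version vision-preview \
    --model-format OpenAI \
    --sku-name Standard \
    --sku-capacity 1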


Next, let’s set up a managed identity. As we mentioned before, developers can keep secrets safely in Azure Key Vault or in the app settings, but services still need a way to access Azure Key Vault. Managed identities provide an automatically managed identity in Microsoft Entra ID for applications to use when they connect to resources that support Microsoft Entra authentication. Applications can use managed identities to obtain Microsoft Entra tokens without having to handle any credentials.


Managed identities can be of two kinds:  

  • System-assigned. Some Azure resources let you enable a managed identity directly on the resource. When you enable a system-assigned managed identity, a service principal of a special type is created in Microsoft Entra ID for the identity. The service principal is tied to the lifecycle of that Azure resource: when the resource is deleted, Azure automatically deletes the service principal for you. By design, only that Azure resource can use this identity to request tokens from Microsoft Entra ID. You grant the managed identity permission to access one or more services. The name of the system-assigned service principal is always the same as the name of the Azure resource it is created for; for a deployment slot, the name of its system-assigned identity is <app-name>/slots/<slot-name>.

  • User-assigned. You can also create a managed identity as a standalone Azure resource and assign it to one or more Azure resources. When you enable a user-assigned managed identity, a service principal of a special type is created in Microsoft Entra ID for the identity. The service principal is managed separately from the resources that use it, and user-assigned identities can be shared by multiple resources. You grant the managed identity permission to access one or more services.


To create the necessary resources and manage the roles, you need "Owner" permissions at the right scope (your resource group or subscription).


In this tutorial, we will use a system-assigned managed identity. Enabling one is a one-click experience: you can do it when you create your function app or in the properties of an existing function app. Go to your function app and, in the left-hand menu, open Identity under the Settings section.
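
The same can also be done from the Azure CLI; a sketch (the app and resource group names are placeholders, and the command prints the identity's principalId, which we will need for the role assignment):


az functionapp identity assign \
    --name <your-function-app-name> \
    --resource-group my-rg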





With managed identities for Azure resources, your application can obtain access tokens to authenticate to resources that use Microsoft Entra authentication. Therefore, you need to give your function app's identity access to the target resource, which is Azure OpenAI in this case. We assign the Cognitive Services OpenAI User role to the managed identity at the scope of the resource; this role grants the data-plane access needed to call the inference API (the Reader role alone only covers control-plane reads).
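
If you prefer the CLI, the role assignment can be sketched as follows (the principal id is the one printed when the identity was enabled; the resource names are placeholders):


az role assignment create \
    --assignee <principal-id-of-the-function-app> \
    --role "Cognitive Services OpenAI User" \
    --scope $(az cognitiveservices account show \
        --name my-openai-resource \
        --resource-group my-rg \
        --query id -o tsv)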



Then comes the important step: you need to obtain a token to access the service. Install azure-identity in your virtual environment.


 pip install azure-identity

Import the library


from azure.identity import DefaultAzureCredential, get_bearer_token_provider

And include the following code in your Azure Function to generate a token. DefaultAzureCredential automatically uses the function app's managed identity when the code runs in Azure, and falls back to your local developer credentials (for example, your Azure CLI login) when you debug on your machine.


token_provider = get_bearer_token_provider(DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default")

Now install the openai Python package

 

 pip install openai

In the same manner, import the library


from openai import AzureOpenAI

Next, you need to initialize your OpenAI client


import os

# GPT4V_ENDPOINT is the resource endpoint (with a trailing slash), e.g. https://<your-resource>.openai.azure.com/
api_base = os.environ["GPT4V_ENDPOINT"]
deployment_name = os.environ["GPT4V_DEPLOYMENT"]
api_version = '2023-12-01-preview'

client = AzureOpenAI(
    azure_ad_token_provider=token_provider,
    api_version=api_version,
    base_url=f"{api_base}openai/deployments/{deployment_name}/extensions",
)

You might have noticed that we still rely on the endpoint and the deployment name, but there are no longer any secrets in the code, which is a big benefit.


And that's it. We have improved the security of our Azure OpenAI service. Using our previous article as an example, here is the full code.


import azure.functions as func
import logging


import os
import base64
import json

from mimetypes import guess_type

from openai import AzureOpenAI
from azure.identity import DefaultAzureCredential, get_bearer_token_provider

app = func.FunctionApp(http_auth_level=func.AuthLevel.ADMIN)


@app.route(route="process_image")
def process_image_custom(req: func.HttpRequest) -> func.HttpResponse:
    instructions = str(req.params.get('instructions'))
    api_base = os.environ["GPT4V_ENDPOINT"]

    token_provider = get_bearer_token_provider(DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default")

    deployment_name = os.environ["GPT4V_DEPLOYMENT"]
    api_version = '2023-12-01-preview'

    # Read the raw image bytes from the request body and save them to a temporary file
    image = req.get_body()
    image_path = "/tmp/temp.jpg"

    with open(image_path, "wb") as image_bytes:
        image_bytes.write(image)

    # Convert the image into a base64 data URL accepted by the vision model
    encoded_image_url = local_image_to_data_url(image_path)

    # The client authenticates with the managed identity token instead of an API key
    client = AzureOpenAI(
        azure_ad_token_provider=token_provider,
        api_version=api_version,
        base_url=f"{api_base}openai/deployments/{deployment_name}/extensions",
    )
    response = client.chat.completions.create(
        model=deployment_name,
        messages=[
            { "role": "system", "content": "You are a helpful assistant helping to identify agricultural vehicles, providing answers in russian" },
            { "role": "user", "content": [  
                { 
                    "type": "text", 
                    "text": instructions 
                },
                { 
                    "type": "image_url",
                    "image_url": {
                        "url": encoded_image_url
                    }
                }
            ] } 
        ],
        max_tokens=1000 
    )

    output = response.choices[0].message.content

    
    return func.HttpResponse(output)




def local_image_to_data_url(image_path):
    # Guess the MIME type from the file extension, defaulting to a generic binary type
    mime_type, _ = guess_type(image_path)
    if mime_type is None:
        mime_type = 'application/octet-stream'

    with open(image_path, "rb") as image_file:
        base64_encoded_data = base64.b64encode(image_file.read()).decode('utf-8')

    return f"data:{mime_type};base64,{base64_encoded_data}"


 

I hope you found this article helpful.
