How do I have a conversation with the Azure OpenAI/GPT API?

In my blog post “Getting Started with Azure OpenAI and PowerShell” I explored how to use the completion endpoint in Azure OpenAI to get answers to single questions/prompts with no context history. Now we are taking it to the next step by having a conversation with the AI. So how do we actually do this? Did you know that in every API call you need to include the whole conversation history to keep the context going?

If I send a question to the chat API like this:
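A minimal request body could look something like this. The question and instruction here are just placeholder text, and the message order matches the way the script later in this post builds it:

{
  "messages": [
    {"role": "user", "content": "What are you?"},
    {"role": "system", "content": "You are a helpful assistant."}
  ]
}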

Then you get the following response:
I am an AI language model designed to assist with answering questions and providing useful information.

Maybe you want to ask the API to elaborate on that, to get more details.
The next request then needs to look like this:
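Assuming the first exchange looked like the example above, the follow-up request could look something like this (again with placeholder text), with the history included before the newest question and the system message:

{
  "messages": [
    {"role": "user", "content": "What are you?"},
    {"role": "assistant", "content": "I am an AI language model designed to assist with answering questions and providing useful information."},
    {"role": "user", "content": "Can you elaborate on that?"},
    {"role": "system", "content": "You are a helpful assistant."}
  ]
}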

Maybe you noticed the order here? Reading from the bottom, you have the system message (the instruction for the AI), then the newest prompt/question, and above that the whole conversation history.

The big question now is, how much history do we keep? Does it make sense to keep the whole history, or should it be summarized? Do I store the conversation in SQL and use keywords from the database to base my context on? This comes down to cost and the type of model you are using.
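As a simple illustration of one option (not what the script later in this post does, which keeps the full history), if you stored the history as a list of message objects instead of a single string, trimming it to the last few messages before each call could look something like this sketch:

# Hypothetical history trimming: keep only the newest messages to stay under the model's token limit.
$maxMessages = 10
$history = @(
    @{ role = "user"; content = "What are you?" },
    @{ role = "assistant"; content = "I am an AI language model..." }
)
if ($history.Count -gt $maxMessages) {
    # Drop the oldest messages and keep the most recent ones
    $history = @($history | Select-Object -Last $maxMessages)
}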

GPT-3.5 has a limit of 4,096 tokens, while GPT-4 has a variant (GPT-4-32K) that supports up to 32,768 tokens. The tokens are divided between prompts and completions: prompt tokens are what you pay for when sending a new question together with the history, while completion tokens are what you pay for when receiving the latest answer.
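If you want to see how the tokens were spent on a given call, the chat completions response includes a usage object. Assuming a $Request variable like the one in the script below, you can read it like this:

# Token usage reported back by the chat completions endpoint
$Request.usage.prompt_tokens       # tokens spent on the question + history + system message
$Request.usage.completion_tokens   # tokens spent on the answer
$Request.usage.total_tokens        # the sum of the two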

So what is a token?
The general rule of thumb is that roughly every 4 characters is 1 token, though this depends on the type and length of the text. OpenAI has a token calculator you can use to figure out how many tokens a prompt will cost. A text of about 500 characters that I pasted was calculated to be around 100 tokens.
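If you just want a rough estimate in PowerShell, the 4-characters-per-token rule of thumb can be turned into a one-liner. This is only an approximation; the real tokenizer can differ quite a bit:

# Rough token estimate based on the ~4 characters per token rule of thumb
$text = "Paste the text you plan to send to the API here."
$estimatedTokens = [math]::Ceiling($text.Length / 4)
"Estimated tokens: $estimatedTokens"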

Azure OpenAI is priced per 1,000 tokens. So if I used the GPT-4-32K model, the text above would cost me $0.006 to send, since prompts are priced at $0.06 per 1,000 tokens. Then you need to see how many tokens the answer/completion is, which is priced at $0.12 per 1,000 tokens. This does not seem that expensive, but when you start feeding the API large texts that you make multiple changes to, it can quickly add up if you are in a 10,000-user enterprise organization.
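To make the arithmetic above concrete, here is a small sketch that estimates the cost of a single call with the GPT-4-32K prices mentioned above. The completion size is just an assumed number:

# Hypothetical cost estimate for one call against the GPT-4-32K model
$promptTokens         = 100      # the ~500 character prompt from the example above
$completionTokens     = 200      # assumed length of the answer
$promptPricePer1K     = 0.06     # USD per 1,000 prompt tokens
$completionPricePer1K = 0.12     # USD per 1,000 completion tokens
$cost = ($promptTokens / 1000 * $promptPricePer1K) + ($completionTokens / 1000 * $completionPricePer1K)
"Estimated cost: $cost USD"      # 0.03 USD in this example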

I have created a PowerShell script that uses the chat endpoint and the GPT-4 model to have a conversation. The script stores the context in a variable between each run of the Get-Chat function. It can be made more advanced, but this is more of a proof of concept to show you how to have a conversation and what possibilities there are.

If you read my Getting Started with Azure OpenAI and PowerShell post, you will learn how to get the API key you need and how to set up your environment/model. I'm also using the same token function, so you can authenticate with all three methods from my previous post.

function Get-AzureOpenAIToken {
    <#
    .SYNOPSIS
    Get an Azure token for a user or managed identity that's required to authenticate to Azure OpenAI with the REST API.
    Also constructs the header if you are using an Azure OpenAI API key instead of Azure AD authentication.
    .PARAMETER ManagedIdentity
    Use this parameter if you want to use a managed identity to authenticate to Azure OpenAI.
    .PARAMETER User
    Use this parameter if you want to use a user to authenticate to Azure OpenAI.
    .PARAMETER APIKey
    Use this parameter if you want to use an API key to authenticate to Azure OpenAI.
    .EXAMPLE
    # Acquire an authentication header with an API key, a managed identity, or an interactive user sign-in:
    Get-AzureOpenAIToken -APIKey "ghgkfhgfgfgkhgh"
    Get-AzureOpenAIToken -ManagedIdentity $true
    Get-AzureOpenAIToken -User $true
    .NOTES
    Author: Alexander Holmeset
    Twitter: @AlexHolmeset
    Website: https://www.alexholmeset.blog
    Created: 09-02-2023
    Updated:
    Version history:
    1.0.0 - (09-02-2023) Function created
    #>
    [CmdletBinding()]
    param (
        [Parameter(Mandatory = $false)]
        [string]$APIKey,
        [Parameter(Mandatory = $false)]
        [bool]$ManagedIdentity,
        [Parameter(Mandatory = $false)]
        [bool]$User
    )
    Process {
        $ErrorActionPreference = "Stop"
        # The Az.Accounts module is required for Connect-AzAccount / Get-AzAccessToken.
        if (Get-Module -ListAvailable -Name Az.Accounts) {
            # Write-Host "You have the Az.Accounts module installed"
        }
        else {
            Write-Host "You need to install the Az.Accounts module"
            break
        }
        If (!$MyHeader) {
            If ($ManagedIdentity -eq $true) {
                "managed"
                try {
                    Connect-AzAccount -Identity
                    $MyTokenRequest = Get-AzAccessToken -ResourceUrl "https://cognitiveservices.azure.com"
                    $MyToken = $MyTokenRequest.token
                    If (!$MyToken) {
                        Write-Warning "Failed to get API access token!"
                        Exit 1
                    }
                    $Global:MyHeader = @{"Authorization" = "Bearer $MyToken" }
                }
                catch [System.Exception] {
                    Write-Warning "Failed to get Access Token, Error message: $($_.Exception.Message)"; break
                }
            }
            If ($User -eq $true) {
                "USER"
                try {
                    Connect-AzAccount
                    $MyTokenRequest = Get-AzAccessToken -ResourceUrl "https://cognitiveservices.azure.com"
                    $MyToken = $MyTokenRequest.token
                    If (!$MyToken) {
                        Write-Warning "Failed to get API access token!"
                        Exit 1
                    }
                    $Global:MyHeader = @{"Authorization" = "Bearer $MyToken" }
                }
                catch [System.Exception] {
                    Write-Warning "Failed to get Access Token, Error message: $($_.Exception.Message)"; break
                }
            }
            If ($APIkey) {
                "APIKEY"
                $Global:MyHeader = @{"api-key" = $apikey }
            }
        }
    }
}
function Get-Chat {
    <#
    .SYNOPSIS
    Get a response from the chat endpoint in Azure OpenAI.
    .PARAMETER DeploymentName
    A deployment name should be provided.
    .PARAMETER ResourceName
    A resource name should be provided.
    .PARAMETER Prompt
    A prompt should be provided.
    .PARAMETER AssitantInstruction
    An assistant instruction (system message) should be provided.
    .EXAMPLE
    Get-Chat -DeploymentName $DeploymentName -ResourceName $ResourceName -Prompt "What is the meaning of life?" -AssitantInstruction $AssitantInstruction
    .NOTES
    Author: Alexander Holmeset
    Twitter: @AlexHolmeset
    Website: https://www.alexholmeset.blog
    Created: 09-02-2023
    Updated:
    Version history:
    1.0.0 - (09-02-2023) Function created
    #>
    [CmdletBinding()]
    param (
        [parameter(Mandatory = $true, HelpMessage = "Your Azure OpenAI deployment name")]
        [ValidateNotNullOrEmpty()]
        [string]$DeploymentName,
        [parameter(Mandatory = $true, HelpMessage = "Your Azure OpenAI resource name")]
        [ValidateNotNullOrEmpty()]
        [string]$ResourceName,
        [parameter(Mandatory = $true, HelpMessage = "Your Azure OpenAI prompt")]
        [ValidateNotNullOrEmpty()]
        [string]$Prompt,
        [parameter(Mandatory = $true, HelpMessage = "Your Azure OpenAI assistant instructions")]
        [ValidateNotNullOrEmpty()]
        [string]$AssitantInstruction
    )
    Process {
        $ErrorActionPreference = "Stop"
        $APIVersion = "2023-03-15-preview"
        # Construct URI
        $uri = "https://$ResourceName.openai.azure.com/openai/deployments/$DeploymentName/chat/completions?api-version=$ApiVersion"
        # Append the newest user prompt to the conversation history kept in a script-scoped variable
        $script:conversation += @"
{"role": "user", "content": "$prompt"},
"@
        # Construct Body: the whole history first, then the system instruction
        $Body = @"
{
"messages":
[
$script:conversation
{"role": "system", "content": "$AssitantInstruction"}
]
}
"@
        try {
            # Output the request body so you can see what is being sent
            $body
            $Global:Request = Invoke-RestMethod -Method POST -Uri $uri -ContentType "application/json" -Body $body -Headers $Global:MyHeader
            # Append the assistant's answer to the history so the next call keeps the context
            $script:conversation += @"
{"role": "assistant", "content": "$($Request.choices.message.content)"},
"@
        }
        catch [System.Exception] {
            Write-Warning "Failed to POST request: $($_.Exception.Message)"; break
        }
        "User: $Prompt"
        "Assistant: $($Request.choices.message.content)"
    }
}
# Authenticate with an API key (or use -ManagedIdentity $true / -User $true instead)
Get-AzureOpenAIToken -APIKey "Enter API key"
$DeploymentName = "Enter deployment name"
$ResourceName = "Enter resource name"
$AssitantInstruction = "You are a helpful assistant."
$Prompt = "What is Azure OpenAI?"
Get-Chat -DeploymentName $DeploymentName -ResourceName $ResourceName -Prompt $Prompt -AssitantInstruction $AssitantInstruction
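Because Get-Chat appends every prompt and answer to $script:conversation, you can simply call it again in the same session to continue the conversation, for example:

# Ask a follow-up question; the previous exchange is sent along automatically
$Prompt = "Can you elaborate on that?"
Get-Chat -DeploymentName $DeploymentName -ResourceName $ResourceName -Prompt $Prompt -AssitantInstruction $AssitantInstruction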


