Public/New-ChatGPTConversation.ps1
function New-ChatGPTConversation {

    <#
    .SYNOPSIS
        Create a new ChatGPT conversation, or get a chat completion result directly (if you specify the prompt parameter).
    .DESCRIPTION
        Create a new ChatGPT conversation. You can chat with the OpenAI service just like chatting with a human. You can also get the chat completion result immediately if you specify the prompt parameter.
    .PARAMETER api_key
        Your OpenAI API key. You can also set it in the environment variable OPENAI_API_KEY, or OPENAI_API_KEY_AZURE if you use the Azure OpenAI API. If you use multiple environments, you can use OPENAI_API_KEY_AZURE_$environment to define the API key for each environment.
    .PARAMETER model
        The model to use for this request. You can also set it in the environment variable OPENAI_CHAT_MODEL, or OPENAI_CHAT_DEPLOYMENT_AZURE if you use the Azure OpenAI API. If you use multiple environments, you can use OPENAI_CHAT_DEPLOYMENT_AZURE_$environment to define the model for each environment. You can use engine or deployment as an alias of model.
    .PARAMETER endpoint
        The endpoint to use for this request. You can also set it in the environment variable OPENAI_ENDPOINT, or OPENAI_ENDPOINT_AZURE if you use the Azure OpenAI API. If you use multiple environments, you can use OPENAI_ENDPOINT_AZURE_$environment to define the endpoint for each environment.
    .PARAMETER azure
        Use this switch if you use the Azure OpenAI API.
    .PARAMETER system
        The system prompt. This is a string you can use to define the role the model should play, for example, "You are a chatbot, please answer the user's question according to the user's language."
        If you provide a file path, the file content is read as the system prompt. You can also specify a URL, and its content is read as the system prompt. You can read the prompt from a library (https://github.com/code365opensource/promptlibrary) by using "lib:xxxxx" as the prompt, for example, "lib:fitness".
    .PARAMETER prompt
        If you want to get the result immediately, use this parameter to define the prompt. It will not start a chat conversation.
        If you provide a file path, the file content is read as the prompt. You can also specify a URL, and its content is read as the prompt. You can read the prompt from a library (https://github.com/code365opensource/promptlibrary) by using "lib:xxxxx" as the prompt, for example, "lib:fitness".
    .PARAMETER config
        The dynamic settings for the API call; it can cover the requirements of each model. Pass a custom object to this parameter, like @{temperature=1;max_tokens=1024}.
    .PARAMETER environment
        The environment name. If you use the Azure OpenAI API, you can use this parameter to define the environment name, which is used to read the API key, model and endpoint from environment variables. If the environment does not exist, the default environment is used. You can use env as an alias of this parameter.
    .PARAMETER api_version
        The API version. If you use the Azure OpenAI API, you can use this parameter to define the API version; the default value is 2023-09-01-preview.
    .PARAMETER outFile
        If you want to save the result to a file, use this parameter to set the file path.
    .PARAMETER json
        Use this switch to ask the service to return the result in JSON format (it sets response_format to json_object in the request body).
    .PARAMETER local
        If you want to use local LLMs, like a model hosted by ollama, use this switch. You can also use "ollama" as an alias.
    .PARAMETER context
        If you want to pass some dynamic values to the prompt, use this parameter. It can be anything; just pass a custom PowerShell object here.
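    .EXAMPLE
        chat -system "Translate everything I say into {{lang}}." -context @{lang = "French"} -config @{temperature = 0.2}
        An illustrative example (not part of the original help; values are samples): the {{lang}} placeholder in the system prompt is replaced with the value supplied through -context, and the -config hashtable is merged into the request body before the call is sent.
    .EXAMPLE
        chat -prompt "c:\temp\prompt.txt" -outFile "c:\temp\result.txt"
        An illustrative example (not part of the original help; paths are samples): run in prompt mode, read the prompt from a file, and save the completion result to a file instead of writing it to the pipeline and clipboard.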
    .EXAMPLE
        New-ChatGPTConversation
        Create a new ChatGPT conversation, using the OpenAI service with all the default settings.
    .EXAMPLE
        New-ChatGPTConversation -azure
        Create a new ChatGPT conversation, using the Azure OpenAI service with all the default settings.
    .EXAMPLE
        chat -azure
        Create a new ChatGPT conversation by the cmdlet's alias (chat), using the Azure OpenAI service with all the default settings.
    .EXAMPLE
        New-ChatGPTConversation -api_key "your api key" -model "your model name"
        Create a new ChatGPT conversation, using the OpenAI service with your API key and model name.
    .EXAMPLE
        New-ChatGPTConversation -api_key "your api key" -model "your deployment name" -azure
        Create a new ChatGPT conversation, using the Azure OpenAI service with your API key and deployment name.
    .EXAMPLE
        New-ChatGPTConversation -api_key "your api key" -model "your deployment name" -azure -system "You are a chatbot, please answer the user's question according to the user's language."
        Create a new ChatGPT conversation, using the Azure OpenAI service with your API key and deployment name, and define the system prompt.
    .EXAMPLE
        New-ChatGPTConversation -api_key "your api key" -model "your deployment name" -azure -system "You are a chatbot, please answer the user's question according to the user's language." -endpoint "https://api.openai.com/v1/completions"
        Create a new ChatGPT conversation, using the Azure OpenAI service with your API key and deployment name, and define the system prompt and endpoint.
    .EXAMPLE
        chat -azure -system "You are a chatbot, please answer the user's question according to the user's language." -environment "sweden"
        Create a new ChatGPT conversation by the cmdlet's alias (chat), using the Azure OpenAI service with the API key, model and endpoint defined in the environment variables OPENAI_API_KEY_AZURE_SWEDEN, OPENAI_CHAT_DEPLOYMENT_AZURE_SWEDEN and OPENAI_ENDPOINT_AZURE_SWEDEN.
    .EXAMPLE
        chat -azure -api_version "2021-09-01-preview"
        Create a new ChatGPT conversation by the cmdlet's alias (chat), using the Azure OpenAI service with the API version 2021-09-01-preview.
    .EXAMPLE
        chat -azure -prompt "c:\temp\prompt.txt"
        Create a new ChatGPT conversation by the cmdlet's alias (chat), using the Azure OpenAI service with the prompt read from a file.
    .EXAMPLE
        chat -azure -system "c:\temp\system.txt" -prompt "c:\temp\prompt.txt"
        Create a new ChatGPT conversation by the cmdlet's alias (chat), using the Azure OpenAI service with the system prompt and prompt read from files.
    .EXAMPLE
        chat -local -model "llama3"
        Create a new ChatGPT conversation using local LLMs, for example, llama3. The default endpoint is http://localhost:11434/v1/chat/completions; you can modify this endpoint as well.
    .OUTPUTS
        System.String, the completion result. If you use stream mode, it will not return anything.
    .LINK
        https://github.com/chenxizhang/openai-powershell
    #>

    [CmdletBinding(DefaultParameterSetName = "default")]
    [Alias("chatgpt")][Alias("chat")][Alias("gpt")]
    param(
        [Parameter(ParameterSetName = "local", Mandatory = $true)]
        [Alias("ollama")]
        [switch]$local,
        [Parameter(ParameterSetName = "azure", Mandatory = $true)]
        [switch]$azure,
        [Parameter(ParameterSetName = "default")]
        [Parameter(ParameterSetName = "azure")]
        [string]$api_key,
        [Parameter(ParameterSetName = "default")]
        [Parameter(ParameterSetName = "azure")]
        [Parameter(ParameterSetName = "local", Mandatory = $true)]
        [Alias("engine", "deployment")]
        [string]$model,
        [Parameter(ParameterSetName = "default")]
        [Parameter(ParameterSetName = "azure")]
        [Parameter(ParameterSetName = "local")]
        [string]$endpoint,
        [Parameter(ParameterSetName = "default")]
        [Parameter(ParameterSetName = "azure")]
        [Parameter(ParameterSetName = "local")]
        [string]$system = "You are a chatbot, please answer the user's question according to the user's language.",
        [Parameter(ParameterSetName = "default")]
        [Parameter(ParameterSetName = "azure")]
        [Parameter(ParameterSetName = "local")]
        [string]$prompt = "",
        [Parameter(ParameterSetName = "default")]
        [Parameter(ParameterSetName = "azure")]
        [Parameter(ParameterSetName = "local")]
        [PSCustomObject]$config,
        [Parameter(ParameterSetName = "azure")]
        [Alias("env")]
        [string]$environment,
        [Parameter(ParameterSetName = "azure")]
        [string]$api_version = "2023-09-01-preview",
        [Parameter(ParameterSetName = "default")]
        [Parameter(ParameterSetName = "azure")]
        [Parameter(ParameterSetName = "local")]
        [string]$outFile,
        [Parameter(ParameterSetName = "default")]
        [Parameter(ParameterSetName = "azure")]
        [Parameter(ParameterSetName = "local")]
        [switch]$json,
        [Parameter(ParameterSetName = "default")]
        [Parameter(ParameterSetName = "azure")]
        [Parameter(ParameterSetName = "local")]
        [PSCustomObject]$context
    )
    BEGIN {

        Write-Verbose "Parameter received`n$($PSBoundParameters | Out-String)"
        Write-Verbose "Environment variable detected.`n$(Get-ChildItem Env:OPENAI_* | Out-String)"

        switch ($PSCmdlet.ParameterSetName) {
            "default" {
                $api_key = if ($api_key) { $api_key } else { $env:OPENAI_API_KEY }
                $model = if ($model) { $model } else { if ($env:OPENAI_CHAT_MODEL) { $env:OPENAI_CHAT_MODEL } else { "gpt-3.5-turbo" } }
                $endpoint = if ($endpoint) { $endpoint } else { "https://api.openai.com/v1/chat/completions" }
            }
            "azure" {
                $api_key = if ($api_key) { $api_key } else { Get-FirstNonNullItemInArray("OPENAI_API_KEY_AZURE_$environment", "OPENAI_API_KEY_AZURE") }
                $model = if ($model) { $model } else { Get-FirstNonNullItemInArray("OPENAI_CHAT_DEPLOYMENT_AZURE_$environment", "OPENAI_CHAT_DEPLOYMENT_AZURE") }
                $endpoint = if ($endpoint) { "{0}openai/deployments/$model/chat/completions?api-version=$api_version" -f $endpoint } else { "{0}openai/deployments/$model/chat/completions?api-version=$api_version" -f (Get-FirstNonNullItemInArray("OPENAI_ENDPOINT_AZURE_$environment", "OPENAI_ENDPOINT_AZURE")) }
            }
            "local" {
                $endpoint = if ($endpoint) { $endpoint } else { "http://localhost:11434/v1/chat/completions" }
                $api_key = if ($api_key) { $api_key } else { "local" }
            }
        }

        Write-Verbose "Parameter parsed. api_key: $api_key, model: $model, endpoint: $endpoint"

        $hasError = $false

        if ((!$azure) -and ((Test-OpenAIConnectivity) -eq $False)) {
            Write-Error $resources.openai_unavaliable
            $hasError = $true
        }

        if (!$api_key) {
            Write-Error $resources.error_missing_api_key
            $hasError = $true
        }

        if (!$model) {
            Write-Error $resources.error_missing_engine
            $hasError = $true
        }

        # if the user didn't specify the stream parameter, and the current PowerShell version is greater than 5, use stream mode
        if ($PSVersionTable['PSVersion'].Major -gt 5) {
            Write-Verbose "Powershell 6.0+ detected, stream mode is not specified, we will use the stream mode by default."
            $stream = $true
        }
    }
    PROCESS {

        if ($hasError) {
            return
        }

        $telemetries = @{
            type = $PSCmdlet.ParameterSetName
        }

        # if the prompt is not empty and it is a file, read the file as the prompt
        $parsedprompt = Get-PromptContent($prompt)
        $prompt = $parsedprompt.content

        # if the user provides a context, inject the data into the prompt by replacing each context key with its value
        if ($context) {
            Write-Verbose "Context received: $($context | ConvertTo-Json -Depth 10)"
            foreach ($key in $context.keys) {
                $prompt = $prompt -replace "{{$key}}", $context[$key]
            }
            Write-Verbose "Prompt after context injected: $prompt"
        }

        $telemetries.Add("promptType", $parsedprompt.type)
        $telemetries.Add("promptLib", $parsedprompt.lib)

        # if the system prompt is not empty and it is a file, read the file as the system prompt
        $parsedsystem = Get-PromptContent($system)
        $system = $parsedsystem.content

        # if the user provides a context, inject the data into the system prompt by replacing each context key with its value
        if ($context) {
            Write-Verbose "Context received: $($context | ConvertTo-Json -Depth 10)"
            foreach ($key in $context.keys) {
                $system = $system -replace "{{$key}}", $context[$key]
            }
            Write-Verbose "System prompt after context injected: $system"
        }

        $telemetries.Add("systemPromptType", $parsedsystem.type)
        $telemetries.Add("systemPromptLib", $parsedsystem.lib)

        # collect the telemetry data
        Submit-Telemetry -cmdletName $MyInvocation.MyCommand.Name -innovationName $MyInvocation.InvocationName -props $telemetries

        if ($prompt.Length -gt 0) {
            Write-Verbose "Prompt received: $prompt, so it is in prompt mode, not in chat mode."
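            # Prompt mode: build a two-message payload (system + user) and send a single request.
            # Azure OpenAI authenticates with an "api-key" header, while the OpenAI service uses an "Authorization: Bearer" header.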
            $messages = @(
                @{
                    role    = "system"
                    content = $system
                },
                @{
                    role    = "user"
                    content = $prompt
                }
            )

            $params = @{
                Uri         = $endpoint
                Method      = "POST"
                Body        = @{model = "$model"; messages = $messages }
                Headers     = if ($azure) { @{"api-key" = "$api_key" } } else { @{"Authorization" = "Bearer $api_key" } }
                ContentType = "application/json;charset=utf-8"
            }

            if ($json) {
                $params.Body.Add("response_format", @{type = "json_object" })
            }

            if ($config) {
                Merge-Hashtable -table1 $params.Body -table2 $config
            }

            $params.Body = ($params.Body | ConvertTo-Json -Depth 10)

            Write-Verbose "Prepare the params for Invoke-WebRequest: $($params|ConvertTo-Json -Depth 10)"

            $response = Invoke-RestMethod @params

            if ($PSVersionTable['PSVersion'].Major -eq 5) {
                Write-Verbose "Powershell 5.0 detected, convert the response to UTF8"
                $dstEncoding = [System.Text.Encoding]::GetEncoding('iso-8859-1')
                $srcEncoding = [System.Text.Encoding]::UTF8
                $response.choices | ForEach-Object {
                    $_.message.content = $srcEncoding.GetString([System.Text.Encoding]::Convert($srcEncoding, $dstEncoding, $srcEncoding.GetBytes($_.message.content)))
                }
            }

            Write-Verbose "Response converted to UTF8: $($response | ConvertTo-Json -Depth 10)"

            $result = $response.choices[0].message.content
            Write-Verbose "Response parsed to plain text: $result"

            # if the user specifies outFile, write the response to the file
            if ($outFile) {
                Write-Verbose "Outfile specified, write the response to the file: $outFile"
                $result | Out-File -FilePath $outFile -Encoding utf8
            }
            else {
                Write-Verbose "Outfile not specified, output the response to pipeline"
                Write-Output $result

                # if the user does not specify outFile, copy the response to the clipboard
                Set-Clipboard $result
                Write-Host "Copied the response to clipboard." -ForegroundColor Green
            }
        }
        else {
            Write-Verbose "Prompt not received, so it is in chat mode."

            $index = 1
            $welcome = "`n{0}`n{1}" -f ($resources.welcome_chatgpt -f $(if ($azure) { " $($resources.azure_version) " } else { "" }), $model), $resources.shortcuts

            Write-Host $welcome -ForegroundColor Yellow
            Write-Host $system -ForegroundColor Cyan

            $messages = @()
            $systemPrompt = @(
                [PSCustomObject]@{
                    role    = "system"
                    content = $system
                }
            )

            Write-Verbose "Prepare the system prompt: $($systemPrompt|ConvertTo-Json -Depth 10)"

            while ($true) {
                Write-Verbose "Start a new loop - let's chat!"

                $current = $index++
                $prompt = Read-Host -Prompt "`n[$current] $($resources.prompt)"
                Write-Verbose "Prompt received: $prompt"

                if ($prompt -in ("q", "bye")) {
                    Write-Verbose "User pressed $prompt, so we will quit the chat."
                    break
                }

                if ($prompt -eq "m") {
                    $os = [System.Environment]::OSVersion.Platform
                    if ($os -notin @([System.PlatformID]::Win32NT, [System.PlatformID]::Win32Windows, [System.PlatformID]::Win32S)) {
                        Write-Host "Multi-line input is not supported on this platform. Please use another platform or use the file mode."
                        continue
                    }

                    Write-Verbose "User pressed m, so we will prompt a window to collect user input in multi-line mode."
                    $prompt = Read-MultiLineInputBoxDialog -Message $resources.multi_line_prompt -WindowTitle $resources.multi_line_prompt -DefaultText ""
                    Write-Verbose "Prompt received: $prompt"

                    if ($null -eq $prompt) {
                        Write-Host $resources.cancel_button_message
                        continue
                    }
                    else {
                        Write-Host "$($resources.multi_line_message)`n$prompt"
                    }
                }

                if ($prompt -eq "f") {
                    $os = [System.Environment]::OSVersion.Platform
                    if ($os -notin @([System.PlatformID]::Win32NT, [System.PlatformID]::Win32Windows, [System.PlatformID]::Win32S)) {
                        Write-Host "File input is not supported on this platform. Please use another platform or use the file input mode."
                        continue
                    }

                    Write-Verbose "User pressed f, so we will prompt a window to collect user input from a file."
                    $file = Read-OpenFileDialog -WindowTitle $resources.file_prompt
                    Write-Verbose "File received: $file"

                    if (!($file)) {
                        Write-Host $resources.cancel_button_message
                        continue
                    }
                    else {
                        $prompt = Get-Content $file -Encoding utf8
                        Write-Host "$($resources.multi_line_message)`n$prompt"
                    }
                }

                $messages += [PSCustomObject]@{
                    role    = "user"
                    content = $prompt
                }

                Write-Verbose "Prepare the messages: $($messages|ConvertTo-Json -Depth 10)"

                $params = @{
                    Uri         = $endpoint
                    Method      = "POST"
                    Body        = @{model = "$model"; messages = ($systemPrompt + $messages[-5..-1]); stream = if ($stream) { $true } else { $false } }
                    Headers     = if ($azure) { @{"api-key" = "$api_key" } } else { @{"Authorization" = "Bearer $api_key" } }
                    ContentType = "application/json;charset=utf-8"
                }

                if ($json) {
                    $params.Body.Add("response_format", @{type = "json_object" })
                }

                if ($config) {
                    Merge-Hashtable -table1 $params.Body -table2 $config
                }

                $params.Body = ($params.Body | ConvertTo-Json -Depth 10)

                Write-Verbose "Prepare the params for Invoke-WebRequest: $($params|ConvertTo-Json -Depth 10)"

                try {
                    if ($stream) {
                        Write-Verbose "Stream mode detected, so we will use Invoke-WebRequest to stream the response."
                        $client = New-Object System.Net.Http.HttpClient
                        $body = $params.Body
                        Write-Verbose "body: $body"

                        $request = [System.Net.Http.HttpRequestMessage]::new()
                        $request.Method = "POST"
                        $request.RequestUri = $params.Uri
                        $request.Headers.Clear()
                        $request.Content = [System.Net.Http.StringContent]::new(($body), [System.Text.Encoding]::UTF8)
                        $request.Content.Headers.Clear()
                        $request.Content.Headers.Add("Content-Type", "application/json;charset=utf-8")

                        if ($azure) {
                            $request.Headers.Add("api-key", $api_key)
                        }
                        else {
                            $request.Headers.Add("Authorization", "Bearer $api_key")
                        }

                        Write-Verbose "Prepared the client"

                        $task = $client.Send($request)
                        Write-Verbose "Got task result: $task"

                        $response = $task.Content.ReadAsStream()
                        $reader = [System.IO.StreamReader]::new($response)
                        Write-Verbose "Got task stream response and reader: $response, $reader"

                        $result = "" # message from the api
                        Write-Host -ForegroundColor Red "`n[$current] " -NoNewline

                        while ($true) {
                            $line = $reader.ReadLine()
                            Write-Verbose "Read line from stream: $line"

                            if (($null -eq $line) -or ($line -eq "data: [DONE]")) {
                                break
                            }

                            $chunk = ($line -replace "data: ", "" | ConvertFrom-Json).choices.delta.content
                            Write-Host $chunk -NoNewline -ForegroundColor Green
                            Write-Verbose "Chunk received: $chunk"
                            $result += $chunk

                            Start-Sleep -Milliseconds 50
                        }

                        $reader.Close()
                        $reader.Dispose()

                        $messages += [PSCustomObject]@{
                            role    = "assistant"
                            content = $result
                        }
                        Write-Verbose "Message combined. $($messages|ConvertTo-Json -Depth 10)"

                        Write-Host ""
                    }
                    else {
                        Write-Verbose "It is not in stream mode."
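                        # Non-stream mode: issue a single Invoke-RestMethod call; the response's usage field
                        # provides total/prompt/completion token counts, which are shown together with the answer.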
                        $stopwatch = [System.Diagnostics.Stopwatch]::StartNew()
                        $response = Invoke-RestMethod @params
                        Write-Verbose "Response received: $($response| ConvertTo-Json -Depth 10)"
                        $stopwatch.Stop()

                        $result = $response.choices[0].message.content
                        $total_tokens = $response.usage.total_tokens
                        $prompt_tokens = $response.usage.prompt_tokens
                        $completion_tokens = $response.usage.completion_tokens

                        Write-Verbose "Response parsed to plain text: $result, total_tokens: $total_tokens, prompt_tokens: $prompt_tokens, completion_tokens: $completion_tokens"

                        if ($PSVersionTable['PSVersion'].Major -le 5) {
                            Write-Verbose "Powershell 5.0 detected, convert the response to UTF8"
                            $dstEncoding = [System.Text.Encoding]::GetEncoding('iso-8859-1')
                            $srcEncoding = [System.Text.Encoding]::UTF8
                            $result = $srcEncoding.GetString([System.Text.Encoding]::Convert($srcEncoding, $dstEncoding, $srcEncoding.GetBytes($result)))
                            Write-Verbose "Response converted to UTF8: $result"
                        }

                        $messages += [PSCustomObject]@{
                            role    = "assistant"
                            content = $result
                        }
                        Write-Verbose "Message combined. $($messages|ConvertTo-Json -Depth 10)"

                        Write-Host -ForegroundColor Red ("`n[$current] $($resources.response)" -f $total_tokens, $prompt_tokens, $completion_tokens)
                        Write-Host $result -ForegroundColor Green
                    }
                }
                catch {
                    Write-Error $_
                }
            }
        }
    }
}
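# Illustrative environment setup (sample values, not part of the original script): define a named
# Azure environment so that `chat -azure -environment sweden` resolves its key, deployment and
# endpoint from environment variables automatically. The endpoint should end with a trailing slash,
# because the function appends "openai/deployments/..." to it.
#   $env:OPENAI_API_KEY_AZURE_SWEDEN         = "<your Azure OpenAI key>"
#   $env:OPENAI_CHAT_DEPLOYMENT_AZURE_SWEDEN = "<your deployment name>"
#   $env:OPENAI_ENDPOINT_AZURE_SWEDEN        = "https://<your-resource>.openai.azure.com/"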