Public/New-ChatGPTConversation.ps1
function New-ChatGPTConversation {
    <#
    .SYNOPSIS
        Create a new ChatGPT conversation, or get a chat completion result directly if you specify the prompt parameter.
    .DESCRIPTION
        Create a new ChatGPT conversation. You can chat with the OpenAI service just as you would chat with a human. You can also get the chat completion result directly if you specify the prompt parameter.
    .PARAMETER api_key
        The API key used to access the OpenAI service. If not specified, the API key is read from the environment variable OPENAI_API_KEY. You can also use "token", "access_token" or "accesstoken" as an alias.
    .PARAMETER model
        The model to use for this request. You can also set it in the environment variable OPENAI_API_MODEL. If you are using the Azure OpenAI Service, the model should be the deployment name you created in the portal.
    .PARAMETER endpoint
        The endpoint to use for this request. You can also set it in the environment variable OPENAI_API_ENDPOINT. A few special values are recognized as shortcuts for the endpoint, such as "ollama", "local", "kimi" or "zhipu".
    .PARAMETER system
        The system prompt. This is a string you can use to define the role the assistant should play, for example, "You are a chatbot, please answer the user's question according to the user's language."
        If you provide a file path, the file content is read and used as the system prompt. You can also specify a URL, and the content of that URL is used as the system prompt. You can read the prompt from a library (https://github.com/code365opensource/promptlibrary) by using "lib:xxxxx" as the prompt, for example, "lib:fitness".
    .PARAMETER prompt
        If you want to get the result immediately, you can use this parameter to define the prompt. It will not start the chat conversation.
        If you provide a file path, the file content is read and used as the prompt. You can also specify a URL, and the content of that URL is used as the prompt. You can read the prompt from a library (https://github.com/code365opensource/promptlibrary) by using "lib:xxxxx" as the prompt, for example, "lib:fitness".
    .PARAMETER config
        The dynamic settings for the API call; it can cover the specific requirements of each model. Pass a custom object to this parameter, like @{temperature=1;max_tokens=1024}.
    .PARAMETER outFile
        If you want to save the result to a file, you can use this parameter to set the file path. You can also use "out" as an alias.
    .PARAMETER context
        If you want to pass some dynamic values to the prompt, you can use this parameter. It can be anything; just pass a custom PowerShell object. You define the variables in the system prompt or user prompt by using the {{your_variable_name}} syntax, and then pass the data through the context parameter, like @{your_variable_name="your value"}. If there are multiple variables, use @{variable1="value1";variable2="value2"}, as shown in the example below.
    .PARAMETER headers
        If you want to pass some custom headers to the API call, you can use this parameter. Pass a custom hashtable, like @{header1="value1";header2="value2"}.
    .PARAMETER json
        Send the response in JSON format.
    .PARAMETER functions
        This is a super powerful feature that supports the function_call capability of OpenAI. You can specify the function name(s), and they will be called automatically when the assistant needs them. You can find all the available function definitions here (https://raw.githubusercontent.com/chenxizhang/openai-powershell/master/code365scripts.openai/Private/functions.json).
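    .EXAMPLE
        chat -system "You are a {{role}}." -context @{role="high school math teacher"}
        An illustrative sketch (not one of the module's original examples): the {{role}} placeholder in the system prompt is expected to be resolved from the -context object, so the conversation starts with the system prompt "You are a high school math teacher."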
    .PARAMETER environment
        If you have multiple environments to use, you can specify the environment name here, and then define the environment in the profile.json file (see the NOTES section below for an illustrative layout). You can also use "profile" or "env" as an alias.
    .PARAMETER env_config
        The path of the profile.json file. The default value is "$env:USERPROFILE/.openai-powershell/profile.json".
    .EXAMPLE
        New-ChatGPTConversation
        Use the OpenAI service with all the default settings, reading the API key from the environment variable (OPENAI_API_KEY), and enter chat mode.
    .EXAMPLE
        New-ChatGPTConversation -api_key "your api key" -model "gpt-3.5-turbo"
        Use the OpenAI service with the specified API key and model, and enter chat mode.
    .EXAMPLE
        chat -system "You help me to translate the text to Chinese."
        Use the OpenAI service to translate text (system prompt specified), reading the API key from the environment variable (OPENAI_API_KEY), and enter chat mode.
    .EXAMPLE
        chat -endpoint "ollama" -model "llama3"
        Use a local model served by ollama, and enter chat mode.
    .EXAMPLE
        chat -endpoint $env:OPENAI_API_ENDPOINT_AZURE -model $env:OPENAI_API_MODEL_AZURE -api_key $env:OPENAI_API_KEY_AZURE
        Use the Azure OpenAI service with the specified API key and model, and enter chat mode.
    .EXAMPLE
        gpt -system "Translate the text to Chinese." -prompt "Hello, how are you?"
        Use the OpenAI service to translate text (system prompt specified), reading the API key from the environment variable (OPENAI_API_KEY) and the model from OPENAI_API_MODEL (if present, otherwise "gpt-3.5-turbo" is used), and get the chat completion result directly.
    .EXAMPLE
        "Hello, how are you?" | gpt -system "Translate the text to Chinese."
        Use the OpenAI service to translate text (system prompt specified, user prompt passed from the pipeline), reading the API key from the environment variable (OPENAI_API_KEY) and the model from OPENAI_API_MODEL (if present, otherwise "gpt-3.5-turbo" is used), and get the chat completion result directly.
    .OUTPUTS
        System.String, the completion result.
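    .NOTES
        The profile.json layout below is an illustrative sketch only, inferred from the properties this cmdlet reads (name, api_key, model, endpoint, config, headers, functions, auth); it is not an official schema.

        {
            "profiles": [
                {
                    "name": "azure-dev",
                    "api_key": "your api key",
                    "model": "your deployment name",
                    "endpoint": "https://your-resource.openai.azure.com/"
                }
            ]
        }

        A saved profile can then be selected with: chat -environment "azure-dev"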
    .LINK
        https://github.com/chenxizhang/openai-powershell
    #>
    [CmdletBinding()]
    [Alias("chatgpt")][Alias("chat")]
    param(
        [Alias("token", "access_token", "accesstoken", "key", "apikey")]
        [string]$api_key,
        [Alias("engine", "deployment")]
        [string]$model,
        [string]$endpoint,
        [string]$system = "You are a chatbot, please answer the user's question according to the user's language.",
        [Alias("settings")]
        [PSCustomObject]$config,
        [Alias("out")]
        [string]$outFile,
        [switch]$json,
        [Alias("variables")]
        [PSCustomObject]$context,
        [PSCustomObject]$headers,
        [string[]]$functions,
        [Alias("profile", "env")]
        [string]$environment,
        [string]$env_config = "$env:USERPROFILE/.openai-powershell/profile.json"
    )
    BEGIN {

        Write-Verbose ($resources.verbose_parameters_received -f ($PSBoundParameters | Out-String))
        Write-Verbose ($resources.verbose_environment_received -f (Get-ChildItem Env:OPENAI_API_* | Out-String))

        if ($environment) {
            if ($env_config -match "\.json$" -and (Test-Path $env_config -PathType Leaf)) {
                $env_config = Get-Content $env_config -Raw -Encoding UTF8
            }

            $parsed_env_config = ($env_config | ConvertFrom-Json | ConvertTo-Hashtable).profiles | Where-Object { $_.name -eq $environment } | Select-Object -First 1

            if ($parsed_env_config) {
                if ($parsed_env_config.api_key -and (!$api_key)) { $api_key = $parsed_env_config.api_key }
                if ($parsed_env_config.model -and (!$model)) { $model = $parsed_env_config.model }
                if ($parsed_env_config.endpoint -and (!$endpoint)) { $endpoint = $parsed_env_config.endpoint }

                if ($parsed_env_config.config) {
                    if ($config) {
                        Merge-Hashtable -table1 $config -table2 $parsed_env_config.config
                    }
                    else {
                        $config = $parsed_env_config.config
                    }
                }

                if ($parsed_env_config.headers) {
                    # for each header, replace {{model}} with the model name and {{guid}} with a newly generated guid
                    $keys = @($parsed_env_config.headers.Keys)
                    $keys | ForEach-Object {
                        $parsed_env_config.headers[$_] = $parsed_env_config.headers[$_] -replace "{{model}}", $model
                        $parsed_env_config.headers[$_] = $parsed_env_config.headers[$_] -replace "{{guid}}", [guid]::NewGuid().ToString()
                    }

                    if ($headers) {
                        Merge-Hashtable -table1 $headers -table2 $parsed_env_config.headers
                    }
                    else {
                        $headers = $parsed_env_config.headers
                    }
                }

                if ($parsed_env_config.auth -and ($parsed_env_config.auth.type -eq "aad") -and $parsed_env_config.auth.aad) {
                    Confirm-DependencyModule -ModuleName "MSAL.ps"
                    $aad = $parsed_env_config.auth.aad
                    if ($aad.clientsecret) {
                        $aad.clientsecret = ConvertTo-SecureString $aad.clientsecret -AsPlainText -Force
                    }
                    $accesstoken = (Get-MsalToken @aad).AccessToken
                    $api_key = $accesstoken
                }

                # if the profile provides a functions definition, merge it into the functions list
                if ($parsed_env_config.functions) {
                    if ($functions) {
                        $functions += $parsed_env_config.functions
                    }
                    else {
                        $functions = $parsed_env_config.functions
                    }
                }
            }
        }

        $api_key = ($api_key, [System.Environment]::GetEnvironmentVariable("OPENAI_API_KEY") | Where-Object { $_.Length -gt 0 } | Select-Object -First 1)
        $model = ($model, [System.Environment]::GetEnvironmentVariable("OPENAI_API_MODEL"), "gpt-3.5-turbo" | Where-Object { $_.Length -gt 0 } | Select-Object -First 1)
        $endpoint = ($endpoint, [System.Environment]::GetEnvironmentVariable("OPENAI_API_ENDPOINT"), "https://api.openai.com/v1/chat/completions" | Where-Object { $_.Length -gt 0 } | Select-Object -First 1)

        # map the endpoint shortcuts ("ollama"/"local", "kimi", "zhipu") to their full chat completion URLs; any other value is used as-is
        $endpoint = switch ($endpoint) {
            { $_ -in ("ollama", "local") } { "http://localhost:11434/v1/chat/completions" }
            "kimi" { "https://api.moonshot.cn/v1/chat/completions" }
            "zhipu" { "https://open.bigmodel.cn/api/paas/v4/chat/completions" }
            default { $endpoint }
        }

        # if a local model is used and no api_key is specified, use a placeholder key
        if ($endpoint -eq "http://localhost:11434/v1/chat/completions" -and !$api_key) {
            $api_key = "local"
        }

        Write-Verbose ($resources.verbose_parameters_parsed -f $api_key, $model, $endpoint)

        $hasError = $false

        if (!$api_key) {
            Write-Error $resources.error_missing_api_key
            $hasError = $true
        }

        if (!$model) {
            Write-Error $resources.error_missing_engine
            $hasError = $true
        }

        if (!$endpoint) {
            Write-Error $resources.error_missing_endpoint
            $hasError = $true
        }

        if ($hasError) { return }

        # if the endpoint contains "openai.azure.com", the user wants to use the Azure OpenAI service, so append the deployment (model) path to the endpoint
        if ($endpoint.EndsWith("openai.azure.com/")) {
            $version = Get-AzureAPIVersion
            $endpoint += "openai/deployments/$model/chat/completions?api-version=$version"
        }

        # add databricks support, it uses basic authorization instead of a bearer token
        $azure = $endpoint.Contains("openai.azure.com")

        $header = if ($azure) {
            # if the api key is a JWT, use it as a bearer token in the Authorization header
            if ($api_key -match "^ey[a-zA-Z0-9-_]+\.[a-zA-Z0-9-_]+\.[a-zA-Z0-9-_]+$") {
                @{"Authorization" = "Bearer $api_key" }
            }
            else {
                @{"api-key" = "$api_key" }
            }
        }
        else {
            # dbrx instruct uses basic authorization
            @{"Authorization" = "$(if($endpoint.Contains("databricks-dbrx-instruct")){"Basic"}else{"Bearer"}) $api_key" }
        }

        # if the user provides custom headers, merge them into the default headers
        if ($headers) {
            Merge-Hashtable -table1 $header -table2 $headers
        }

        # if the user provides functions, read their definitions from the functions file and register the tools and tool_choice through the config parameter
        if ($functions) {
            $tools = @(Get-PredefinedFunctions -names $functions)
            Write-Verbose ($tools | ConvertTo-Json -Depth 10)
            if ($tools.Count -gt 0) {
                if ($null -eq $config) { $config = @{} }
                $config["tools"] = $tools
                $config["tool_choice"] = "auto"
            }
        }

        $telemetries = @{
            type = switch ($endpoint) {
                { $_ -match "openai.azure.com" } { "azure" }
                { $_ -match "localhost" } { "local" }
                { $_ -match "databricks-dbrx" } { "dbrx" }
                { $_ -match "api.openai.com" } { "openai" }
                { $_ -match "platform.moonshot.cn" } { "kimi" }
                { $_ -match "open.bigmodel.cn" } { "zhipu" }
                default { $endpoint }
            }
        }

        # if the system prompt is a file path, a url or a library reference, resolve it to its content (template variables are filled from the context parameter)
        $parsedsystem = Get-PromptContent -prompt $system -context $context
        $system = $parsedsystem.content
        $telemetries.Add("systemPromptType", $parsedsystem.type)
        $telemetries.Add("systemPromptLib", $parsedsystem.lib)

        # collect the telemetry data
        Submit-Telemetry -cmdletName $MyInvocation.MyCommand.Name -innovationName $MyInvocation.InvocationName -props $telemetries
    }
    PROCESS {

        Receive-Job -Name "check_openai_UpdateNotification" -ErrorAction SilentlyContinue

        Write-Verbose ($resources.verbose_chat_mode)

        # older versions of PowerShell don't support stream mode, and functions are not supported in stream mode
        $stream = $PSVersionTable['PSVersion'].Major -gt 5

        $index = 1

        $welcome = "`n{0}`n{1}" -f ($resources.welcome_chatgpt -f $(if ($azure) { " $($resources.azure_version) " } else { "" }), $model), $resources.shortcuts

        Write-Host $welcome -ForegroundColor Yellow
        Write-Host $system -ForegroundColor Cyan

        $messages = @()
        $systemPrompt = @(
            [PSCustomObject]@{
                role    = "system"
                content = $system
            }
        )
        $messages += $systemPrompt

        Write-Verbose "$($systemPrompt|ConvertTo-Json -Depth 10)"

        # interactive loop: each input is either a control command ('q'/'bye' to quit, 'save <name>' to persist the current settings as a profile, 'm' to open a multi-line input dialog, 'f' to load a prompt from a file) or a user message sent to the chat completion endpoint
        while ($true) {
            Write-Verbose ($resources.verbose_chat_let_chat)

            $current = $index++
            $prompt = Read-Host -Prompt "`n[$current] $($resources.prompt)"

            Write-Verbose ($resources.verbose_prompt_received -f $prompt)

            if ($prompt -in ("q", "bye")) {
                Write-Verbose ($resources.verbose_chat_q_message -f $prompt)
                break
            }

            if ($prompt.StartsWith("save")) {
                if ($prompt -match "^save\s+(\S+)((?:\s+/override))?$") {
                    $profileName = $matches[1]
                    $override = $null -ne $matches[2]

                    $profile_to_save = @{
                        name     = $profileName
                        api_key  = $api_key
                        model    = $model
                        endpoint = $endpoint
                        system   = $system
                    }

                    if ($config) { $profile_to_save.config = $config }
                    if ($headers) { $profile_to_save.headers = $headers }
                    if ($functions) { $profile_to_save.functions = $functions }

                    # check the profile file in the user profile directory; if it does not exist, create it
                    $profileFile = Join-Path $env:USERPROFILE ".openai-powershell/profile.json"

                    if (!(Test-Path $profileFile)) {
                        New-Item -Path $profileFile -ItemType File -Force | Out-Null
                        @{profiles = @($profile_to_save) } | ConvertTo-Json -Depth 10 | Set-Content -Path $profileFile -Encoding UTF8
                    }
                    else {
                        # load the profile file and check whether the profile name already exists. If it exists and /override is not specified, print an error message; if it exists and /override is specified, override the profile; otherwise add the profile to the profile file.
                        $existing_profiles = (Get-Content $profileFile -Raw -Encoding UTF8 | ConvertFrom-Json).profiles
                        $existing_profile = $existing_profiles | Where-Object { $_.name -eq $profileName }

                        if ($existing_profile) {
                            if ($override) {
                                # replace the existing profile with the new profile_to_save in the list before saving it back
                                $existing_profiles = @($existing_profiles | ForEach-Object { if ($_.name -eq $profileName) { $profile_to_save } else { $_ } })
                                @{profiles = @($existing_profiles) } | ConvertTo-Json -Depth 10 | Set-Content -Path $profileFile -Encoding UTF8
                                Write-Host "[$current] The profile '$profileName' is overridden successfully." -ForegroundColor Green
                            }
                            else {
                                Write-Host "[$current] The profile '$profileName' already exists, if you want to override it, please add the '/override' switch at the end of your command" -ForegroundColor Red
                            }
                        }
                        else {
                            $existing_profiles += $profile_to_save
                            @{profiles = @($existing_profiles) } | ConvertTo-Json -Depth 10 | Set-Content -Path $profileFile -Encoding UTF8
                            Write-Host "[$current] The profile '$profileName' is saved successfully." -ForegroundColor Green
                        }
                    }
                }
                else {
                    Write-Host "[$current] You want to save the profile, but the syntax is incorrect. Please try 'save your-profile-name'; if you want to override the existing profile, please add the '/override' switch at the end of your command" -ForegroundColor Red
                }
                continue
            }

            if ($prompt -eq "m") {
                $os = [System.Environment]::OSVersion.Platform
                if ($os -notin @([System.PlatformID]::Win32NT, [System.PlatformID]::Win32Windows, [System.PlatformID]::Win32S)) {
                    Write-Host ($resources.verbose_chat_m_message_not_supported)
                    continue
                }

                Write-Verbose ($resources.verbose_chat_m_message)
                $prompt = Read-MultiLineInputBoxDialog -Message $resources.multi_line_prompt -WindowTitle $resources.multi_line_prompt -DefaultText ""

                Write-Verbose ($resources.verbose_prompt_received -f $prompt)

                if ($null -eq $prompt) {
                    Write-Host $resources.cancel_button_message
                    continue
                }
                else {
                    Write-Host "$($resources.multi_line_message)`n$prompt"
                }
            }

            if ($prompt -eq "f") {
                $os = [System.Environment]::OSVersion.Platform
                if ($os -notin @([System.PlatformID]::Win32NT, [System.PlatformID]::Win32Windows, [System.PlatformID]::Win32S)) {
                    Write-Host ($resources.verbose_chat_f_message_not_supported)
                    continue
                }

                Write-Verbose ($resources.verbose_chat_f_message)
                $file = Read-OpenFileDialog -WindowTitle $resources.file_prompt

                Write-Verbose ($resources.verbose_chat_file_read -f $file)

                if (!($file)) {
                    Write-Host $resources.cancel_button_message
                    continue
                }
                else {
                    $prompt = Get-Content $file -Encoding utf8
                    Write-Host "$($resources.multi_line_message)`n$prompt"
                }
            }

            Write-Host -ForegroundColor ("blue", "red", "Green", "yellow", "gray", "black", "white" | Get-Random) ("`r$($resources.thinking) {0}" -f ("." * (Get-Random -Maximum 10 -Minimum 3))) -NoNewline

            $messages += [PSCustomObject]@{
                role    = "user"
                content = $prompt
            }

            Write-Verbose ($resources.verbose_prepare_messages -f ($messages | ConvertTo-Json -Depth 10))

            # keep the system prompt plus the last 9 messages to limit the context size
            if ($messages.Count -gt 10) {
                $messages = @($messages[0]) + $messages[-9..-1]
            }

            $body = @{model = "$model"; messages = $messages; stream = $stream }

            $params = @{
                Uri     = $endpoint
                Method  = "POST"
                Headers = $header
            }

            if ($json) {
                $body.Add("response_format" , @{type = "json_object" } )
            }

            if ($config) {
                Merge-Hashtable -table1 $body -table2 $config
            }

            $params.Body = ($body | ConvertTo-Json -Depth 10)

            Write-Verbose ($resources.verbose_prepare_params -f ($params | ConvertTo-Json -Depth 10))

            try {
                if ($stream) {
                    Write-Verbose ($resources.verbose_chat_stream_mode)

                    $callapi = Invoke-StreamWebRequest -uri $params.Uri -body $params.Body -header $header

                    # if the call failed, write the error message to the host and continue
                    if ($callapi.status -ne "ok") {
                        Write-Host "`r[$current] $($callapi.message)" -NoNewline -ForegroundColor Red
                        Write-Host ""
                        continue
                    }

                    # otherwise, get the reader of the result
                    $reader = $callapi.reader

                    # check whether the response contains tool_calls; if so, execute them before streaming the answer
                    $line = $reader.ReadLine()
                    $delta = ($line -replace "data: ", "" | ConvertFrom-Json).choices.delta

                    while ($delta -and ($null -eq $delta.content)) {
                        $tool_calls = @()

                        while ($true) {
                            if ($delta.tool_calls) {
                                $temp = $delta.tool_calls
                                if ($temp.id -and $temp.function) {
                                    $tool_calls += @([pscustomobject]@{
                                            index    = $temp.index
                                            id       = $temp.id
                                            type     = "function"
                                            function = @{
                                                name      = $temp.function.name
                                                arguments = $temp.function.arguments
                                            }
                                        })
                                }
                                elseif ($temp.function) {
                                    $tool_calls | Where-Object { $_.index -eq $temp.index } | ForEach-Object { $_.function.arguments += $temp.function.arguments }
                                }
                            }

                            $line = $reader.ReadLine()
                            if ($line -eq "data: [DONE]") { break }
                            $delta = ($line -replace "data: ", "" | ConvertFrom-Json).choices.delta
                        }

                        # execute the functions
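                        # each accumulated tool call is mapped to a PowerShell function of the same name, its JSON arguments are expanded into "-name value" parameters and invoked via Invoke-Expression, the output is appended to the conversation as a "tool" message, and the request is sent again so the model can continue with the function results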
                        $messages += [pscustomobject]@{
                            role       = "assistant"
                            content    = ""
                            tool_calls = @($tool_calls)
                        }

                        foreach ($tool in $tool_calls) {
                            Write-Host ("`r$($resources.function_call): $($tool.function.name)" + (" " * 50)) -NoNewline
                            $function_args = $tool.function.arguments | ConvertFrom-Json
                            $tool_response = Invoke-Expression ("{0} {1}" -f $tool.function.name, ( $function_args.PSObject.Properties | ForEach-Object { "-{0} {1}" -f $_.Name, $_.Value } ) -join " ")
                            $messages += @{
                                role         = "tool"
                                name         = $tool.function.name
                                tool_call_id = $tool.id
                                content      = $tool_response
                            }
                        }

                        $body.messages = $messages
                        $params.Body = ($body | ConvertTo-Json -Depth 10)

                        $callapi = Invoke-StreamWebRequest -uri $params.Uri -body $params.Body -header $header
                        if ($callapi.status -ne "ok") {
                            Write-Host "`r[$current] $($callapi.message)" -NoNewline -ForegroundColor Red
                            Write-Host ""
                            break
                        }

                        $reader = $callapi.reader
                        $line = $reader.ReadLine()
                        $delta = ($line -replace "data: ", "" | ConvertFrom-Json).choices.delta
                    }

                    # if the call failed inside the tool-call loop, skip to the next user prompt
                    if ($callapi.status -ne "ok") { continue }

                    Write-Host ("`r" + (" " * 50)) -ForegroundColor Green -NoNewline
                    Write-Host "`r[$current] " -NoNewline -ForegroundColor Red

                    $result = $delta.content
                    Write-Host $result -NoNewline -ForegroundColor Green

                    while ($true) {
                        $line = $reader.ReadLine()
                        if ($line -eq "data: [DONE]") { break }
                        $chunk = ($line -replace "data: ", "" | ConvertFrom-Json).choices.delta.content
                        Write-Host $chunk -NoNewline -ForegroundColor Green
                        $result += $chunk
                        Start-Sleep -Milliseconds 5
                    }

                    Write-Host ""

                    $messages += [PSCustomObject]@{
                        role    = "assistant"
                        content = $result
                    }

                    Write-Verbose ($resources.verbose_chat_message_combined -f ($messages | ConvertTo-Json -Depth 10))
                }
                else {
                    Write-Verbose ($resources.verbose_chat_not_stream_mode)

                    $response = Invoke-UniWebRequest $params

                    Write-Verbose ($resources.verbose_chat_response_received -f ($response | ConvertTo-Json -Depth 10))

                    # TODO #175 load the tools as external modules instead of calling them directly
                    while ($response.choices -and $response.choices[0].message.tool_calls) {
                        # add the assistant message
                        $this_message = $response.choices[0].message
                        # $body.messages += $this_message
                        $tool_calls = $this_message.tool_calls

                        $messages += [pscustomobject]@{
                            role       = "assistant"
                            content    = ""
                            tool_calls = @($tool_calls)
                        }

                        foreach ($tool in $tool_calls) {
                            Write-Host ("`r$($resources.function_call): $($tool.function.name)" + (" " * 50)) -NoNewline
                            $function_args = $tool.function.arguments | ConvertFrom-Json
                            $tool_response = Invoke-Expression ("{0} {1}" -f $tool.function.name, ( $function_args.PSObject.Properties | ForEach-Object { "-{0} {1}" -f $_.Name, $_.Value } ) -join " ")
                            $messages += @{
                                role         = "tool"
                                name         = $tool.function.name
                                tool_call_id = $tool.id
                                content      = $tool_response
                            }
                        }

                        $body.messages = $messages
                        $params.Body = ($body | ConvertTo-Json -Depth 10)

                        Write-Verbose $params.Body

                        $response = Invoke-UniWebRequest $params
                    }

                    $result = $response.choices[0].message.content

                    $messages += [PSCustomObject]@{
                        role    = "assistant"
                        content = $result
                    }

                    Write-Host ("`r" + (" " * 50)) -ForegroundColor Green -NoNewline
                    Write-Host "`r[$current] $result" -ForegroundColor Green

                    Write-Verbose ($resources.verbose_chat_message_combined -f ($messages | ConvertTo-Json -Depth 10))
                }
            }
            catch {
                Write-Error ($_.Exception.Message)
            }
        }
    }
}