500 error with smartpaste in Blazor sample project
After setting DeploymentName and ApiKey in RepoSharedConfig.json, I ran the ExampleBlazorApp project. The initial screen appears; I go to the Smart Paste page, copy the sample text, click the Smart Paste button, and get the error below.
If I stop and restart the app, the Smart ComboBox works as expected. But when I navigate to Smart TextArea and type in some text, I get the same 500 error as with Smart Paste.
@karenpayneoregon Can you check the console output for the server? It should give you the full server-side exception, which will very likely explain the underlying problem. If it's not obvious what's wrong from the exception message, please post the details of the exception here.
There's a good chance it's something to do with the endpoint/deployment/key config being incorrect but we can't be sure without seeing the exception message. Hope this helps!
[500] Error in SmartPasteButton:
@KodeKanna I'd like to help but unfortunately don't see any useful info in the screenshot. If you could post your config values (with the API key redacted, of course) it might clarify the problem.
@KodeKanna Can I check: are you using OpenAI or Azure OpenAI, or something else?
- If you're using OpenAI, please ensure you do not configure any Endpoint. Remove the Endpoint line from the config completely.
- If you're using Azure OpenAI, please ensure that you do configure a value for Endpoint, and it has to be of the form https://SOMETHING.openai.azure.com/, where SOMETHING is the value given to you when you set up Azure OpenAI in the portal.
My guess is perhaps you've set Endpoint to something else, for example the URL of your own web app, which would be incorrect.
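To make the two shapes concrete, a sketch of the relevant part of RepoSharedConfig.json might look like this (the key and endpoint values are placeholders, not real credentials):

```json
{
  "SmartComponents": {
    // OpenAI: omit Endpoint entirely and use an OpenAI model name
    "ApiKey": "sk-your-key-here",
    "DeploymentName": "gpt-3.5-turbo"

    // Azure OpenAI: additionally set the endpoint from the Azure portal, e.g.
    // "Endpoint": "https://SOMETHING.openai.azure.com/"
  }
}
```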
Remember to set your DeploymentName to the name of your model:
"SmartComponents": {
  "ApiKey": "xxxxxxxxxxxxxxxxxxxxxxxxxxx",
  "DeploymentName": "Name of your model (not gpt-3.5-turbo or something else)",
  "Endpoint": "https://your-service.openai.azure.com/"
}
@mortenfa Thanks for answering!
You're correct if they are using Azure OpenAI. However, if they are using OpenAI, the deployment name is simply the model name (e.g., gpt-3.5-turbo).
Here is the console output:
info: Microsoft.Hosting.Lifetime[14]
Now listening on: https://localhost:7117
info: Microsoft.Hosting.Lifetime[14]
Now listening on: http://localhost:5187
info: Microsoft.Hosting.Lifetime[0]
Application started. Press Ctrl+C to shut down.
info: Microsoft.Hosting.Lifetime[0]
Hosting environment: Development
info: Microsoft.Hosting.Lifetime[0]
Content root path: C:\OED\DotnetLand\VS2022\smartcomponents\samples\ExampleBlazorApp
fail: Microsoft.AspNetCore.Diagnostics.DeveloperExceptionPageMiddleware[1]
An unhandled exception has occurred while executing the request.
Azure.RequestFailedException: The model `eR123745` does not exist or you do not have access to it.
Status: 404 (Not Found)
ErrorCode: model_not_found
Content:
{
"error": {
"message": "The model `eR123745` does not exist or you do not have access to it.",
"type": "invalid_request_error",
"param": null,
"code": "model_not_found"
}
}
Headers:
Date: Fri, 22 Mar 2024 15:12:43 GMT
Connection: keep-alive
Vary: REDACTED
X-Request-ID: REDACTED
Strict-Transport-Security: REDACTED
CF-Cache-Status: REDACTED
Set-Cookie: REDACTED
Server: cloudflare
CF-RAY: REDACTED
Alt-Svc: REDACTED
Content-Type: application/json; charset=utf-8
Content-Length: 214
at Azure.Core.HttpPipelineExtensions.ProcessMessageAsync(HttpPipeline pipeline, HttpMessage message, RequestContext requestContext, CancellationToken cancellationToken)
at Azure.AI.OpenAI.OpenAIClient.GetChatCompletionsAsync(ChatCompletionsOptions chatCompletionsOptions, CancellationToken cancellationToken)
at SmartComponents.Inference.OpenAI.OpenAIInferenceBackend.GetChatResponseAsync(ChatParameters options)
at SmartComponents.Inference.SmartPasteInference.GetFormCompletionsAsync(IInferenceBackend inferenceBackend, SmartPasteRequestData requestData)
at Microsoft.AspNetCore.Builder.SmartComponentsServiceCollectionExtensions.AttachSmartComponentsEndpointsStartupFilter.<>c.<<Configure>b__0_2>d.MoveNext()
--- End of stack trace from previous location ---
at Microsoft.AspNetCore.Http.RequestDelegateFactory.ExecuteTaskResult[T](Task`1 task, HttpContext httpContext)
at Microsoft.AspNetCore.Authorization.AuthorizationMiddleware.Invoke(HttpContext context)
at Microsoft.AspNetCore.Diagnostics.DeveloperExceptionPageMiddlewareImpl.Invoke(HttpContext context)
Note I did not set Endpoint as I did not know what to put there.
{
"SmartComponents": {
// TODO: Add values here to run samples in this repo
// Do not commit your changes to source control
"DeploymentName": "eR123745",
//"Endpoint": "",
"ApiKey": "sk-redacted"
}
}
Note I did not set Endpoint as I did not know what to put there.
That means you must be using OpenAI, and not Azure OpenAI, right?
If that's the case, then the DeploymentName must be the name of an OpenAI-defined model, such as gpt-3.5-turbo. The value eR123745 can't be correct unless you're using Azure OpenAI, in which case you should have an endpoint given to you by the Azure Portal.
All I did was clone the repository and open an account at https://platform.openai.com/. I thought I had gathered the right information from the sub-pages, but it seems I did not.
My model name came from https://platform.openai.com/api-keys
My deployment name came from https://platform.openai.com/account/organization
The above information was entered into the config file.
As for the Azure portal, I do not have a login.
Please try changing your DeploymentName config value to one of the OpenAI-supported models. For example, set its value to gpt-3.5-turbo instead of eR123745.
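Based on the config you posted, the corrected RepoSharedConfig.json would presumably look something like this (API key redacted, Endpoint left commented out since you're on OpenAI rather than Azure OpenAI):

```json
{
  "SmartComponents": {
    // For OpenAI, DeploymentName must be an OpenAI model name
    "DeploymentName": "gpt-3.5-turbo",
    //"Endpoint": "",
    "ApiKey": "sk-redacted"
  }
}
```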
Changing the DeploymentName worked. I'd also recommend making the rate limit more generous, as I was getting a rate-limit-exceeded error after three partial tries on the smart combobox. That would let me and others evaluate the components better and provide feedback.
Thanks for letting us know. I'm confused about the rate limit comment though, as there is no rate limit on the smart combobox.