Trang Le

77 comments by Trang Le

Wow June this is great! Thank you for this organization! I think you got everything.

Apologies for the delayed response: I'm on vacation until Feb 10. Thanks for submitting the issue @raffaem. You can submit a ticket to the OpenAlex team at https://openalex.org/feedback. I agree...

Hi @wch, any chance you can add `proxy` and `deployment_name` as arguments to `chat_server`? There have been some issues regarding discrepancies between model and engine/deployment names on the Azure endpoint....

Thank you for the quick response! I installed the new chatstream version from the PR, added some openai configs (below) to my `app.py`, and used `endpoint_type = "azure"` in `chat_server`,...
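For context, the actual configs in that `app.py` are elided above, but an Azure OpenAI setup with the pre-1.0 `openai` Python SDK (the generation `chatstream` targeted at the time) typically looks like the sketch below. The endpoint URL, API version, and environment-variable name are placeholders, not the values from the original comment:

```python
import os
import openai

# Azure OpenAI settings for the legacy (pre-1.0) openai SDK.
# All values below are placeholders, not the originals from the thread.
openai.api_type = "azure"
openai.api_base = "https://<your-resource>.openai.azure.com/"  # resource endpoint
openai.api_version = "2023-05-15"  # an API version your resource supports
openai.api_key = os.environ["AZURE_OPENAI_API_KEY"]
```

This is a configuration fragment only; it sets module-level state and makes no API call on its own.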

Actually, no. Azure is quite particular about its engine/deployment name versus the model name. See https://github.com/hwchase17/langchain/issues/1577#issuecomment-1479489910 and, for an example, https://github.com/hwchase17/langchain/issues/3251#issuecomment-1566531223
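The discrepancy referenced above comes down to how requests are addressed: the public OpenAI API selects a model with a `model` parameter (e.g. `"gpt-3.5-turbo"`), while Azure routes requests by the user-chosen *deployment* name, passed as `engine` in the pre-1.0 `openai` SDK. A minimal sketch of that split, with a hypothetical helper name and placeholder deployment name:

```python
from typing import Optional


def completion_kwargs(
    endpoint_type: str, model: str, deployment: Optional[str] = None
) -> dict:
    """Build the routing kwargs for a legacy openai ChatCompletion call.

    Public OpenAI endpoints take `model`; Azure endpoints instead take
    the deployment name as `engine`. This helper and its argument names
    are illustrative, not part of chatstream or the openai SDK.
    """
    if endpoint_type == "azure":
        if deployment is None:
            raise ValueError("Azure endpoints require a deployment name")
        return {"engine": deployment}  # Azure routes by deployment, not model
    return {"model": model}  # public OpenAI routes by model name
```

So the same logical model can need two different identifiers depending on the endpoint, which is why a `deployment_name`-style argument was requested in the first place.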

Thanks, Winston! So this time, instead of erroring out immediately, the app just stalls indefinitely. And when I press Ctrl-C, I see: > Task exception was never retrieved future:...

Hmm, so I installed the new changes, but the app hangs after I enter a prompt. > openai.error.APIConnectionError: Error communicating with OpenAI Full message: Task exception was never retrieved future:...

@nsvbhat Did you install the chatstream version on that PR? ``` pip install chatstream@git+https://github.com/wch/chatstream.git@azure ```

@nsvbhat oh I think `azure_deployment_id` is the argument you want, not `deployment_id`.
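Putting the pieces of this thread together, the suggested fix might look roughly like the app fragment below. Only `endpoint_type="azure"` and `azure_deployment_id` come from the comments above; the module id, deployment name, and overall layout are placeholders, and the exact `chat_server` signature should be checked against the chatstream branch being installed:

```python
# Hypothetical app.py fragment, assuming the azure branch of chatstream
# installed earlier in the thread. Argument names beyond those quoted in
# the comments are unverified placeholders.
import chatstream
from shiny import App, ui

app_ui = ui.page_fixed(chatstream.chat_ui("mychat"))

def server(input, output, session):
    chatstream.chat_server(
        "mychat",
        endpoint_type="azure",                # from earlier in the thread
        azure_deployment_id="my-deployment",  # the argument named above; placeholder value
    )

app = App(app_ui, server)
```

This is an app configuration fragment; it requires `shiny` and the PR build of `chatstream` to actually run.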

> @trangdata I see some references to a proxy in the stack trace, and I know you mentioned something about a proxy earlier. I don't see anything in the Azure...