Pinging @jdubois for review after changes
@langchain4j do you need to approve as well, or can you merge it?
Initially, I was as puzzled as you. Everything seemed in order at first glance, with the sysout prints working as expected. However, on closer inspection it became evident that the response was not...
> @emivoz I have just tried your app and [Streaming Chat](http://localhost:8080/streaming) works for me as expected: tokens are appearing one by one. Does it work differently for you? Did you...
The exception will disappear as soon as you set a tokenizer when building AzureOpenAiStreamingChatModel: `AzureOpenAiStreamingChatModel.builder().endpoint(...).apiKey(...).tokenizer(new OpenAiTokenizer(OpenAiChatModelName.GPT_3_5_TURBO.toString())).deploymentName(...).build();` I also tested using OpenAIAsyncClient and OpenAIClient directly to check whether it...
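For reference, here is the same builder call expanded into a small, self-contained sketch. The class name, placeholder endpoint/key/deployment values, and the import paths (assuming the langchain4j Azure OpenAI module) are mine, not from the original snippet:

```java
import dev.langchain4j.model.azure.AzureOpenAiStreamingChatModel;
import dev.langchain4j.model.openai.OpenAiChatModelName;
import dev.langchain4j.model.openai.OpenAiTokenizer;

public class StreamingModelConfig {

    // Builds the streaming model with an explicit tokenizer, which is what
    // makes the exception go away. Endpoint, API key and deployment name
    // below are placeholders; substitute your own Azure OpenAI values.
    static AzureOpenAiStreamingChatModel streamingChatModel() {
        return AzureOpenAiStreamingChatModel.builder()
                .endpoint("https://<your-resource>.openai.azure.com/")
                .apiKey(System.getenv("AZURE_OPENAI_API_KEY"))
                .deploymentName("<your-deployment>")
                .tokenizer(new OpenAiTokenizer(OpenAiChatModelName.GPT_3_5_TURBO.toString()))
                .build();
    }
}
```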
> @emivoz ok, got it, I will check it in the morning.

Great :) If you make use of the refactored version of mine in that repo, you will see that...
Pinging @langchain4j to review the changes :)
Great @langchain4j, everything should be fixed now ;)
Hey @Paandaman, did you find any solution for this?
This one is still open, since it was planned for Java EE 8. Any plans or updates on this?