Aron Semle
Ran into this issue as well. We upgraded from V1 to V2 and this broke for a few customers using the props to control the proxy. The workaround is...
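(The rest of that comment is cut off above, so the following is purely an illustration and not necessarily the workaround being referred to. One generic fallback in any Java process is the standard JVM proxy system properties; whether the V2 client honors them should be verified for your SDK version, and the host/port values here are placeholders.)

```java
// Hypothetical sketch: route outbound traffic through a proxy using the
// standard JVM proxy system properties. Not confirmed as the V2 workaround.
public class ProxyPropsSketch {
    public static void main(String[] args) {
        System.setProperty("https.proxyHost", "proxy.example.com");   // placeholder
        System.setProperty("https.proxyPort", "8080");                // placeholder
        // Hosts that should bypass the proxy.
        System.setProperty("http.nonProxyHosts", "localhost|*.internal.example.com");
        // ... build the ingest client after the properties are in place ...
    }
}
```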
+1 to this. This relates to my request https://github.com/snowflakedb/snowflake-ingest-java/issues/802. I want control over when the commit occurs. As it is, the sending thread has a 1-second max delay, which seems...
Thanks @sfc-gh-tzhang, I was able to implement this. That said, I think exposing a way to say "send now" OR a way to remove/control the 1-second thread that sends...
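(In case it helps anyone else hitting the 1-second flush cadence: the only knob I'm aware of is the client lag parameter passed in the client Properties, which as far as I know only lets you raise the lag rather than force an immediate send, which is exactly why a "send now" hook would still be useful. Rough sketch below; the parameter name MAX_CLIENT_LAG, its value format, and the connection property keys are from memory and should be checked against your SDK version.)

```java
import java.util.Properties;

import net.snowflake.ingest.streaming.SnowflakeStreamingIngestClient;
import net.snowflake.ingest.streaming.SnowflakeStreamingIngestClientFactory;

public class FlushLagSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("url", "https://<account>.snowflakecomputing.com"); // placeholder
        props.put("user", "<user>");                                  // placeholder
        props.put("private_key", "<private-key>");                    // placeholder

        // Assumed parameter: how long the background thread may buffer rows
        // before sending. Key case and value format may differ by SDK version.
        props.put("MAX_CLIENT_LAG", "5 seconds");

        SnowflakeStreamingIngestClient client =
                SnowflakeStreamingIngestClientFactory.builder("MY_CLIENT")
                        .setProperties(props)
                        .build();
        // ... open channels and insert rows as usual ...
    }
}
```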
I found this, but it looks like it's no faster than 1 second? Is that right?
I'm having an issue here as well, I think. I'm using LM Studio locally and enabled "OpenAI" mode on a Llama model. Using the OpenAI SDK directly I can set...
@dliubarskyi disregard. The difference between using the OpenAI SDK directly vs. langchain4j is the payload. I think the change made here https://github.com/ai-for-java/openai4j/pull/47 forces the payload sent to LM Studio to include...
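(For anyone else trying to compare the two payloads: langchain4j can log the raw request it sends, which is how I spotted the difference. Sketch below; the builder method names and the chat call are from memory and may differ by langchain4j version, the LM Studio port 1234 is its usual default but check your setup, and you need an SLF4J binding on the classpath for the logs to show up.)

```java
import dev.langchain4j.model.openai.OpenAiChatModel;

public class LmStudioPayloadDebug {
    public static void main(String[] args) {
        // Point the OpenAI-compatible client at the local LM Studio server.
        OpenAiChatModel model = OpenAiChatModel.builder()
                .baseUrl("http://localhost:1234/v1") // assumed LM Studio endpoint
                .apiKey("not-needed")                // LM Studio ignores the key
                .modelName("local-model")            // placeholder model name
                .logRequests(true)                   // dump the outgoing JSON payload
                .logResponses(true)
                .build();

        // Method name may be chat(...) on newer langchain4j versions.
        System.out.println(model.generate("Hello"));
    }
}
```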
For a little more background, we're on a local network pushing to Snowflake. In the case of an error (cable pull or other) we generally disconnect and tear down all...
I think the bug is in SnowflakeStreamingIngestClientInternal.close(). It calls this.flush(true).get(), which appears to never return. It seems like this should be this.flush(true).get(10, TimeUnit.SECONDS), or something with a reasonable wait time?
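(As a stopgap on the caller side, not a fix for the SDK itself, this is roughly what we do during teardown so a hung flush can't wedge the shutdown forever. It assumes the client is closed via its AutoCloseable close(); the 10-second budget is arbitrary, and note the background thread may still linger after the timeout.)

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

import net.snowflake.ingest.streaming.SnowflakeStreamingIngestClient;

public class BoundedClose {
    // Attempt a clean close, but give up after a fixed budget instead of blocking forever.
    static void closeWithTimeout(SnowflakeStreamingIngestClient client, long seconds) {
        CompletableFuture<Void> closing = CompletableFuture.runAsync(() -> {
            try {
                client.close(); // flushes internally; can hang if the network is gone
            } catch (Exception e) {
                throw new RuntimeException(e);
            }
        });
        try {
            closing.get(seconds, TimeUnit.SECONDS);
        } catch (TimeoutException e) {
            // close() never returned: log it and carry on with the rest of teardown.
            System.err.println("Ingest client close timed out after " + seconds + "s");
        } catch (InterruptedException | ExecutionException e) {
            System.err.println("Ingest client close failed: " + e);
        }
    }
}
```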