lix2k3
Is open-interpreter integration on the roadmap?
Can confirm this continues to happen; it keeps resending the message endlessly.
This is exactly my experience: significantly better responses on http://chat.lmsys.com/ compared to the model downloaded from huggingface.com with the exact same prompt. It would be great to understand what the...
I looked at the GitHub repo and got no insight into how to format the prompts so that the output from the offline model and the FastChat website are similar...
The same problem is observed with the CLI. The output is somewhat better than what I got from the offline model in Python, but there is still some repetition that the...
Thank you, Jason. I think that is exactly the code we were provided before, which I had trouble translating into practice. Translating it, does it mean the following should be...
Thanks for your response, Jason. I believe everyone who has posted examples so far and is having trouble is not using the GitHub code provided. Rather, we are manually trying...
> @lix2k3 If you type in "hi", the whole prompt t5 sees is:
>
> A chat between a curious human and an artificial intelligence assistant. The assistant gives helpful,...
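For anyone trying to reproduce that prompt outside the web UI, here is a minimal sketch using FastChat's conversation templates. The template name "vicuna_v1.1" is an assumption; pick whichever entry in fastchat/conversation.py matches your weights, since the wording of the system line quoted above differs between model versions.

```python
from fastchat.conversation import get_conv_template

# Template name is an assumption; check fastchat/conversation.py for the one
# that matches your downloaded weights.
conv = get_conv_template("vicuna_v1.1")
conv.append_message(conv.roles[0], "hi")   # the user's turn
conv.append_message(conv.roles[1], None)   # empty assistant turn to close the prompt
prompt = conv.get_prompt()
print(prompt)  # this full string is what the model actually sees
```

Feeding this exact string to the downloaded weights should close most of the gap with the web demo, since it includes the system line and role separators that a bare "hi" lacks.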
Thank you all for your help. I can confirm that with identical prompts now, given what Dacheng Li posted, the responses are very different. Perhaps they are even...
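For completeness, here is a sketch of running the downloaded weights on that same template-built prompt with plain transformers. The model ID and generation settings below are assumptions for illustration, not something the thread specifies.

```python
from fastchat.conversation import get_conv_template
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

# Assumed checkpoint; substitute whichever model you actually downloaded.
model_id = "lmsys/fastchat-t5-3b-v1.0"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Rebuild the prompt with the same conversation template as in the sketch above.
conv = get_conv_template("vicuna_v1.1")
conv.append_message(conv.roles[0], "hi")
conv.append_message(conv.roles[1], None)
prompt = conv.get_prompt()

# Generation settings are assumptions; tune them to match the web demo's behavior.
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```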