Lou Bernardi
Please correct me if I'm misunderstanding, but FastLLaMA looks less like a wrapper and more like a fork of LLaMA.cpp, following [pull #370](https://github.com/ggerganov/llama.cpp/pull/370). If that's in fact the case: everyone...
@PotatoSpudowski just for your information, ooba eventually moved forward with llama-cpp-python. Ooba, this issue might be worth closing?
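For anyone landing here later, a minimal sketch of what loading a model through llama-cpp-python typically looks like (the model path below is a placeholder, not one from this thread):

```python
# Minimal llama-cpp-python usage sketch; the model path is a placeholder.
from llama_cpp import Llama

# Load a quantized model from disk (hypothetical file name).
llm = Llama(model_path="./models/llama-7b.q4_0.bin")

# Run a simple completion and print the generated text.
output = llm("Q: Name the planets in the solar system. A:", max_tokens=64, stop=["Q:"])
print(output["choices"][0]["text"])
```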
That's awesome! Thanks USB. I think you'll need to check out PR 529 as Ooba mentioned, since the safetensors support hasn't been merged yet.
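For anyone unfamiliar with the format, loading a .safetensors checkpoint generally looks like the sketch below. This is just the generic safetensors API with a placeholder filename, not the actual code in PR 529:

```python
# Generic safetensors loading sketch; the filename is a placeholder.
from safetensors.torch import load_file

# load_file returns a plain dict mapping tensor names to torch.Tensor objects.
state_dict = load_file("model.safetensors", device="cpu")

# The resulting dict can then be passed to a model's load_state_dict as usual.
# model.load_state_dict(state_dict)
print(f"Loaded {len(state_dict)} tensors")
```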
Good idea. I had been thinking of adding a button to the UI that saves the current preset values under a given name, roughly along the lines of the sketch below. I'll work on it next week and create a PR.
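A rough sketch of what I have in mind for the save logic; the function name, directory, and parameter set here are hypothetical and only illustrate the idea:

```python
# Hypothetical sketch: save the current generation parameters as a named preset.
from pathlib import Path

def save_preset(name: str, params: dict, presets_dir: str = "presets") -> Path:
    """Write the given parameters to presets/<name>.txt as key=value lines."""
    path = Path(presets_dir) / f"{name}.txt"
    path.parent.mkdir(parents=True, exist_ok=True)
    lines = [f"{key}={value}" for key, value in params.items()]
    path.write_text("\n".join(lines) + "\n")
    return path

# Example: save a preset named "my-preset" with a few sampling parameters.
save_preset("my-preset", {"temperature": 0.7, "top_p": 0.9, "repetition_penalty": 1.15})
```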
> Looks suspicious to me.
>
> no safetensors
> broken twitter URL
> no README

Can we close this issue, @oobabooga? For one, if this is just...
I'm not sure why I was tagged in this at all. That said, are you sure you're using the most recent commit? I don't have any experience with Docker, but...
Looks like there's a draft PR for this: https://github.com/oobabooga/text-generation-webui/pull/447
You might want to add the "draft" label if you don't mean for this to be merged right away.
I'm experiencing the same bug. Thanks for such a great write-up.
Hi everyone, I cannot provide any test data as I'm unsure what exactly is causing it, but I'm wondering if my project is suffering from this issue as well. For...