
Request for T5 GPTQ model support.

Open · sigmareaver opened this issue 2 years ago · 1 comment

I attempted to load flan-ul2 as a 4-bit 128g GPTQ model, but it looks like T5ForConditionalGeneration isn't supported, or perhaps encoder/decoder-type LLMs in general. In addition, https://github.com/qwopqwop200/transformers-t5 would also likely be needed to provide support for quantized T5.
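For context, a sketch of why this class of model needs a separate path: a loader typically dispatches on the architecture string from the model config, and an encoder-decoder architecture like T5ForConditionalGeneration falls outside the causal-LM branch. The function and set names below are hypothetical illustrations, not KoboldAI's actual code; only the architecture class names mirror real Hugging Face identifiers.

```python
# Hypothetical sketch of the kind of architecture dispatch a model loader
# might perform. These sets are illustrative, not KoboldAI's real lists.
SUPPORTED_DECODER_ONLY = {"GPTNeoForCausalLM", "LlamaForCausalLM", "OPTForCausalLM"}
ENCODER_DECODER = {"T5ForConditionalGeneration"}

def pick_loader(architecture: str) -> str:
    """Return which load path an architecture string would take."""
    if architecture in SUPPORTED_DECODER_ONLY:
        return "causal-lm path"
    if architecture in ENCODER_DECODER:
        # This branch is what the feature request asks for: a seq2seq
        # load path that also understands GPTQ-quantized weights.
        return "seq2seq path (not yet implemented for GPTQ)"
    raise ValueError(f"unsupported architecture: {architecture}")

print(pick_loader("T5ForConditionalGeneration"))
# → seq2seq path (not yet implemented for GPTQ)
```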

sigmareaver · Jun 25 '23 21:06

If possible, I'd rather open a pull request than burden you with a feature request. Since I'm not familiar with KoboldAI's codebase, if you could point me to the files/functions I should look at to add support for a new model type, that should be enough to get me started.

sigmareaver · Jun 28 '23 19:06