suraj-gade
Hi, I have the same observation. The fastchat-t5-3b model in the Arena gives much better responses compared to when I query the downloaded fastchat-t5-3b model. Please let us know,...
@merrymercy, @DachengLi1 Here are some prompts and their responses generated through Chat Arena and offline inference. **Prompt 1:** ``` """ Given the context information below: --- prescription / certificate)...
> @suraj-gade Thanks! Can you provide a test script in the offline setting like @cvarrichio did? Did you manually provide prompts through tokenizers?

Hi @DachengLi1, thanks for the response. Here...
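For reference, a minimal offline test script along these lines might look like the sketch below. It assumes the Hugging Face checkpoint `lmsys/fastchat-t5-3b-v1.0`, plain `transformers` generation, and Arena-like sampling settings; the placeholder prompt and the sampling parameters are assumptions, and the conversation template the Arena applies before generation is not reproduced here.

```python
# Minimal offline inference sketch for fastchat-t5-3b (settings below are assumptions).
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "lmsys/fastchat-t5-3b-v1.0"  # assumed Hugging Face checkpoint

# Slow tokenizer used here; the fast tokenizer has been reported to handle
# fastchat-t5's added whitespace tokens differently.
tokenizer = AutoTokenizer.from_pretrained(model_name, use_fast=False)
model = AutoModelForSeq2SeqLM.from_pretrained(
    model_name, torch_dtype=torch.float16
).to("cuda")

prompt = "Given the context information below: ..."  # placeholder prompt

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.inference_mode():
    output_ids = model.generate(
        **inputs,
        do_sample=True,
        temperature=0.7,      # assumed Arena-like sampling settings
        max_new_tokens=512,
    )
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

If the Arena wraps prompts in FastChat's conversation template before generation, feeding the raw prompt as above would be one plausible source of the quality gap between the two settings.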