Ejafa
Use my LLaMA weights, Ejafa/llama_7B; I recovered them successfully.
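For anyone who wants to try those weights, here is a minimal loading sketch (my assumption, not from the comment above: the repo holds HF-format weights, and you have `transformers` and `sentencepiece` installed):

```python
# Minimal sketch: load the Ejafa/llama_7B repo from the Hugging Face Hub.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Ejafa/llama_7B")
model = AutoModelForCausalLM.from_pretrained(
    "Ejafa/llama_7B",
    torch_dtype=torch.float16,  # half precision so a 7B model fits on a ~16 GB GPU
)
```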
I think this is a problem with PyTorch. I solved it by installing PyTorch built for CUDA 11.8 (not 12). If you have CUDA 12, the flash-attention package will fail to install.
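As an illustration of that fix (not part of the original comment): install a cu118 PyTorch wheel first, e.g. `pip install torch --index-url https://download.pytorch.org/whl/cu118`, then verify the build before installing flash-attn:

```python
# Sanity check: flash-attn is compiled against the CUDA version that
# PyTorch was built with, so confirm the wheel reports 11.8 first.
import torch

print(torch.__version__)   # e.g. "2.0.1+cu118" for a CUDA 11.8 build
print(torch.version.cuda)  # expect "11.8"
assert torch.version.cuda and torch.version.cuda.startswith("11.8"), \
    "Install a cu118 PyTorch wheel before building flash-attn"
```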
I am working on fine-tuning. I will keep you updated.
The model is not in HF (Hugging Face) format.
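If you are stuck at that step, here is a hypothetical conversion sketch using the conversion script that ships with `transformers` (the input/output paths below are placeholders you must replace):

```python
# Sketch: convert an original LLaMA checkpoint layout into HF format by
# invoking the script bundled with the transformers package.
import subprocess
import sys

subprocess.run(
    [
        sys.executable, "-m",
        "transformers.models.llama.convert_llama_weights_to_hf",
        "--input_dir", "/path/to/original/llama",  # placeholder path
        "--model_size", "7B",
        "--output_dir", "/path/to/llama-7b-hf",    # placeholder path
    ],
    check=True,
)
```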
@domeccleston It's a new trend now: download two or three synthetic datasets + train on LLaMA = DIY ChadGPT.
@domeccleston Let me share some of the resources from my thesis: https://vicuna.lmsys.org/, https://github.com/nomic-ai/gpt4all, https://crfm.stanford.edu/2023/03/13/alpaca.html. We are all trying to test this research and evaluate it.