ganyk
@zeyugao Hi, I am trying to use your PR to run inference for LLaMA-7B and 65B. It worked well for LLaMA-7B, but when I used LLaMA-65B I got this error...
@zeyugao Thanks for the repack script. I have successfully run inference for LLaMA-65B after repacking the checkpoint files.
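The repack script itself isn't shown in this thread, but for reference, here is a minimal sketch of what such a repack step might look like, assuming it merges the model-parallel shards (`consolidated.*.pth`) of LLaMA-65B along the dims used by the original ParallelEmbedding / ColumnParallelLinear / RowParallelLinear layout. The function names and output path below are illustrative, not zeyugao's actual code.

```python
# Hypothetical sketch: merge LLaMA model-parallel shards into one checkpoint.
# Not the actual repack script from the PR; sharding dims follow the
# original LLaMA layer layout (assumption).
import glob
from typing import Optional

import torch


def shard_dim(name: str) -> Optional[int]:
    """Return the dim a parameter is split on across MP ranks, or None if replicated."""
    if name.endswith("tok_embeddings.weight"):
        return 1  # ParallelEmbedding splits the embedding dim
    if any(k in name for k in ("wq.weight", "wk.weight", "wv.weight",
                               "w1.weight", "w3.weight", "output.weight")):
        return 0  # ColumnParallelLinear splits output features
    if any(k in name for k in ("wo.weight", "w2.weight")):
        return 1  # RowParallelLinear splits input features
    return None   # norms, rope.freqs, etc. are replicated across ranks


def merge_shards(ckpt_dir: str, out_path: str) -> None:
    """Load all consolidated.*.pth shards and concatenate them into one state dict."""
    shards = [torch.load(p, map_location="cpu")
              for p in sorted(glob.glob(f"{ckpt_dir}/consolidated.*.pth"))]
    merged = {}
    for name in shards[0]:
        dim = shard_dim(name)
        if dim is None:
            merged[name] = shards[0][name]
        else:
            merged[name] = torch.cat([s[name] for s in shards], dim=dim)
    torch.save(merged, out_path)


if __name__ == "__main__":
    # Example: LLaMA-65B ships as 8 shards; merge them into a single file
    # (paths are placeholders).
    merge_shards("llama-65B", "llama-65B/consolidated.merged.pth")
```

A real repack would typically also re-split the merged weights to match the target tensor-parallel size, but the merge direction above shows the core idea.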
@xs1997zju Same issue here. @zeyugao, any ideas about what might be causing these differences?
Hi, can you provide more specific information about the problems you encountered?