codellama
What are the GPU requirements for running the Code Llama models?
There is no information about the prerequisites, i.e. which GPU and how much memory are required to run the models during inference.
Please help.
Can somebody help with this? I am getting the same error!
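In the absence of official numbers, a common back-of-envelope estimate is parameter count × bytes per parameter for the weights, plus some headroom for the KV cache and activations. Below is a minimal Python sketch of that arithmetic; the parameter counts and the 20% overhead factor are assumptions for illustration, not official figures.

```python
# Rough VRAM estimate for inference: weights footprint (params x bytes/param)
# plus an assumed cushion for KV cache and activations.
# Parameter counts and the 20% overhead factor are assumptions.

CODE_LLAMA_PARAMS = {
    "7B": 7e9,
    "13B": 13e9,
    "34B": 34e9,
}

BYTES_PER_PARAM = {
    "fp16/bf16": 2.0,
    "int8": 1.0,
    "int4": 0.5,
}

def estimate_vram_gb(num_params: float, bytes_per_param: float,
                     overhead: float = 0.2) -> float:
    """Return an approximate VRAM requirement in GiB for inference."""
    weights_gib = num_params * bytes_per_param / 1024**3
    return weights_gib * (1 + overhead)

if __name__ == "__main__":
    for size, n_params in CODE_LLAMA_PARAMS.items():
        for dtype, nbytes in BYTES_PER_PARAM.items():
            print(f"{size} @ {dtype}: ~{estimate_vram_gb(n_params, nbytes):.1f} GiB")
```

By this estimate, the 7B model at fp16 needs roughly 16 GiB of GPU memory, while 8-bit or 4-bit quantization brings it closer to consumer-card territory; actual usage also depends on context length and batch size.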