
Inference on GPU

Open · Tahlor opened this issue 3 years ago · 0 comments

Is inference on a GPU feasible? If so, how much GPU memory would be needed? (16GB is apparently not enough)

Or is there a distilled or lighter-weight version?
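
For reference, a minimal sketch of how a GPT-J-scale checkpoint like this one could be loaded in half precision, which brings the weights down to roughly 12 GB and should fit on a 16 GB GPU. The model id `Cedille/fr-boris` is an assumption here; substitute the actual checkpoint name if it differs.

```python
# Hedged sketch: fp16 inference for a ~6B-parameter causal LM.
# In fp32 the weights alone need ~24 GB; fp16 halves that to ~12 GB,
# leaving headroom on a 16 GB card for activations and the KV cache.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Cedille/fr-boris"  # assumed Hugging Face model id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # load weights directly in half precision
    low_cpu_mem_usage=True,      # avoid a full fp32 copy in host RAM
).to("cuda")
model.eval()

inputs = tokenizer("Bonjour, je suis", return_tensors="pt").to("cuda")
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

If even fp16 does not fit, 8-bit loading via `bitsandbytes` (passing `load_in_8bit=True` to `from_pretrained`) is another commonly used option, at some cost in speed and accuracy.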

Tahlor · Jun 26 '22 05:06