How about adding local kernel loading to `transformers.KernelConfig()`
Feature request
As title.
Motivation
Currently, `KernelConfig()` builds its kernel mapping through the `LayerRepository` class provided by huggingface/kernels, which downloads and loads kernels from the Hub. Adding the ability to load kernels locally would be very helpful for debugging.
Your contribution
huggingface/kernels already has a `LocalLayerRepository` built in. Maybe we should consider adding support for it to `KernelConfig()`.
cc @MekkCyber
Hi @zheliuyu! Yes, it totally makes sense. We can have a `use_local_kernels` flag inside `KernelConfig` to do that. Do you want to open a PR for it?
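A minimal sketch of how such a flag might work. The class and constructor arguments below are illustrative stand-ins, not the actual huggingface/kernels API, and `resolve_repo` is a hypothetical helper:

```python
from dataclasses import dataclass, field

# Illustrative stand-ins for the repository types from huggingface/kernels;
# the real constructors may take different arguments.
@dataclass
class LayerRepository:
    repo_id: str  # Hub repo to download the kernel from

@dataclass
class LocalLayerRepository:
    repo_path: str  # local checkout of the kernel, for debugging

@dataclass
class KernelConfig:
    kernel_mapping: dict = field(default_factory=dict)
    use_local_kernels: bool = False  # the proposed flag

    def resolve_repo(self, location: str):
        # When the flag is set, load from a local path instead of the Hub.
        if self.use_local_kernels:
            return LocalLayerRepository(repo_path=location)
        return LayerRepository(repo_id=location)

config = KernelConfig(use_local_kernels=True)
repo = config.resolve_repo("./my-kernel-checkout")
print(type(repo).__name__)  # LocalLayerRepository
```

The idea is simply that the flag switches which repository type backs the kernel mapping, so existing Hub-based configs keep working unchanged.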
Of course. Let's get started.
WIP on https://github.com/zheliuyu/transformers-kernels
https://github.com/huggingface/transformers/pull/42800 has been merged. Documentation will be updated accordingly, so this issue can be closed.
Thanks to everyone who followed this issue.❤