Varun Mathur
@hsuyab Can you share the outline of the classes that would need to change to implement this functionality? I think asking the contributors would be a better choice.
Is it possible to implement regression from a specific class of huggingface transformers? What would the outline of the classes to change look like for implementing this as a PR?
@hsuyab were you able to open a PR to add this functionality?
@hsuyab Do you think it is necessary to implement this functionality for now? I would really appreciate your comments on the classes that would need to be implemented for this.
@TerryCE Can I work on this PR?
@khaled-wsa plotly == 5.12 works.
@TheFaheem I agree with you; I ran into the same issue.
@TheFaheem Thanks, I got it to work in a Kaggle notebook.
I have followed the code given in the Hugging Face docs:
```python
device_map = {
    "transformer.word_embeddings": 0,
    "transformer.word_embeddings_layernorm": 0,
    "lm_head": "cpu",
    "transformer.h": 0,
    "transformer.ln_f": 0,
}
quantization_config = BitsAndBytesConfig(llm_int8_enable_fp32_cpu_offload=True)
model = ...
```
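For context, the truncated snippet from the docs presumably continues with a `from_pretrained` call. A minimal sketch of how this device map is typically used; the model id and the exact call are assumptions, not taken from the original comment:

```python
# Device map in the style of the Hugging Face docs: keep most transformer
# modules on GPU 0 and offload the lm_head to CPU.
device_map = {
    "transformer.word_embeddings": 0,
    "transformer.word_embeddings_layernorm": 0,
    "lm_head": "cpu",
    "transformer.h": 0,
    "transformer.ln_f": 0,
}

# Assumed continuation (requires `transformers`, `accelerate`, `bitsandbytes`
# and a GPU, so it is left commented here):
# from transformers import AutoModelForCausalLM, BitsAndBytesConfig
# quantization_config = BitsAndBytesConfig(llm_int8_enable_fp32_cpu_offload=True)
# model = AutoModelForCausalLM.from_pretrained(
#     "bigscience/bloom-1b7",  # model id is an assumption for illustration
#     device_map=device_map,
#     quantization_config=quantization_config,
# )

# Sanity check: only lm_head is marked for CPU offload in this map.
cpu_modules = [name for name, device in device_map.items() if device == "cpu"]
print(cpu_modules)
```

Note that `llm_int8_enable_fp32_cpu_offload=True` is what allows the CPU-resident modules to stay in fp32 while the GPU-resident ones are quantized to int8.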
@julian-risch It would greatly help if you can suggest a way to tackle the mypy issue. Thanks for your help.