Results 26 comments of lkk

> Hi, and thanks for trying out Llama Guard!
>
> I tried your above policy with the model for reproducing the behavior. Here is my prompt:
>
> ```...

hi,
1. For NVIDIA GPU, you don't need to install optimum-habana, because the code checks `is_optimum_habana_available()` for the Habana device. So you can uninstall this package and don't need to set `--use_habana`...
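The availability check described above can be sketched as follows. This is a minimal illustration, not the repo's actual implementation; `select_device` is a hypothetical helper showing how an import probe lets NVIDIA users skip optimum-habana entirely:

```python
import importlib.util

def is_optimum_habana_available() -> bool:
    """Return True only when the optimum-habana package is importable."""
    try:
        return importlib.util.find_spec("optimum.habana") is not None
    except ModuleNotFoundError:
        # The parent "optimum" package is not installed either.
        return False

def select_device(use_habana: bool) -> str:
    # Hypothetical helper: fall back to CUDA when optimum-habana is
    # absent, so NVIDIA users never need the package or --use_habana.
    if use_habana and is_optimum_habana_available():
        return "hpu"
    return "cuda"
```

With this gating, passing `--use_habana` on a machine without optimum-habana simply resolves to the CUDA path instead of failing.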

hi, xinyu @XinyuYe-Intel I have added a code patch https://github.com/XinyuYe-Intel/optimum-habana/pull/1 for this starcoder PR, please review~ The changes:
- replace `self._get_generation_mode` with `generation_config.get_generation_mode` because of transformers>=4.39.0
- keep the transformers>=4.38.2 `StoppingCriteriaList`,...
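The version-gated dispatch behind that first change can be sketched like this. `generation_mode_source` and `_parse_version` are hypothetical names for illustration only; the point is that transformers >= 4.39.0 moved generation-mode resolution onto `GenerationConfig`, while older releases only have the model's private method:

```python
def _parse_version(v: str) -> tuple:
    # Minimal numeric version parse, e.g. "4.39.0" -> (4, 39, 0).
    return tuple(int(p) for p in v.split(".")[:3] if p.isdigit())

def generation_mode_source(transformers_version: str) -> str:
    # Hypothetical dispatcher mirroring the patch: transformers >= 4.39.0
    # exposes GenerationConfig.get_generation_mode(), while older releases
    # rely on the model's private _get_generation_mode().
    if _parse_version(transformers_version) >= (4, 39, 0):
        return "generation_config.get_generation_mode"
    return "self._get_generation_mode"
```

A real patch would call the chosen method directly rather than return its name; the string form here just makes the cutoff explicit.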

hi @yao-matrix @libinta here is the comparison of inference (Gaudi2 and A100)
### Inference/generation performance (Gaudi2 and A100)
#### single card Gaudi2
`python run_generation.py --model_name_or_path bigcode/starcoder2-3b --use_kv_cache --max_new_tokens 100 --bf16...
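For a comparison like this, the throughput number usually reported is tokens/sec at a fixed `--max_new_tokens`. A minimal, hypothetical timing harness (not part of `run_generation.py`) could look like:

```python
import time

def tokens_per_second(generate_fn, max_new_tokens: int) -> float:
    # Hypothetical helper: time one generation call and report tokens/sec,
    # the usual metric for a Gaudi2 vs A100 comparison at a fixed
    # number of new tokens.
    start = time.perf_counter()
    generate_fn()
    elapsed = time.perf_counter() - start
    return max_new_tokens / elapsed
```

Usage would be something like `tokens_per_second(lambda: model.generate(**inputs, max_new_tokens=100), 100)`, where `model` and `inputs` come from the script's own setup.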

hi @yao-matrix @libinta here is the comparison of training (Gaudi2 and A100)
### validate training (Gaudi2 and A100)
- bigcode/starcoder2-3b, single card Gaudi2, lora ![image](https://github.com/huggingface/optimum-habana/assets/33276950/995a0be2-47dc-4841-bf95-fd4be573c46b)
- bigcode/starcoder2-7b, single card Gaudi2,...

hi @libinta, can you help review this PR? Thanks~

> @lkk12014402 , pls 1) provide performance and convergence comparisons btw A100 and Gaudi2 2) pls add ci, thx.

yes, I will add these soon.

### validated on Gaudi2
- mixtral-8*7B trl-sft with 8 cards Gaudi2, deepspeed zero3 ![image](https://github.com/huggingface/optimum-habana/assets/33276950/6a1b129d-fe07-43ba-9ff7-9b58928451d2)
- mixtral-8*7B trl-dpo with 8 cards Gaudi2, deepspeed zero3 ![image](https://github.com/huggingface/optimum-habana/assets/33276950/d97c18b2-06f5-4ce6-a73a-8e680778e02c)
- mistral-7B trl-sft with 8 cards...
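For reference, a DeepSpeed ZeRO stage-3 setup of the kind used in these runs is typically driven by a small JSON config. The exact file in the examples may differ; this is an illustrative sketch of common ZeRO-3 options, written as the Python dict it deserializes to:

```python
import json

# Illustrative ZeRO-3 config (the contents of a ds_config.json);
# keys are standard DeepSpeed options, values here are assumptions.
ds_config = {
    "bf16": {"enabled": True},
    "zero_optimization": {
        "stage": 3,
        "overlap_comm": True,
        "contiguous_gradients": True,
    },
    "gradient_accumulation_steps": "auto",
    "train_micro_batch_size_per_gpu": "auto",
}

# Serialize for launchers that expect a config file path.
config_text = json.dumps(ds_config, indent=2)
```

`"auto"` values let the trainer fill in sizes from its own arguments; stage 3 shards parameters, gradients, and optimizer states across the 8 cards.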

hi @regisss can we add these examples, since MoE models are popular?

> @lkk12014402 how is the GPU performance comparison?

hi @libinta I will update these as soon as possible.