gamercoder153

42 comments by gamercoder153

![515212 (1)](https://github.com/unslothai/unsloth/assets/168166085/8b963868-0395-4df4-91fb-9c14ae9c15bd) I am facing this error with your Colab notebook during inference: https://colab.research.google.com/drive/135ced7oHytdxu3N2DNe1Z0kqjyYIkDXp?usp=sharing

@mxtsai which model?

@danielhanchen In Colab, after finetuning.

@KillerShoaib Man, thanks a lot for the fix, I really appreciate it. Can you explain where I should add that? I am using a Google Colab T4 GPU: https://colab.research.google.com/drive/135ced7oHytdxu3N2DNe1Z0kqjyYIkDXp?usp=sharing

Great! Thanks a lot, guys @KillerShoaib @danielhanchen

@danielhanchen @KillerShoaib I checked it once again and it is literally the same: ![78415 (1)](https://github.com/unslothai/unsloth/assets/168166085/5ae4bd07-0caf-43e4-91b1-bbc2f75703a4)

@KillerShoaib I am using this Colab notebook from their GitHub: https://colab.research.google.com/drive/135ced7oHytdxu3N2DNe1Z0kqjyYIkDXp?usp=sharing with the llama-3-8b-Instruct model: https://huggingface.co/unsloth/llama-3-8b-Instruct

@KillerShoaib No, I am not using any older fine-tuned model. Let me try it once again.

@KillerShoaib It's the same thing! It sometimes generates output and then goes into a loop like this ![image](https://github.com/unslothai/unsloth/assets/168166085/c4b72317-e549-4d37-ab0c-ecff3acaa399) until the 128 max new tokens are reached, and if text streaming is true it does the...
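For context, the looping output above shows up at the generation step of the Colab notebook. A minimal sketch of what that inference step typically looks like, assuming the standard Unsloth flow with a `TextStreamer` and the 128-token limit mentioned here (the prompt text and load settings are placeholders, not taken from the notebook):

```python
# Rough sketch of the inference step being discussed (assumed setup, not the exact notebook cell).
from unsloth import FastLanguageModel
from transformers import TextStreamer

model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/llama-3-8b-Instruct",  # model referenced above
    max_seq_length=2048,
    load_in_4bit=True,
)
FastLanguageModel.for_inference(model)  # switch the model into inference mode

inputs = tokenizer(["<your prompt here>"], return_tensors="pt").to("cuda")

# With a streamer the output is printed token by token as it is generated;
# max_new_tokens=128 matches the limit where the looping output gets cut off.
text_streamer = TextStreamer(tokenizer)
_ = model.generate(**inputs, streamer=text_streamer, max_new_tokens=128)
```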

@hiyouga there is no other process