M. Cihat Ünal

Results: 6 comments by M. Cihat Ünal

I've found that the problem here is quantization:

```python
import torch
from transformers import BitsAndBytesConfig

config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    # llm_int8_has_fp16_weight=True,
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(model_name, ...
```
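For reference, here is a minimal end-to-end sketch of loading a model with this 4-bit config; the model name, the tokenizer, and `device_map="auto"` are assumptions added for illustration, not part of the original comment:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

# Hypothetical checkpoint; replace with whatever model you are actually using.
model_name = "meta-llama/Llama-2-7b-hf"

bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_use_double_quant=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    quantization_config=bnb_config,  # where the 4-bit config is applied
    device_map="auto",               # assumes accelerate is installed
)
```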

I tried every option in the README, but in every case I encountered `IndexError: too many indices for array: array is 1-dimensional, but 2 were indexed`. As it seems that...

@utility-aagrawal Hello there, and sorry for the late answer. I think your problem would be solved if you used [0, 1, 2] as the target values for [neutral, positive, negative] instead of using...
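A minimal sketch of that label encoding; the DataFrame and column names are placeholders, not from the original thread:

```python
import pandas as pd

# Hypothetical data: encode string sentiment labels as the integer targets
# [0, 1, 2] for [neutral, positive, negative].
df = pd.DataFrame({
    "text": ["it was okay", "loved it", "terrible experience"],
    "sentiment": ["neutral", "positive", "negative"],
})

label_map = {"neutral": 0, "positive": 1, "negative": 2}
df["label"] = df["sentiment"].map(label_map)
print(df[["text", "label"]])
```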

@utility-aagrawal `The more the better` in terms of data samples for each class. However, you need to run a bunch of experiments, because it depends on your data quality, data samples,...

Yes, I got similar results when I tried this model for Turkish. I think the model doesn't work for other languages, or we're making a mistake in the tokenization part.

@IDoMathEveryDay Thanks for the help. Could you please share the code from the beginning? I want to see how you download (or load) the data at the first...