Iambestfeed

Results: 9 issues by Iambestfeed

```
(exllama) dungnt@symato:~/ext_hdd/repos/gau/exllama$ python test_benchmark_inference.py -d /home/dungnt/ext_hdd/repos/Nhan/GPTQ-for-LLaMa/checkpoints/open_llama_3b/ -v -ppl
 -- Perplexity:
 -- - Dataset: datasets/wikitext2_val_sample.jsonl
 -- - Chunks: 100
 -- - Chunk size: 2048 -> 2048
 -- - Chunk overlap: ...
```

I had a passing idea: would it be possible to apply quantization to Mistral-based embedding models? One problem, however, is that the checkpoint currently lacks the ``lm-head`` part,...
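
For what it's worth, here is a minimal sketch of the idea, assuming the Hugging Face `transformers` + `bitsandbytes` stack: the checkpoint name, the 4-bit settings, and the mean-pooling step are my own placeholders, not anything from this repo.

```python
# Sketch: load a Mistral-style checkpoint in 4-bit without the lm_head and
# mean-pool the last hidden states as sentence embeddings.
# Requires bitsandbytes and a CUDA GPU; model name and pooling are assumptions.
import torch
from transformers import AutoModel, AutoTokenizer, BitsAndBytesConfig

model_name = "intfloat/e5-mistral-7b-instruct"  # placeholder checkpoint
bnb_config = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.float16)

tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
# AutoModel builds the bare transformer, so a checkpoint without lm_head weights still loads.
model = AutoModel.from_pretrained(model_name, quantization_config=bnb_config, device_map="auto")

inputs = tokenizer(["an example query"], return_tensors="pt", padding=True).to(model.device)
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state        # (batch, seq, dim)
mask = inputs["attention_mask"].unsqueeze(-1)          # (batch, seq, 1)
embeddings = (hidden * mask).sum(1) / mask.sum(1)      # mean pooling over real tokens
embeddings = torch.nn.functional.normalize(embeddings, dim=-1)
```

Because `AutoModel` skips the language-modeling head entirely, the missing part of the checkpoint should not matter for producing embeddings this way.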

I'm running some experiments that use a decoder model for embedding tasks. I have read your papers, and they have given me a lot of insights to build on. However,...
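
As a point of reference for what I mean, a common way to use a decoder as an embedder is last-token pooling; the sketch below assumes a generic Hugging Face causal decoder (gpt2 as a stand-in) and is not tied to this repository.

```python
# Sketch: treat a causal decoder as an embedding model by taking the hidden
# state of the final non-padding token. The model name is only a stand-in.
import torch
from transformers import AutoModel, AutoTokenizer

model_name = "gpt2"  # stand-in for any decoder-only checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModel.from_pretrained(model_name).eval()

texts = ["first sentence", "a second, longer example sentence"]
batch = tokenizer(texts, return_tensors="pt", padding=True)
with torch.no_grad():
    hidden = model(**batch).last_hidden_state          # (batch, seq, dim)

# Index of the last real (non-padding) token per sequence (right padding).
last_idx = batch["attention_mask"].sum(dim=1) - 1
embeddings = hidden[torch.arange(hidden.size(0)), last_idx]
embeddings = torch.nn.functional.normalize(embeddings, dim=-1)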

I observed that some datasets, such as **CmedqaRetrieval, CMedQAv1, and CMedQAv2**, are built from QA datasets and converted to the 'query-pos-neg' format. Do you have instructions for building this data? QA dataset...
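
In case it helps to make the question concrete, this is the kind of conversion I have in mind, assuming the usual `{"query", "pos", "neg"}` JSONL layout and plain random negatives; the field names and the negative-sampling choice are my assumptions, and hard-negative mining with a retriever would be the natural upgrade.

```python
# Sketch: turn (question, answer) pairs into 'query-pos-neg' JSONL, using
# random answers from other questions as negatives.
import json
import random

qa_pairs = [
    {"question": "What causes a cold?", "answer": "Usually a rhinovirus infection."},
    {"question": "How much sleep do adults need?", "answer": "Roughly 7 to 9 hours per night."},
    # ... more pairs
]

random.seed(0)
with open("train_query_pos_neg.jsonl", "w", encoding="utf-8") as f:
    for i, pair in enumerate(qa_pairs):
        negatives = [p["answer"] for j, p in enumerate(qa_pairs) if j != i]
        record = {
            "query": pair["question"],
            "pos": [pair["answer"]],
            "neg": random.sample(negatives, k=min(7, len(negatives))),
        }
        f.write(json.dumps(record, ensure_ascii=False) + "\n")
```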

I did try running the code from your repository; however, when I added the --save_safetensors flag, the process was interrupted after the evaluation finished. I didn't encounter any...
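
As a stopgap while that flag is failing for me, this is the plain safetensors API I would fall back on; it is not the repository's own saving path, and the model object here is only a stand-in for the quantized model held in memory.

```python
# Sketch: serialize a model's state dict with safetensors directly.
import torch
from safetensors.torch import save_file

model = torch.nn.Linear(8, 8)            # stand-in for the quantized model object
# safetensors expects contiguous CPU tensors with no shared storage.
state_dict = {k: v.contiguous().cpu() for k, v in model.state_dict().items()}
save_file(state_dict, "model.safetensors")
```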

Hey there, I was just wondering how this compares to SqueezeLLM; the perplexity/size trade-off seems on par. Here's their paper and repo: https://arxiv.org/abs/2306.03078 https://github.com/SqueezeAILab/SqueezeLLM Thank you!

I have logged the training parameters to wandb (see the [link](https://api.wandb.ai/links/nguyenducnhan-work/j586u538) below).

![image](https://github.com/staoxiao/RetroMAE/assets/95571916/41df1593-8032-45c8-8126-ff9ca8d7e0e8)

This is my config:

```
pretrain.run --output_dir output_merge_data \
  --report_to wandb \
  --data_dir ...
```
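
For comparison, this is the bare wandb API I would expect `--report_to wandb` to drive under the hood; the project name, config values, and the loop below are placeholders of mine, not taken from `pretrain.run`.

```python
# Sketch: log hyperparameters and per-step metrics to wandb by hand.
import wandb

run = wandb.init(
    project="retromae-pretrain",                     # placeholder project name
    config={"output_dir": "output_merge_data", "lr": 2e-5, "batch_size": 32},
)

for step in range(3):                                # stand-in training loop
    loss = 1.0 / (step + 1)
    wandb.log({"train/loss": loss, "step": step})

run.finish()
```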

I have now implemented QLoRA for SFT and the reward model, but I am quite confused about how to do QLoRA for PPO. Do you plan to integrate PPO into the repo?
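
To make the question concrete, here is the kind of QLoRA-for-PPO setup I have in mind, assuming TRL's classic `PPOTrainer` / `AutoModelForCausalLMWithValueHead` interface (the exact signatures differ between TRL versions); the model name, LoRA targets, and hyperparameters are placeholders.

```python
# Sketch: QLoRA policy for PPO with TRL's classic API.
import torch
from peft import LoraConfig
from transformers import AutoTokenizer, BitsAndBytesConfig
from trl import AutoModelForCausalLMWithValueHead, PPOConfig, PPOTrainer

model_name = "meta-llama/Llama-2-7b-hf"              # placeholder SFT checkpoint
lora_config = LoraConfig(r=16, lora_alpha=32, target_modules=["q_proj", "v_proj"],
                         task_type="CAUSAL_LM")
bnb_config = BitsAndBytesConfig(load_in_4bit=True, bnb_4bit_compute_dtype=torch.float16)

tokenizer = AutoTokenizer.from_pretrained(model_name)
tokenizer.pad_token = tokenizer.eos_token

# Value head + LoRA adapter on top of a 4-bit base: the "QLoRA for PPO" setup.
policy = AutoModelForCausalLMWithValueHead.from_pretrained(
    model_name, peft_config=lora_config, quantization_config=bnb_config)

ppo_config = PPOConfig(batch_size=8, mini_batch_size=2, learning_rate=1e-5)
# With a PEFT policy, ref_model=None lets TRL reuse the base weights as reference.
ppo_trainer = PPOTrainer(ppo_config, policy, ref_model=None, tokenizer=tokenizer)

# One PPO step: queries and responses are lists of token tensors, rewards are
# scalars from the separately trained reward model (not shown here).
# ppo_trainer.step(query_tensors, response_tensors, rewards)
```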

I tried running this example from your repo, but the output I get is None.

![image](https://github.com/AlekseyKorshuk/chat-data-pipeline/assets/95571916/791d77ac-84b5-4796-9f04-332d2e6d802a)

I really don't know what's going on; can you help me?