Chess_champion
It seems the most important hyperparameter is missing from the LightGBM SynapseML package. In Python the default is 100, but here we don't know the default and we have...
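The "default of 100" in the Python package refers to the number of boosting rounds (`n_estimators`). A minimal sketch of the two sides, assuming the SynapseML counterpart is the `numIterations` parameter (parameter name and values are illustrative, not confirmed by the post):

```python
# Python lightgbm: n_estimators defaults to 100 boosting rounds.
from lightgbm import LGBMClassifier

clf = LGBMClassifier()                      # implicitly n_estimators=100
print(clf.get_params()["n_estimators"])     # -> 100

# SynapseML (Spark) side, assuming the analogous setting is numIterations;
# setting it explicitly avoids relying on an undocumented default.
# from synapse.ml.lightgbm import LightGBMClassifier
# spark_clf = LightGBMClassifier(
#     numIterations=100, labelCol="label", featuresCol="features"
# )
```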
- PyTorch-Forecasting version: 1.0.0
- PyTorch version: 2.0.1
- Python version: 3.10
- Operating System: Windows

### Expected behavior

My expectation is to get a prediction that is not completely...
Adding a **weight** argument to TimeSeriesDataSet causes the trainer to throw an error for NHiTS because of a dimension mismatch in dimension 1. This is due to an unsqueeze operation...
- PyTorch-Forecasting version: 1.0.0
- PyTorch version: 2.0.0
- Python version: 3.10

### Expected behavior

I executed code to handle the **weight** parameter defined in TimeSeriesDataSet, to weight the...
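For context, a minimal sketch of how a weight column is typically passed to TimeSeriesDataSet; the DataFrame, the column names, and the `sample_weight` column are hypothetical and only illustrate the setup the issue describes:

```python
import pandas as pd
from pytorch_forecasting import TimeSeriesDataSet

# Illustrative frame: one group, 30 time steps, per-sample weights.
df = pd.DataFrame({
    "time_idx": range(30),
    "series": "A",
    "value": [float(i) for i in range(30)],
    "sample_weight": [1.0] * 30,
})

dataset = TimeSeriesDataSet(
    df,
    time_idx="time_idx",
    target="value",
    group_ids=["series"],
    weight="sample_weight",        # column holding the per-sample weights
    max_encoder_length=10,
    max_prediction_length=5,
    time_varying_unknown_reals=["value"],
)
```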
tensorflow.python.framework.errors_impl.InvalidArgumentError: events.out.tfevents. Invalid argument
Hi, I am using the PyTorch Forecasting package; however, my code throws an error in the training phase, even during the first epoch. Here is the system configuration: - PyTorch-Forecasting version:...
### What happened?

I am currently trying to embed a series of documents with ChromaDB, and I add an id, an embedding, and metadata (which is the original text) to the vector...
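A minimal sketch of the add pattern being described, with hypothetical ids, texts, and placeholder embedding vectors:

```python
import chromadb

client = chromadb.Client()                        # in-memory client
collection = client.create_collection("docs")

texts = ["first document", "second document"]     # hypothetical inputs
embeddings = [[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]]   # placeholder vectors

# One entry per document: id, precomputed embedding, and the original
# text stored both as the document and in the metadata.
collection.add(
    ids=[f"doc-{i}" for i in range(len(texts))],
    embeddings=embeddings,
    metadatas=[{"text": t} for t in texts],
    documents=texts,
)
```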
Hi, I am trying to fine-tune my Llama model using DeepSpeed, Accelerate, and SFTTrainer along with QLoRA. I have already pretrained my Llama model. During the pretraining, I used...
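For reference, a sketch of the 4-bit quantization and LoRA configuration typically paired with SFTTrainer for QLoRA. The model id and hyperparameters are placeholders, not taken from the post, and the exact SFTTrainer arguments vary across trl versions, so only the config side is shown:

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig

# 4-bit (QLoRA-style) loading; values are common defaults, not the poster's.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf",    # placeholder; the post uses a custom pretrained Llama
    quantization_config=bnb_config,
    device_map="auto",
)

# LoRA adapter config, typically handed to SFTTrainer via peft_config=lora_config.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    task_type="CAUSAL_LM",
)
```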
### Your current environment

I performed GPTQ quantization on Qwen 72B Instruct using the AutoGPTQ package, with the following configuration: group_size = 32, desc_order = 32. Then I use the model inside...
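A hedged sketch of the AutoGPTQ configuration implied by the post, assuming the "desc_order" mentioned corresponds to AutoGPTQ's `desc_act` (activation-order) flag; the bit width and the model id are assumptions, since neither is stated in the post:

```python
from auto_gptq import AutoGPTQForCausalLM, BaseQuantizeConfig

quantize_config = BaseQuantizeConfig(
    bits=4,          # assumption: bit width is not stated in the post
    group_size=32,   # as described in the post
    desc_act=True,   # assuming "desc_order" refers to activation-order quantization
)

model = AutoGPTQForCausalLM.from_pretrained(
    "Qwen/Qwen2.5-72B-Instruct",   # placeholder id for "Qwen 72B instruct"
    quantize_config=quantize_config,
)
# model.quantize(calibration_examples) would then be called with calibration samples.
```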
### Your current environment

Databricks, vLLM version: 0.8.2

### 🐛 Describe the bug

I have been using vLLM for over six months with no problems, until recently, when I started...
I train/fine-tune the Mistral Small 24B model ([mistralai/Mistral-Small-24B-Instruct-2501](https://huggingface.co/mistralai/Mistral-Small-24B-Instruct-2501)) using DeepSpeed and Accelerate, and I saved the model using the following commands:

```
if accelerator.is_main_process:
    model = accelerator.unwrap_model(trainer.model)
    model.save_pretrained(model_path)
    tokenizer.save_pretrained(model_path)
```

However, when I...
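For comparison, a sketch of a commonly used variant of this save path when the weights may be sharded (e.g. under DeepSpeed ZeRO-3), reusing the post's `accelerator`, `trainer`, `tokenizer`, and `model_path` variables; whether this applies depends on the poster's DeepSpeed stage, which the excerpt does not state:

```python
# Gather the (possibly sharded) weights before writing; get_state_dict must run
# on all processes so the shards can be consolidated onto the main process.
accelerator.wait_for_everyone()
unwrapped = accelerator.unwrap_model(trainer.model)
state_dict = accelerator.get_state_dict(trainer.model)
if accelerator.is_main_process:
    unwrapped.save_pretrained(model_path, state_dict=state_dict)
    tokenizer.save_pretrained(model_path)
```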