Vanilla
Hi everyone, I have a question: the angle of a rotated bounding box suffers from a boundary discontinuity during regression. Can this problem be solved, when converting the 5-parameter format (cx, cy, w, h, theta) to the DOTA 4-point format, by changing which corner is taken as the first point (x1, y1) under different angle ranges?
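For reference, the 5-parameter-to-4-point conversion the question refers to can be sketched as follows. This is a minimal NumPy sketch (the helper name `obb_to_poly` is mine, not from any particular codebase), which always emits corners in a fixed local order; the question is about whether re-selecting which of these four corners becomes (x1, y1) per angle range avoids the boundary discontinuity.

```python
import numpy as np

def obb_to_poly(cx, cy, w, h, theta):
    """Convert a 5-parameter oriented box to 4 corner points (DOTA-style).

    theta is in radians, measured counter-clockwise. Corners are emitted in a
    fixed local order: top-left, top-right, bottom-right, bottom-left
    (before rotation), so the same physical box described with a theta near a
    range boundary can yield a different first corner -- the crux of the question.
    """
    c, s = np.cos(theta), np.sin(theta)
    # Corner offsets in the box's local frame
    dx = np.array([-w / 2, w / 2, w / 2, -w / 2])
    dy = np.array([-h / 2, -h / 2, h / 2, h / 2])
    # Rotate the offsets and translate by the center
    xs = cx + c * dx - s * dy
    ys = cy + s * dx + c * dy
    return np.stack([xs, ys], axis=1)  # shape (4, 2): [[x1,y1], ..., [x4,y4]]
```

For example, `obb_to_poly(0, 0, 2, 2, 0)` gives the axis-aligned unit-ish square, while the same square with theta = pi/2 and swapped w/h yields the same point set but starting from a different corner.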
I notice the code in BAM is:

```python
def forward(self, x):
    b, c, _, _ = x.size()
    sa_out = self.sa(x)
    ca_out = self.ca(x)
    weight = self.sigmoid(sa_out * ca_out)
    weight = self.sigmoid(sa_out + ca_out)  # here
    out = (1 + weight) * x
    return out
```

here...
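The snippet above assigns `weight` twice, so the multiplicative combination is dead code and only the additive one takes effect. A minimal NumPy sketch (the helper `bam_refine` is hypothetical, standing in for the BAM refinement step) showing that the two combination rules generally produce different outputs:

```python
import numpy as np

def bam_refine(x, sa, ca, combine="add"):
    """Apply a BAM-style residual attention: out = (1 + sigmoid(z)) * x.

    combine="add" uses z = sa + ca (what the quoted code effectively does,
    since the second assignment overwrites the first);
    combine="mul" uses z = sa * ca (the overwritten first assignment).
    """
    z = sa + ca if combine == "add" else sa * ca
    weight = 1.0 / (1.0 + np.exp(-z))  # elementwise sigmoid
    return (1.0 + weight) * x
```

With `sa = ca = 0`, both rules give `weight = 0.5` and `out = 1.5 * x`; with nonzero attention logits the two diverge, which is presumably why the duplicated assignment matters.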
I downloaded the pretrained weights and used predict.py to test some images, but hit this bug. What's the problem with fuse_layers? ` File "test4/Road/LoveDA-master/Semantic_Segmentation/module/baseline/base_hrnet/_hrnet.py", line 394, in forward y =...
I'm interested in learning how to use the operators in e2cnn/escnn to implement a function f such that f(Fea_I) = Fea_I_rot. Here, Fea_I = B(I) represents the feature of an...
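To make the desired property concrete: for a rotation-equivariant backbone B (e.g. one built with e2cnn/escnn using the regular representation of the cyclic group C4), the map f satisfying f(B(I)) = B(rot(I)) is a spatial rotation of the feature map combined with a cyclic shift of the orientation channels. A minimal NumPy sketch of this idea (the function name and the (orientations, H, W) layout are my assumptions for illustration; in e2cnn this is what `GeometricTensor.transform` does for you):

```python
import numpy as np

def rotate_regular_feature(fea, k=1):
    """Rotate a C4 regular-representation feature map by k * 90 degrees.

    fea: array of shape (4, H, W), where channel i is assumed to respond to
    orientation i * 90 degrees. Rotating the input image by 90 degrees both
    rotates the feature map spatially AND cyclically permutes which
    orientation channel fires -- hence the two operations below.
    """
    fea = np.rot90(fea, k=k, axes=(1, 2))  # spatial rotation of each channel
    fea = np.roll(fea, shift=k, axis=0)    # cyclic shift of orientation channels
    return fea
```

A sanity check: applying the 90-degree version four times is the identity, as it must be for a C4 action.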
**Environment** (I have tried PyTorch 1.9.0 and PyTorch 1.8.0 with their corresponding cuDNN versions, plus mmcv-full==1.14.0; both cause memory to keep increasing during training.) > sys.platform: linux Python: 3.8.17 (default, Jul...
I tried to install alpharotate in my virtual machine, but got an error like this:  Can anyone help me, please!
I am running instruction tuning of llama3_llava on my own dataset with the script `NPROC_PER_NODE=${GPU_NUM} xtuner train llava_llama3_8b_instruct_full_clip_vit_large_p14_336_lora_e1_gpu8_finetune --deepspeed deepspeed_zero3_offload --seed 1024`. After the following output, the program stops...
Hello, I load pre-trained llava-llama3 SFT weights and fine-tune using LoRA, but get an error when merging weights: **scripts:** Training: ``` deepspeed --master_port=$((RANDOM + 10000)) --include localhost:0,1,2,3 llava/train/train_mem.py \ --lora_enable...
Hello, is it necessary to use CUDA 12.x during training? Does that imply that training LLaVA and LLaVA-HR requires reinstalling different CUDA versions? When...
Could you please guide me on how to convert a ".pt" format model into the Hugging Face format (similar to the one at https://huggingface.co/laion/CLIP-ViT-g-14-laion2B-s12B-b42K/tree/main)? It seems to require a JSON...
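A Hugging Face model directory generally pairs a weights file (`pytorch_model.bin` or a safetensors file) with a `config.json` describing the architecture. A minimal stdlib-only sketch of producing that layout (the helper `export_hf_layout` is hypothetical; it assumes the ".pt" file already contains a plain state_dict whose keys match the target HF model class — if not, you would additionally need to load the checkpoint with torch and rename keys):

```python
import json
import shutil
from pathlib import Path

def export_hf_layout(pt_path, out_dir, config):
    """Lay out a checkpoint and config in the Hugging Face directory format.

    pt_path : path to an existing .pt file (assumed to be a bare state_dict)
    out_dir : target directory, created if missing
    config  : dict written as config.json (its fields must match the
              architecture expected by the HF model class you intend to load)
    """
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    # HF's from_pretrained looks for this filename by default
    shutil.copy(pt_path, out / "pytorch_model.bin")
    with open(out / "config.json", "w") as f:
        json.dump(config, f, indent=2)
```

The exact keys required in `config.json` depend on the model class (for a CLIP-style model, things like `model_type`, hidden sizes, and layer counts); the safest route is to copy and adapt the `config.json` from a similar existing HF repo such as the one linked above.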