
AssertionError: assert qkv.dtype in [torch.float16, torch.bfloat16]

studiouspan opened this issue 8 months ago · 0 comments

Thank you for your work on this excellent model and for providing the example code.

When evaluating InternViT-6B-224px for semantic segmentation on the ADE20K dataset with a single GPU, the following error occurred: AssertionError: assert qkv.dtype in [torch.float16, torch.bfloat16]

To help resolve it, could you kindly clarify the potential causes? Here is the setup I suspect is relevant, based on the task requirements: linear probing on InternViT-6B-224px (frozen): link, config, ckpt.
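
If it helps with diagnosis: the assertion is raised inside the FlashAttention wrapper, which only accepts fp16/bf16 inputs, while a plain single-GPU tools/test.py run seems to execute the frozen backbone in fp32. One workaround I am considering is falling back to naive attention. This is only a sketch, and it assumes `use_flash_attn` (the attribute visible in the traceback below, from intern_vit_6b.py) is also exposed as a backbone config key, which I have not verified:

```python
# Hypothetical config override (mmseg-style), assuming the backbone
# config accepts the `use_flash_attn` flag seen in intern_vit_6b.py.
model = dict(
    backbone=dict(
        use_flash_attn=False,  # take the _naive_attn path, which accepts fp32
    ),
)
```

With that flag off, the branch `x = self._naive_attn(x) if not self.use_flash_attn else self._flash_attn(x)` in the traceback should take the naive path, trading speed and memory for dtype flexibility.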


Environment info:

  • sys.platform: linux
  • Python: 3.9.21 (main, Dec 11 2024, 16:24:11) [GCC 11.2.0]
  • CUDA available: True
  • GPU 0: NVIDIA RTX A6000
  • CUDA_HOME: /usr/local/cuda-11.3
  • NVCC: Cuda compilation tools, release 11.3, V11.3.109
  • GCC: gcc (Ubuntu 9.4.0-1ubuntu1~20.04.2) 9.4.0
  • PyTorch: 1.12.0
  • PyTorch compiling details: PyTorch built with:

  • GCC 9.3
  • C++ Version: 201402
  • Intel(R) oneAPI Math Kernel Library Version 2023.1-Product Build 20230303 for Intel(R) 64 architecture applications
  • Intel(R) MKL-DNN v2.6.0 (Git Hash 52b5f107dd9cf10910aaa19cb47f3abf9b349815)
  • OpenMP 201511 (a.k.a. OpenMP 4.5)
  • LAPACK is enabled (usually provided by MKL)
  • NNPACK is enabled
  • CPU capability usage: AVX2
  • CUDA Runtime 11.3
  • NVCC architecture flags: -gencode;arch=compute_37,code=sm_37;-gencode;arch=compute_50,code=sm_50;-gencode;arch=compute_60,code=sm_60;-gencode;arch=compute_61,code=sm_61;-gencode;arch=compute_70,code=sm_70;-gencode;arch=compute_75,code=sm_75;-gencode;arch=compute_80,code=sm_80;-gencode;arch=compute_86,code=sm_86;-gencode;arch=compute_37,code=compute_37
  • CuDNN 8.3.2 (built against CUDA 11.5)
  • Magma 2.5.2
  • Build settings: BLAS_INFO=mkl, BUILD_TYPE=Release, CUDA_VERSION=11.3, CUDNN_VERSION=8.3.2, CXX_COMPILER=/opt/rh/devtoolset-9/root/usr/bin/c++, CXX_FLAGS= -Wno-deprecated -fvisibility-inlines-hidden -DUSE_PTHREADPOOL -fopenmp -DNDEBUG -DUSE_KINETO -DUSE_FBGEMM -DUSE_QNNPACK -DUSE_PYTORCH_QNNPACK -DUSE_XNNPACK -DSYMBOLICATE_MOBILE_DEBUG_HANDLE -DEDGE_PROFILER_USE_KINETO -O2 -fPIC -Wno-narrowing -Wall -Wextra -Werror=return-type -Wno-missing-field-initializers -Wno-type-limits -Wno-array-bounds -Wno-unknown-pragmas -Wno-unused-parameter -Wno-unused-function -Wno-unused-result -Wno-unused-local-typedefs -Wno-strict-overflow -Wno-strict-aliasing -Wno-error=deprecated-declarations -Wno-stringop-overflow -Wno-psabi -Wno-error=pedantic -Wno-error=redundant-decls -Wno-error=old-style-cast -fdiagnostics-color=always -faligned-new -Wno-unused-but-set-variable -Wno-maybe-uninitialized -fno-math-errno -fno-trapping-math -Werror=format -Werror=cast-function-type -Wno-stringop-overflow, LAPACK_INFO=mkl, PERF_WITH_AVX=1, PERF_WITH_AVX2=1, PERF_WITH_AVX512=1, TORCH_VERSION=1.12.0, USE_CUDA=ON, USE_CUDNN=ON, USE_EXCEPTION_PTR=1, USE_GFLAGS=OFF, USE_GLOG=OFF, USE_MKL=ON, USE_MKLDNN=OFF, USE_MPI=OFF, USE_NCCL=ON, USE_NNPACK=ON, USE_OPENMP=ON, USE_ROCM=OFF,

  • TorchVision: 0.13.0
  • OpenCV: 4.11.0
  • MMCV: 1.7.0
  • MMCV Compiler: GCC 9.4
  • MMCV CUDA Compiler: 11.3
  • MMSegmentation: 0.27.0+89b1e61


Command: python tools/test.py /home/hello/PX/InternVL-MMDetSeg/mmsegmentation/configs/intern_vit_6b/linear_probing/linear_intern_vit_6b_504_80k_ade20k_bs16_lr4e-5_frozen.py /home/hello/PX/InternVL-MMDetSeg/pretrained/linear_intern_vit_6b_504_80k_ade20k_bs16_lr4e-5_frozen.pth --show-dir 0530 --eval mIoU


Log:

/home/hello/PX/InternVL-MMDetSeg/mmcv/mmcv/__init__.py:20: UserWarning: On January 1, 2023, MMCV will release v2.0.0, in which it will remove components related to the training process and add a data transformation module. In addition, it will rename the package names mmcv to mmcv-lite and mmcv-full to mmcv. See https://github.com/open-mmlab/mmcv/blob/master/docs/en/compatibility.md for more details.
2025-06-03 10:26:08,109 - mmseg - INFO - Multi-processing start method is None
2025-06-03 10:26:08,109 - mmseg - INFO - OpenCV num_threads is `32`
2025-06-03 10:26:08,128 - mmseg - INFO - Loaded 2000 images
2025-06-03 10:26:35,727 - mmseg - INFO - _IncompatibleKeys(missing_keys=[], unexpected_keys=['clip_projector.norm1_q.weight', 'clip_projector.norm1_q.bias', 'clip_projector.norm1_k.weight', 'clip_projector.norm1_k.bias', 'clip_projector.norm1_v.weight', 'clip_projector.norm1_v.bias', 'clip_projector.cross_attn.q_bias', 'clip_projector.cross_attn.k_bias', 'clip_projector.cross_attn.v_bias', 'clip_projector.cross_attn.q.weight', 'clip_projector.cross_attn.k.weight', 'clip_projector.cross_attn.v.weight', 'clip_projector.cross_attn.proj.weight', 'clip_projector.cross_attn.proj.bias'])
/home/hello/PX/InternVL-MMDetSeg/mmsegmentation/mmseg/models/losses/cross_entropy_loss.py:235: UserWarning: Default avg_non_ignore is False, if you would like to ignore the certain label and average loss over non-ignore labels, which is the same with PyTorch official cross_entropy, set avg_non_ignore=True.
load checkpoint from local path: /home/hello/PX/InternVL-MMDetSeg/pretrained/linear_intern_vit_6b_504_80k_ade20k_bs16_lr4e-5_frozen.pth
The model and loaded state dict do not match exactly

missing keys in source state_dict: backbone.pos_embed, backbone.cls_token, backbone.patch_embed.proj.weight, backbone.patch_embed.proj.bias, and, for every block from backbone.blocks.0 through backbone.blocks.47, the same set of per-block parameters: norm1.weight, attn.qkv.weight, attn.proj.weight, attn.proj.bias, attn.q_norm.weight, attn.k_norm.weight, ls1.gamma, norm2.weight, mlp.fc1.weight, mlp.fc1.bias, mlp.fc2.weight, mlp.fc2.bias, ls2.gamma
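
The missing backbone keys above suggest the released .pth may contain only the linear head, with the ViT-6B weights expected to be loaded separately from the pretrained backbone file. As a sanity check, here is a short sketch (the path is my local one) to list which top-level modules the checkpoint actually provides:

```python
import torch

# Inspect which top-level modules the released checkpoint contains.
ckpt = torch.load(
    'pretrained/linear_intern_vit_6b_504_80k_ade20k_bs16_lr4e-5_frozen.pth',
    map_location='cpu')
state = ckpt.get('state_dict', ckpt)
print(sorted({k.split('.')[0] for k in state}))  # does 'backbone' appear at all?
print(sum(k.startswith('backbone.') for k in state), 'backbone keys found')
```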

"CLASSES" not found in meta, use dataset.CLASSES instead "PALETTE" not found in meta, use dataset.PALETTE instead /home/hello/PX/InternVL-MMDetSeg/mmsegmentation/tools/test.py:264: UserWarning: SyncBN is only supported with DDP. To be compatible with DP, we convert SyncBN to BN. Please use dist_train.sh which can avoid this error. warnings.warn( [ ] 0/2000, elapsed: 0s, ETA:/home/hello/miniconda3/envs/internvl-mmdetseg/lib/python3.9/site-packages/torch/utils/checkpoint.py:25: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn("None of the inputs have requires_grad=True. Gradients will be None") Traceback (most recent call last): File "/home/hello/PX/InternVL-MMDetSeg/mmsegmentation/tools/test.py", line 320, in main() File "/home/hello/PX/InternVL-MMDetSeg/mmsegmentation/tools/test.py", line 273, in main results = single_gpu_test( File "/home/hello/PX/InternVL-MMDetSeg/mmsegmentation/mmseg/apis/test.py", line 91, in single_gpu_test result = model(return_loss=False, **data) File "/home/hello/miniconda3/envs/internvl-mmdetseg/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl return forward_call(*input, **kwargs) File "/home/hello/PX/InternVL-MMDetSeg/mmcv/mmcv/parallel/data_parallel.py", line 51, in forward return super().forward(*inputs, **kwargs) File "/home/hello/miniconda3/envs/internvl-mmdetseg/lib/python3.9/site-packages/torch/nn/parallel/data_parallel.py", line 166, in forward return self.module(*inputs[0], **kwargs[0]) File "/home/hello/miniconda3/envs/internvl-mmdetseg/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl return forward_call(*input, **kwargs) File "/home/hello/PX/InternVL-MMDetSeg/mmcv/mmcv/runner/fp16_utils.py", line 119, in new_func return old_func(*args, **kwargs) File "/home/hello/PX/InternVL-MMDetSeg/mmsegmentation/mmseg/models/segmentors/base.py", line 110, in forward return self.forward_test(img, img_metas, **kwargs) File "/home/hello/PX/InternVL-MMDetSeg/mmsegmentation/mmseg/models/segmentors/base.py", line 92, in forward_test return self.simple_test(imgs[0], img_metas[0], **kwargs) File "/home/hello/PX/InternVL-MMDetSeg/mmsegmentation/mmseg/models/segmentors/encoder_decoder.py", line 271, in simple_test seg_logit = self.inference(img, img_meta, rescale) File "/home/hello/PX/InternVL-MMDetSeg/mmsegmentation/mmseg/models/segmentors/encoder_decoder.py", line 254, in inference seg_logit = self.slide_inference(img, img_meta, rescale) File "/home/hello/PX/InternVL-MMDetSeg/mmsegmentation/mmseg/models/segmentors/encoder_decoder.py", line 188, in slide_inference crop_seg_logit = self.encode_decode(crop_img, img_meta) File "/home/hello/PX/InternVL-MMDetSeg/mmsegmentation/mmseg/models/segmentors/encoder_decoder.py", line 82, in encode_decode x = self.extract_feat(img) File "/home/hello/PX/InternVL-MMDetSeg/mmsegmentation/mmseg/models/segmentors/encoder_decoder.py", line 74, in extract_feat x = self.backbone(img) File "/home/hello/miniconda3/envs/internvl-mmdetseg/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl return forward_call(*input, **kwargs) File "/home/hello/PX/InternVL-MMDetSeg/mmsegmentation/mmseg/models/backbones/intern_vit_6b.py", line 402, in forward x = blk(x) File "/home/hello/miniconda3/envs/internvl-mmdetseg/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl return forward_call(*input, **kwargs) File 
"/home/hello/PX/InternVL-MMDetSeg/mmsegmentation/mmseg/models/backbones/intern_vit_6b.py", line 237, in forward return checkpoint.checkpoint(_inner_forward, x) File "/home/hello/miniconda3/envs/internvl-mmdetseg/lib/python3.9/site-packages/torch/utils/checkpoint.py", line 235, in checkpoint return CheckpointFunction.apply(function, preserve, *args) File "/home/hello/miniconda3/envs/internvl-mmdetseg/lib/python3.9/site-packages/torch/utils/checkpoint.py", line 96, in forward outputs = run_function(*args) File "/home/hello/PX/InternVL-MMDetSeg/mmsegmentation/mmseg/models/backbones/intern_vit_6b.py", line 232, in _inner_forward x = x + self.drop_path1(self.ls1(self.attn(self.norm1(x)))) File "/home/hello/miniconda3/envs/internvl-mmdetseg/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl return forward_call(*input, **kwargs) File "/home/hello/PX/InternVL-MMDetSeg/mmsegmentation/mmseg/models/backbones/intern_vit_6b.py", line 172, in forward x = self._naive_attn(x) if not self.use_flash_attn else self._flash_attn(x) File "/home/hello/PX/InternVL-MMDetSeg/mmsegmentation/mmseg/models/backbones/intern_vit_6b.py", line 164, in _flash_attn context, _ = self.inner_attn( File "/home/hello/miniconda3/envs/internvl-mmdetseg/lib/python3.9/site-packages/torch/nn/modules/module.py", line 1130, in _call_impl return forward_call(*input, **kwargs) File "/home/hello/PX/InternVL-MMDetSeg/mmsegmentation/mmseg/models/backbones/flash_attention.py", line 40, in forward assert qkv.dtype in [torch.float16, torch.bfloat16] AssertionError

studiouspan · Jun 03 '25 03:06