evaluation
Hello, after running the training script bash scripts/train/train_vg1.2.sh I obtained the checkpoint file output/train/vg1.2_5e/20250823190/checkpoint_0.pth. I then ran the evaluation script bash scripts/eval/eval_vg1.2_densecap.sh output/train/vg1.2_5e/20250823190/checkpoint_0.pth, which produced the following output:
2025-08-24 18:01:25,810 [INFO] load checkpoint from /root/autodl-tmp/ControlCap/ckpts/checkpoint_0.pth
2025-08-24 18:01:25,824 [INFO] Evaluating on val.
2025-08-24 18:01:25,825 [INFO] dataset_ratios not specified, datasets will be concatenated (map-style datasets) or chained (webdataset.DataPipeline).
2025-08-24 18:01:25,825 [INFO] Loaded 3682892 records for train split from the dataset.
2025-08-24 18:01:25,825 [INFO] Loaded 5000 records for val split from the dataset.
2025-08-24 18:01:25,825 [INFO] Empty train splits.
2025-08-24 18:01:25,825 [INFO] Empty train splits.
Loading checkpoint shards: 100%|███████████████████████████████████████████████| 2/2 [02:38<00:00, 79.22s/it]
Evaluation [ 0/2500] eta: 4:42:57 time: 6.7910 data: 2.9157 max mem: 17443
Evaluation [ 10/2500] eta: 1:18:03 time: 1.8810 data: 0.2698 max mem: 18689
Evaluation [2490/2500] eta: 0:00:14 time: 1.4257 data: 0.0039 max mem: 21590
Evaluation [2499/2500] eta: 0:00:01 time: 1.5855 data: 0.0568 max mem: 21590
Evaluation Total time: 1:00:39 (1.4558 s / it)
2025-08-24 19:03:36,920 [WARNING] Merging results.
100%|██████████████████████████████████████████████████████████████████████████| 2/2 [00:00<00:00, 2.86it/s]
2025-08-24 19:04:04,962 [INFO] :Get (0/237996) predictions.
2025-08-24 19:04:04,963 [INFO] :Save result to (/root/autodl-tmp/ControlCap/output/eval/vg1.2/20250824175/result/val.json).
2025-08-24 19:04:05,699 [INFO] :Begin evaluation (/root/autodl-tmp/ControlCap/output/eval/vg1.2/20250824175/result/val.json).
loading annotations into memory...
Done (t=3.53s)
creating index...
index created!
loading annotations into memory...
Done (t=3.53s)
creating index...
index created!
100%|██████████████████████████████████████████████████████████████████| 5000/5000 [00:04<00:00, 1001.58it/s]
100%|███████████████████████████████████████████████████████████████████| 5000/5000 [00:24<00:00, 202.84it/s]
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
- Avoid using tokenizers before the fork if possible
- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
Traceback (most recent call last):
File "train.py", line 98, in
Hi! I haven't come up with a good solution for this. It seems the evaluation package depends on Java.
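For context, dense-captioning metrics are commonly computed through a pycocoevalcap-style METEOR scorer, which launches a Java subprocess for meteor-1.5.jar, so a missing JRE makes that stage fail. A minimal sketch for checking that Java is visible to the evaluation process (whether this repository uses that scorer is an assumption):

```python
# Minimal sketch (assumption: the caption metrics shell out to a Java jar,
# as the pycocoevalcap METEOR wrapper does with meteor-1.5.jar).
import shutil
import subprocess

def check_java() -> None:
    java = shutil.which("java")
    if java is None:
        raise RuntimeError(
            "No `java` executable found on PATH; install a JRE "
            "(e.g. `apt-get install default-jre`) before running the evaluation."
        )
    # `java -version` prints to stderr; running it confirms the JRE starts.
    subprocess.run([java, "-version"], check=True)

if __name__ == "__main__":
    check_java()
```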