Charles Kihn

Results: 8 issues by Charles Kihn

ssv2 full_tuning: changing `--test_num_segment 2 \ --test_num_crop 3 \` to `--test_num_segment 4 \ --test_num_crop 3 \` fails with
```
InternVideo/InternVideo2/single_modality/models/internvideo2.py", line 527, in forward
    x = x + pos_embed
RuntimeError: The size of...
```
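The size mismatch at `x = x + pos_embed` is expected when the test-time number of segments differs from what the positional embedding was built for. Below is a minimal sketch of one common workaround, interpolating the embedding along the temporal axis before evaluation; the function name and the assumption of a time-major, no-class-token layout are mine, not the repository's code.

```python
# Hypothetical sketch (not the repository's code): resize a temporal positional
# embedding so a checkpoint trained with t_old segments can be evaluated with t_new.
import torch
import torch.nn.functional as F

def interpolate_temporal_pos_embed(pos_embed, t_old, t_new):
    """pos_embed: [1, t_old * n_spatial, dim] -> [1, t_new * n_spatial, dim]."""
    _, L, dim = pos_embed.shape
    n_spatial = L // t_old
    # [1, L, dim] -> [1, dim, t_old, n_spatial] so interpolation runs over time.
    pe = pos_embed.reshape(1, t_old, n_spatial, dim).permute(0, 3, 1, 2)
    pe = F.interpolate(pe, size=(t_new, n_spatial), mode="bilinear", align_corners=False)
    return pe.permute(0, 2, 3, 1).reshape(1, t_new * n_spatial, dim)
```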

```
oss2.exceptions.NoSuchKey: {'status': 404, 'x-oss-request-id': '664AB50BAB8D903237EA79DB', 'details': {'Code': 'NoSuchKey', 'Message': 'The specified key does not exist.', 'RequestId': '664AB50BAB8D903237EA79DB', 'HostId': 'dataset-hub.oss-cn-hangzhou.aliyuncs.com', 'Key': 'public-zip/modelscope/Youku-AliceMind/master/videos/pretrain/14111B121174Y8EbJ43JF-283-42187EFC3A83aBY1Ca55Y4a-Y27aY3Y8C1CCb4B7.mp4', 'EC': '0026-00000001', 'RecommendDoc': 'https://api.aliyun.com/troubleshoot?q=0026-00000001'}}
```
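The 404 simply means that key is absent from the dataset bucket. If only a few clips are missing, one option is to skip them and log the keys for follow-up; a rough sketch, assuming an already-authenticated `oss2.Bucket` and placeholder paths:

```python
# Skip videos whose keys are missing from the OSS bucket instead of crashing.
import os
import oss2

def download_available(bucket, keys, out_dir):
    os.makedirs(out_dir, exist_ok=True)
    missing = []
    for key in keys:
        try:
            bucket.get_object_to_file(key, os.path.join(out_dir, os.path.basename(key)))
        except oss2.exceptions.NoSuchKey:
            missing.append(key)  # 404: object was never uploaded or has been removed
    return missing
```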

evaluate_activitynet_qa v_iKclcQEl4zI_10
```
{'q': 'what is the safety factor of the flip',
 'a': 'secondary',
 'pred': 'The safety factor of the flip-backward-forward-backward-forward-backward-forward-backward-forward- [the "backward-forward" pair repeats for the remainder of the prediction] -forward-back'}
```
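The looping "backward-forward" prediction is classic degenerate decoding rather than an evaluation-script bug. With a Hugging Face-style `generate()`, it can usually be damped with a repetition penalty or an n-gram ban; the values below are illustrative starting points, not settings taken from the evaluation code.

```python
from transformers import GenerationConfig

# Anti-repetition decoding settings (starting points, not tuned for ActivityNet-QA).
anti_repeat = GenerationConfig(
    max_new_tokens=128,
    do_sample=False,
    repetition_penalty=1.2,   # >1.0 penalizes tokens that were already generated
    no_repeat_ngram_size=3,   # never emit the same 3-gram twice
)
# outputs = model.generate(**inputs, generation_config=anti_repeat)
```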

Knowledge distillation: "A good teacher is patient and consistent" (TensorFlow reference: https://github.com/google-research/big_vision/tree/main/big_vision/configs/proj/distill). Do you have plans to open-source the distillation code?
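For reference, that paper distills by matching the teacher's and student's predictive distributions. A generic PyTorch sketch of a temperature-scaled KL distillation loss follows; the temperature and scaling are conventional choices, not necessarily the authors' exact big_vision/TensorFlow settings.

```python
import torch
import torch.nn.functional as F

def distill_loss(student_logits, teacher_logits, tau=2.0):
    # Soft-label KD: KL(teacher || student) on temperature-softened distributions.
    s = F.log_softmax(student_logits / tau, dim=-1)
    t = F.softmax(teacher_logits / tau, dim=-1)
    # Scale by tau**2 so gradient magnitudes stay comparable to the hard-label case.
    return F.kl_div(s, t, reduction="batchmean") * tau * tau
```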

```python
def temporal_aggregation(self, image_features):
    T, N, D = image_features.shape
    ## D1: temporal cat (Just one line!)
    image_features = image_features.view(T * N, D)  # [T*N, D]
    ## D2: spatial pool +...
```
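A self-contained sketch of the two aggregation variants the snippet hints at; the project's actual `temporal_aggregation` may differ beyond the truncated part.

```python
import torch

def temporal_cat(image_features):              # D1: keep every token
    T, N, D = image_features.shape
    return image_features.reshape(T * N, D)    # [T*N, D]

def spatial_pool_then_cat(image_features):     # D2: pool patches per frame first
    # Mean-pool the N spatial tokens of each frame, then stack frames: [T, D].
    return image_features.mean(dim=1)
```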

step1 pretrain_projector_image_encoder.sh
step2 pretrain_projector_video_encoder.sh
step3 finetune_dual_encoder.sh
step4 eval/vcgbench/inference/run_ddp_inference.sh
step5 eval/vcgbench/gpt_evaluation/vcgbench_evaluate.sh

```sh
#!/bin/sh
export DATASET_DIR=/mnt2/ninghuayang/data/videogpt_plus_dataset

BASE_LLM_PATH=microsoft/Phi-3-mini-4k-instruct
VISION_TOWER=OpenGVLab/InternVideo2-Stage2_1B-224p-f4
IMAGE_VISION_TOWER=openai/clip-vit-large-patch14-336
PROJECTOR_TYPE=mlp2x_gelu
#PRETRAIN_VIDEO_MLP_PATH=MBZUAI/VideoGPT-plus_Phi3-mini-4k_Pretrain/mlp2x_gelu_internvideo2/mm_projector.bin
#PRETRAIN_IMAGE_MLP_PATH=MBZUAI/VideoGPT-plus_Phi3-mini-4k_Pretrain/mlp2x_gelu_clip_l14_336px/mm_projector.bin
PRETRAIN_VIDEO_MLP_PATH=results/mlp2x_gelu_internvideo2/mm_projector.bin
PRETRAIN_IMAGE_MLP_PATH=results/mlp2x_gelu_clip_l14_336px/mm_projector.bin
OUTPUT_DIR_PATH=results/videogpt_plus_finetune

deepspeed videogpt_plus/train/train.py \
    --lora_enable True --lora_r 128...
```
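The finetuning step enables LoRA through command-line flags. As a rough illustration of what `--lora_enable True --lora_r 128` typically maps to on the PEFT side (the alpha, dropout, and target modules below are assumptions, not values read from the training script):

```python
from peft import LoraConfig, get_peft_model

def wrap_with_lora(model, r=128, alpha=256, dropout=0.05):
    # Illustrative LoRA setup; target modules depend on the base LLM's layer names.
    cfg = LoraConfig(
        r=r,
        lora_alpha=alpha,
        lora_dropout=dropout,
        target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],
        task_type="CAUSAL_LM",
    )
    return get_peft_model(model, cfg)
```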

```
[h264 @ 0x16543c00] Missing reference picture, default is 65562
[h264 @ 0x16543c00] mmco: unref short failure
[h264 @ 0x16543c00] mmco: unref short failure
[h264 @ 0x16543c00] Missing reference picture, default...
```
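These `[h264 @ ...]` lines are ffmpeg decoder warnings for slightly non-conformant clips and are often harmless, but when a clip genuinely fails to decode, a fallback that lets the caller resample another video keeps the data loader alive. A sketch assuming decord as the reader:

```python
import decord

def safe_load_frames(path, indices):
    try:
        vr = decord.VideoReader(path, num_threads=1)
        return vr.get_batch(indices).asnumpy()
    except (decord.DECORDError, RuntimeError):
        return None  # caller falls back to a different clip
```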

run_distill.py
```python
from datasets import build_pretraining_dataset, build_multi_pretraining_dataset
from engines.engine_for_distill import train_one_epoch
from utils import NativeScalerWithGradNormCount as NativeScaler
from utils import multiple_pretrain_samples_collate
import utils
from models import *
```
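Those imports suggest a VideoMAE-style training loop with an AMP loss scaler. Below is a generic sketch of the underlying mixed-precision distillation step, written with plain `torch.cuda.amp` rather than the repository's `NativeScalerWithGradNormCount` helper, so the moving parts are explicit.

```python
import torch

def train_step(student, teacher, batch, optimizer, scaler, loss_fn):
    # scaler: torch.cuda.amp.GradScaler; loss_fn: e.g. a KD loss on logits/features.
    optimizer.zero_grad(set_to_none=True)
    with torch.cuda.amp.autocast():
        with torch.no_grad():
            t_out = teacher(batch)            # teacher runs frozen
        s_out = student(batch)
        loss = loss_fn(s_out, t_out)
    scaler.scale(loss).backward()
    scaler.unscale_(optimizer)
    grad_norm = torch.nn.utils.clip_grad_norm_(student.parameters(), 1.0)
    scaler.step(optimizer)
    scaler.update()
    return loss.item(), grad_norm.item()
```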