Jaeik Kim

5 comments by Jaeik Kim

Thank you for the prompt response. It seems to be working, but due to resource constraints, I used a batch size of 8 and a learning rate of 4e-5. (I’m...

Thank you for the detailed explanation. I'll refer to the code you provided and run more experiments. Thank you!

Hello,

```python
for epoch in range(epochs):
    losses = 0.
    adamerging_mtl_model.loading_weights()
    for dataset_name in exam_datasets:
        dataset = get_dataset(dataset_name, pretrained_model.val_preprocess,
                              location=args.data_location, batch_size=16)
        dataloader = get_dataloader_shuffle(dataset)
        for i, data in enumerate(tqdm.tqdm(dataloader)):
            data = ...
```
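For context, the loop above is only the outer part of the test-time adaptation; a minimal, self-contained sketch of the kind of entropy-minimization step that typically follows the truncated line is shown below. The names `softmax_entropy`, `merged_model`, and `coef_optimizer` are illustrative stand-ins based on the AdaMerging paper, not the repository's exact code.

```python
import torch

def softmax_entropy(logits: torch.Tensor) -> torch.Tensor:
    """Average Shannon entropy of the softmax distribution over the batch."""
    return -(logits.softmax(dim=-1) * logits.log_softmax(dim=-1)).sum(dim=-1).mean()

def adaptation_step(merged_model: torch.nn.Module,
                    coef_optimizer: torch.optim.Optimizer,
                    inputs: torch.Tensor,
                    dataset_name: str) -> float:
    """One unsupervised update of the merging coefficients on a test batch."""
    logits = merged_model(inputs, dataset_name)  # forward through the merged model
    loss = softmax_entropy(logits)               # no labels needed at test time
    coef_optimizer.zero_grad()
    loss.backward()
    coef_optimizer.step()
    return loss.item()
```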

As additional information for @EnnengYang: I am using an A6000 GPU with 48 GB of memory, and I run into a memory issue when processing the 4th of the 8 datasets...
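In case others hit the same issue: memory that keeps growing across datasets is often caused by accumulating the un-detached loss tensor or by per-dataset dataloaders staying alive. Below is a generic PyTorch cleanup sketch, not code from this repository.

```python
import gc
import torch

def free_memory_between_datasets() -> None:
    """Generic cleanup to call after deleting one dataset's dataloader and objects."""
    gc.collect()                              # reclaim unreachable Python objects
    if torch.cuda.is_available():
        torch.cuda.empty_cache()              # return cached CUDA blocks to the driver
        torch.cuda.reset_peak_memory_stats()  # restart per-dataset peak tracking
```

Calling this right after `del dataloader, dataset` at the end of each dataset iteration keeps buffers from earlier datasets from stacking up; accumulating `losses += loss.item()` rather than the raw loss tensor also prevents every batch's computation graph from being retained.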

As additional information for @EnnengYang and @kasurashan: I successfully reproduced the results, achieving an average score of 91.0 with layerwise++ / ViT-L-14, a batch size of 16, and...