Is the number of confusion words controlled by the `limit` in get_id2class.py?
> Hello, did it work? Hello, do you have the image bboxes for the Flickr30k features?
> > Well, at the moment I don't have npy files with these features. Nevertheless, in the literature, the most commonly used split for Flickr30K is the Karpathy split ( https://cs.stanford.edu/people/karpathy/deepimagesent/ ). > > You can download the anns file from that link. > > Good luck. > > I have tried Anderson's Caffe features and they work well. I hope the bbox is...
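For anyone who downloads those annotations, here is a minimal sketch of parsing the Karpathy split file for Flickr30K; the file name (dataset_flickr30k.json) and the key layout follow the format distributed at the page above and should be verified against your own copy.

```python
import json
from collections import Counter

# Load the Karpathy split annotations for Flickr30K.
# Assumed layout: {"images": [{"filename", "split", "sentences": [{"raw", ...}]}]}
with open("dataset_flickr30k.json") as f:
    anns = json.load(f)

# Count images per split (train / val / test).
print(Counter(img["split"] for img in anns["images"]))

# Collect the test-split filenames together with their captions.
test_images = {
    img["filename"]: [s["raw"] for s in img["sentences"]]
    for img in anns["images"]
    if img["split"] == "test"
}
print(len(test_images), "test images")
```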
> Hello, have you implemented this function? Can I learn from your work? Thanks a lot!
> Also, I have two small questions. (a) I have tested custom images (MSCOCO 2014 train) on the SGDet model, and I got two files: custom_data_info.json and custom_prediction.json. In custom_prediction.json, there...
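If it helps to interpret those two files, below is a rough sketch of reading them back together. The key names (idx_to_files, ind_to_classes, bbox, bbox_labels, bbox_scores) are assumptions based on the Scene-Graph-Benchmark custom-image output format, not something confirmed in this thread; check them against your own JSON files.

```python
import json

# Sketch: join the SGDet metadata with the per-image predictions.
with open("custom_data_info.json") as f:
    info = json.load(f)
with open("custom_prediction.json") as f:
    preds = json.load(f)

classes = info["ind_to_classes"]
for img_idx, file_path in enumerate(info["idx_to_files"]):
    det = preds[str(img_idx)]          # predictions assumed keyed by image index
    boxes = det["bbox"]                # [x1, y1, x2, y2] per detected region
    labels = [classes[i] for i in det["bbox_labels"]]
    scores = det["bbox_scores"]
    print(file_path, len(boxes), "boxes; top label:", labels[0], scores[0])
```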
> Can your work extract a fixed number of 36 regions per image?
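When an extractor returns a variable number of regions, one common workaround (generic post-processing, not specific to this repository) is to pad or truncate the extracted features to exactly 36 rows. A minimal sketch, assuming the features come out as an (N, 2048) array with matching (N, 4) boxes sorted by confidence:

```python
import numpy as np

def fix_num_regions(features, boxes, num_regions=36):
    """Pad or truncate features/boxes to a fixed region count.

    Assumes `features` is (N, 2048), `boxes` is (N, 4), and regions are
    already sorted by score, so truncation keeps the top-scoring ones;
    padding appends zero rows.
    """
    n = features.shape[0]
    if n >= num_regions:
        return features[:num_regions], boxes[:num_regions]
    pad_f = np.zeros((num_regions - n, features.shape[1]), dtype=features.dtype)
    pad_b = np.zeros((num_regions - n, boxes.shape[1]), dtype=boxes.dtype)
    return np.vstack([features, pad_f]), np.vstack([boxes, pad_b])
```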
> > Great to know that. I have added the link to your project to the Readme file. > > BTW, I noticed that the attribute head is disabled [here](https://github.com/MILVLG/bottom-up-attention.pytorch/blob/57e66ef2cc3732c2403aa78ef68579e57064c41e/configs/bua-caffe/extract-bua-caffe-r101.yaml#L11)....
Some model parameters or buffers are not found in the checkpoint: backbone.res4.0.conv1.norm.num_batches_tracked backbone.res4.0.conv2.norm.num_batches_tracked backbone.res4.0.conv3.norm.num_batches_tracked backbone.res4.0.shortcut.norm.num_batches_tracked backbone.res4.1.conv1.norm.num_batches_tracked backbone.res4.1.conv2.norm.num_batches_tracked backbone.res4.1.conv3.norm.num_batches_tracked backbone.res4.10.conv1.norm.num_batches_tracked backbone.res4.10.conv2.norm.num_batches_tracked backbone.res4.10.conv3.norm.num_batches_tracked backbone.res4.11.conv1.norm.num_batches_tracked backbone.res4.11.conv2.norm.num_batches_tracked backbone.res4.11.conv3.norm.num_batches_tracked backbone.res4.12.conv1.norm.num_batches_tracked backbone.res4.12.conv2.norm.num_batches_tracked backbone.res4.12.conv3.norm.num_batches_tracked backbone.res4.13.conv1.norm.num_batches_tracked backbone.res4.13.conv2.norm.num_batches_tracked...
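That particular warning is usually harmless: Caffe-converted checkpoints contain no num_batches_tracked buffers, and those counters only matter when BatchNorm statistics are updated during training, not at inference time. A quick way to confirm that these are the only affected keys, sketched with plain PyTorch (the checkpoint filename and the guess that the state dict sits at the top level or under a "model" key are placeholders, not confirmed for this repo):

```python
import torch

# Sketch: inspect the converted checkpoint for the buffers named in the warning.
ckpt = torch.load("bua-caffe-frcn-r101_with_attributes.pth", map_location="cpu")
state = ckpt.get("model", ckpt)

tracked = [k for k in state if k.endswith("num_batches_tracked")]
print(len(tracked), "num_batches_tracked buffers present in checkpoint")
# If this prints 0, the converted Caffe weights simply never contained these
# counters, and inference results are unaffected.
```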
> No, I checked the code but never found out why; perhaps only the author knows. So I am ready to go back to the original bottom-up-attention, which works very well.
V: 17320069772

> Re: [MILVLG/bottom-up-attention.pytorch] test-caffe-r101-fix36.yaml (Issue #99): Can you give me a contact information if it...