Different vocab size between `data.vocab` and `embs_ann`?

Open · yingShen-ys opened this issue on Jul 18 '22 · 2 comments

Hi, I noticed that the `data.vocab` file stored with the baseline model has a different vocabulary length from the language embedding stored in the pretrained model.

For the baseline model `et_plus_h`, the `data.vocab` file has `Vocab(2554)` for words, but if I load the pretrained checkpoint from `baseline_models/et_plus_h/latest.pth`, the embedding layer `model.embs_ann.lmdb_simbot_edh_vocab_none.weight` has shape `torch.Size([2788, 768])`.

Did I miss something?
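
For reference, here is roughly the comparison I mean. This is a sketch: the checkpoint layout (a top-level `"model"` key) and the exact state-dict key names are assumptions based on what I printed above, and loading `data.vocab` requires the repo's `Vocab` class to be importable so it can be unpickled.

```python
import torch

# Load the pretrained checkpoint; ET-style checkpoints often keep the
# state dict under a top-level "model" key (assumption -- adjust if not).
ckpt = torch.load("baseline_models/et_plus_h/latest.pth", map_location="cpu")
state_dict = ckpt.get("model", ckpt)

# Key name taken from the issue text above; it may differ in your checkpoint.
weight = state_dict["embs_ann.lmdb_simbot_edh_vocab_none.weight"]
print(weight.shape)  # torch.Size([2788, 768])

# data.vocab is assumed to be a torch-saved pickle of Vocab objects, so the
# ET code must be on the PYTHONPATH for this load to succeed.
vocab = torch.load("baseline_models/et_plus_h/data.vocab")
print(vocab)  # reports Vocab(2554) for words
```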

yingShen-ys · Jul 18 '22

Hi @yingShen-ys, any idea whether such a discrepancy exists in the following cases:

I haven't previously examined the saved models in enough detail to notice a discrepancy like this, so I'm not sure offhand whether your intuition of expecting them to be the same is correct, although it is plausible. I'll take a deeper look at the ET code and get back to you on this.

aishwaryap · Jul 25 '22

I am not training a new model but rather using the pretrained model from `baseline_models` downloaded using this repo.

The intuition is that `model.embs_ann.lmdb_simbot_edh_vocab_none.weight` is the weight of the word embedding layer and `data.vocab` stores the word vocabulary. So the word vocabulary size should be 2554 according to `data.vocab`, but checking the word embedding layer in the pretrained model suggests it accepts a larger vocabulary of size 2788 rather than 2554.

I think the pretrained model should have a corresponding `data.vocab` that is of size `Vocab(2788)` rather than `Vocab(2554)`?
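
A small sketch of that intuition, assuming the annotation embedding is built directly from the vocabulary size (the names and construction here are illustrative, not the actual ET code):

```python
import torch.nn as nn

# If the checkpoint had been trained against a data.vocab with Vocab(2554),
# the word embedding would have exactly one row per vocabulary entry.
demb = 768        # embedding width seen in the checkpoint
vocab_size = 2554 # len() of the word Vocab in data.vocab
embs_ann = nn.Embedding(vocab_size, demb)
print(embs_ann.weight.shape)  # torch.Size([2554, 768]), not [2788, 768]
```

Finding `[2788, 768]` instead suggests the checkpoint was trained with a larger vocabulary than the one shipped in `data.vocab`.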

yingShen-ys · Jul 26 '22