
question about encoding layer (dictionary learning)

Open hhyhcr449 opened this issue 6 years ago • 3 comments

Great job! But I have a question about the codebook. In my understanding, dictionary learning is equivalent to sparse coding, so why is the size of the codebook (32, 128)? Thank you very much!

hhyhcr449 avatar Jan 05 '20 05:01 hhyhcr449

32 and 128 are the number of codewords and the dimension of each codeword, respectively.
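To make the shape concrete, here is a minimal sketch (not the library's actual API) of what a codebook of that size looks like as a learnable parameter, with K = 32 codewords of dimension D = 128:

```python
import numpy as np

# Sketch only: the codebook is a learnable parameter holding
# K codewords, each a D-dimensional vector.
K, D = 32, 128              # number of codewords, codeword dimension
codebook = np.zeros((K, D)) # in practice, randomly initialized and learned
print(codebook.shape)       # (32, 128)
```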

qiulesun avatar Jan 05 '20 09:01 qiulesun

> 32 and 128 are the number and dimension of codebook respectively.

Yes. In my understanding, dictionary learning is a branch of signal processing and machine learning that aims at finding a frame (called a dictionary) in which some training data admits a sparse representation.

What I want to know is: can this codebook guarantee a sparse representation? Thank you!

hhyhcr449 avatar Jan 05 '20 14:01 hhyhcr449

There are many coding methods based on a codebook, such as VLAD and sparse coding. The Encoding layer in Hang Zhang's CVPR 2018 paper is a VLAD-like coding method, not a sparse-coding one. You can review VLAD or NetVLAD to better understand how the encoding works.
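A minimal NumPy sketch of this VLAD-like aggregation (shapes and variable names are illustrative, not taken from the library): each descriptor is soft-assigned to codewords via a softmax over scaled squared residual norms, and the weighted residuals are summed per codeword. Note the assignment weights are dense (a softmax), which is why no sparsity is guaranteed:

```python
import numpy as np

# Hypothetical sizes for illustration: N descriptors of dimension D,
# a codebook of K codewords (K=32, D=128 as in the question above).
rng = np.random.default_rng(0)
N, K, D = 100, 32, 128
X = rng.standard_normal((N, D))   # input descriptors
C = rng.standard_normal((K, D))   # codebook (codewords)
s = np.ones(K)                    # learnable smoothing factors, one per codeword

# Residuals r[i, k] = x_i - c_k, shape (N, K, D)
R = X[:, None, :] - C[None, :, :]

# Soft-assignment weights: softmax over codewords of -s_k * ||r_ik||^2
logits = -s * np.sum(R ** 2, axis=2)          # (N, K)
logits -= logits.max(axis=1, keepdims=True)   # numerical stability
W = np.exp(logits)
W /= W.sum(axis=1, keepdims=True)             # rows sum to 1 (dense, not sparse)

# Aggregated encoding: e_k = sum_i w_ik * r_ik, shape (K, D)
E = np.einsum('nk,nkd->kd', W, R)
print(E.shape)  # (32, 128)
```

Because the softmax spreads weight over all K codewords, every codeword typically receives a nonzero assignment from every descriptor, unlike sparse coding where most coefficients are exactly zero.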

qiulesun avatar Jan 06 '20 01:01 qiulesun