Question about the encoding layer (dictionary learning)
Great job! But I have a question about the codebook. In my understanding, dictionary learning is equivalent to sparse coding, so why is the size of the codebook (32, 128)? Thank you very much!
32 and 128 are the number of codewords and their dimension, respectively.
Yes. In my understanding, dictionary learning is a branch of signal processing and machine learning that aims to find a frame (called a dictionary) in which some training data admits a sparse representation.
What I want to know is: can this codebook guarantee a sparse representation? Thank you!
There are many coding methods based on a codebook, such as VLAD and sparse coding. The encoding in Hang Zhang's CVPR2018 paper is a VLAD-like coding method rather than sparse coding, so it does not enforce sparsity. You can review VLAD or NetVLAD to better understand this encoding layer.
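To make the difference concrete, here is a minimal NumPy sketch of a VLAD-like encoding with soft assignment: each descriptor is assigned (softly, via a softmax over distances) to all 32 codewords, and the weighted residuals are aggregated into a (32, 128) output. The function name `encode` and the smoothing factors `s` are illustrative assumptions, not the repository's actual API; note that the assignment weights are dense, not sparse.

```python
import numpy as np

def encode(X, C, s):
    """VLAD-like soft-assignment encoding (illustrative sketch).

    X: (N, D) input descriptors
    C: (K, D) learned codebook (here K=32 codewords of dimension D=128)
    s: (K,)  learnable smoothing factors for the soft assignment
    Returns: (K, D) aggregated residual encoding
    """
    # Residuals between every descriptor and every codeword: (N, K, D)
    R = X[:, None, :] - C[None, :, :]
    # Soft-assignment weights: softmax over codewords of -s_k * ||r_ik||^2
    sq = (R ** 2).sum(axis=-1)                    # (N, K)
    logits = -s[None, :] * sq
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    A = np.exp(logits)
    A /= A.sum(axis=1, keepdims=True)             # weights are dense, not sparse
    # Aggregate weighted residuals over all descriptors: (K, D)
    return (A[:, :, None] * R).sum(axis=0)

rng = np.random.default_rng(0)
X = rng.standard_normal((100, 128))  # 100 descriptors of dimension 128
C = rng.standard_normal((32, 128))   # codebook: 32 codewords, dimension 128
s = np.ones(32)
E = encode(X, C, s)
print(E.shape)  # (32, 128)
```

In sparse coding, by contrast, each descriptor would be reconstructed from only a few dictionary atoms (e.g. via an L1 penalty); the softmax assignment above spreads weight over all codewords, which is why this codebook does not by itself guarantee sparsity.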