
In-Context Learning, Relevant Series, and self.label_len

Open crkimm opened this issue 6 months ago • 1 comments

Hello, I’m very interested in your paper and algorithm — thank you for sharing the code.

I have a question regarding the In-Context Learning (Prompting Mechanism) and the Relevant Series described in the paper.

Based on the figure in the paper (attached image), I’m wondering whether the relevant series corresponds to the self.label_len variable in data_loader.py. Could you please clarify this point?


If not, could you please explain the role of self.label_len in the code?

Thanks :)

crkimm avatar Jul 21 '25 08:07 crkimm

Thanks for your question. label_len here is used for next-token prediction, not for the relevant series. Given n tokens in a training sample, seq_len = n * token_len is the total length of the sample, which contains n-1 next-token training signals. label_len denotes the input part of the sample (the first n-1 tokens). Shifting that input forward by token_len yields the ground-truth values (tokens 2 through n) used as the optimization objective.
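A minimal sketch of the slicing described above, assuming the variable names (token_len, label_len) follow the convention in data_loader.py; the concrete sizes here are illustrative, not taken from the repo:

```python
import numpy as np

# Illustrative sizes: a sample of n tokens, each token_len points long.
token_len = 96
n = 8
seq_len = n * token_len          # total length of one training sample
label_len = seq_len - token_len  # input part: the first n-1 tokens

sample = np.arange(seq_len, dtype=float)  # stand-in for one series sample

# Input: tokens 1..n-1.  Target: tokens 2..n, i.e. the input
# shifted forward by one token (token_len points).
inputs = sample[:label_len]
targets = sample[token_len:]

# Both spans cover n-1 tokens, so token k of `inputs` is trained to
# predict token k+1 of the series, which sits at the same offset in
# `targets`.
assert inputs.shape == targets.shape == (label_len,)
```

With this layout, one forward pass over the input yields all n-1 next-token predictions at once, and the loss is computed against the shifted slice.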

WenWeiTHU avatar Jul 22 '25 02:07 WenWeiTHU