
ValueError: lags cannot go further than history length, found lag 24 while history length is only 72

Suraez opened this issue 1 year ago · 0 comments

Hi,

I am using the Autoformer model API from Hugging Face Transformers with PyTorch, and I am stuck on this error:

ValueError: lags cannot go further than history length, found lag 24 while history length is only 72

My data looks like this:

[screenshot of the dataframe]

- Input features: cpu and memory
- Target feature: requests
- Data granularity: minutes
- Total training size: 24 hours × 60 minutes = 1440 minutes

This is my custom Dataset:


class TimeSeriesWindowDataset(Dataset):
    def __init__(self, df, context_length, prediction_length):
        self.context_length = context_length
        self.prediction_length = prediction_length

        self.values = df['requests'].values
        self.features = df[['memory', 'cpu']].values

        self.length = len(df) - context_length - prediction_length + 1

        
        self.static_real = np.array([df['cpu'].mean(), df['memory'].mean()])
        self.static_cat = np.array([0])

    def __len__(self):
        return self.length

    def __getitem__(self, idx):
        # Context (past)
        past_values = self.values[idx : idx + self.context_length]
        past_time_features = self.features[idx : idx + self.context_length]

        # Prediction (future)
        future_values = self.values[
            idx + self.context_length : idx + self.context_length + self.prediction_length
        ]
        future_time_features = self.features[
            idx + self.context_length : idx + self.context_length + self.prediction_length
        ]

        # Observed masks
        past_observed_mask = ~np.isnan(past_values)
        future_observed_mask = ~np.isnan(future_values)

        return {
            'past_values': torch.tensor(past_values, dtype=torch.float),
            'past_time_features': torch.tensor(past_time_features, dtype=torch.float),
            'past_observed_mask': torch.tensor(past_observed_mask, dtype=torch.float),
            'future_values': torch.tensor(future_values, dtype=torch.float),
            'future_time_features': torch.tensor(future_time_features, dtype=torch.float),
            'future_observed_mask': torch.tensor(future_observed_mask, dtype=torch.float),
            'static_real_features': torch.tensor(self.static_real, dtype=torch.float),
            'static_categorical_features': torch.tensor(self.static_cat, dtype=torch.long),
        }
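The windowing arithmetic can be sanity-checked on its own. This is a minimal self-contained sketch (synthetic values standing in for my 1440-minute `requests` column), independent of the class above:

```python
import numpy as np

values = np.arange(1440, dtype=float)  # stand-in for 1440 minutes of 'requests'
context_length, prediction_length = 24, 48

# Same formula as self.length in the Dataset above
num_windows = len(values) - context_length - prediction_length + 1
print(num_windows)  # 1369

# First window: 24 past steps, 48 future steps
idx = 0
past = values[idx : idx + context_length]
future = values[idx + context_length : idx + context_length + prediction_length]
print(past.shape, future.shape)  # (24,) (48,)
```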

Then I am using DataLoader from torch.utils.data to create batches of data; the code is:

loader = DataLoader(dataset, batch_size=64, shuffle=True)
for batch in loader:
    print("past_values: ", batch['past_values'].shape)
    print("past_time_features: ", batch['past_time_features'].shape)
    print("past_observed_mask: ", batch['past_observed_mask'].shape)
    print("future_values: ", batch['future_values'].shape)
    print("future_time_features: ", batch['future_time_features'].shape)
    break

The config for the Autoformer model is shown below:

config = AutoformerConfig(context_length=24, prediction_length=48, lags_sequence=[1, 2, 3, 4, 5, 6, 7, 11, 12, 13, 23, 24])
model = AutoformerModel(config)
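In case it is relevant: my reading of the Transformers time-series docs is that `past_values` must be longer than `context_length` alone, because lagged values are gathered from earlier history. A small sketch of that arithmetic with my config (assuming the required past length is `context_length + max(lags_sequence)`, which is how I understand the docs):

```python
# Length arithmetic for my config (assumption: the model needs
# context_length + max(lags_sequence) steps of past history).
context_length = 24
prediction_length = 48
lags_sequence = [1, 2, 3, 4, 5, 6, 7, 11, 12, 13, 23, 24]

required_past_length = context_length + max(lags_sequence)
print(required_past_length)  # 48
```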

And when I run

output = model.forward(
    past_values=batch['past_values'],
    past_time_features=batch['past_time_features'],
    past_observed_mask=batch['past_observed_mask'],
    future_values=batch['future_values'],
    future_time_features=batch['future_time_features'],
)

I get the error stated in the issue title. Things I have tried so far:

  1. Different combinations of lags and history length (lags > history length, lags = history length, lags < history length)
  2. Setting context_length and prediction_length to the same value

These are all the imports:

from transformers import AutoformerConfig, AutoformerModel
import pandas as pd
import numpy as np
import torch
from torch.utils.data import Dataset, DataLoader

Please let me know if you need any additional information. If somebody could help me with this problem, I would highly appreciate it. :)

Suraez · Apr 03 '25 00:04