
specifying multiple timesteps for an LSTM

Open · taranraj123 opened this issue 8 years ago · 0 comments

The current method of setting up an LSTM with multiple timesteps appears to be broken.

Let me begin by saying that I'm working on a time-series forecasting problem: my input training data has 346 training samples, with 82 timesteps per sample and 39 features per timestep.

First I tried setting up the LSTM in a fashion similar to a working Python example that I have:

> class(train_x)
[1] "array"
> dim(train_x)
[1] 346  82  39
> class(train_y) 
[1] "matrix"
> dim(train_y) # I've also tried setting up train_y as an array with dimensions (346, 82, 1) and get the same error message
[1] 346  82

> model <- Sequential()
> model$add(LSTM(25, input_shape = c(82, 39), return_sequences = TRUE))
> model$add(Dropout(0.2))
> model$add(LSTM(25, return_sequences = TRUE))
> model$add(Dropout(0.2))
> model$add(Dense(1))
> model$add(Activation('linear'))
> keras_compile(model, loss = 'mean_absolute_percentage_error', optimizer = 'rmsprop')
> history = keras_fit(model, train_x, train_y, epochs = 10, batch_size = 1, validation_split = 0.1, verbose = 2, shuffle = FALSE)

Error in py_call_impl(callable, dots$args, dots$keywords) : 
  Matrix type cannot be converted to python (only integer, numeric, complex, logical, and character matrixes can be converted)
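For reference, here is how I'm building the 3D target array I mentioned above (a base-R sketch with dummy data standing in for my real targets; this didn't resolve the error for me, but it shows the coercion I'm attempting):

```r
# Dummy stand-in for the real training targets, same shape as reported:
# 346 samples x 82 timesteps
train_y <- matrix(rnorm(346 * 82), nrow = 346, ncol = 82)

# Coerce to a plain numeric 3D array (samples, timesteps, 1), giving each
# timestep an explicit single output feature to match Dense(1) with
# return_sequences = TRUE. array() refills R's column-major flattening,
# so train_y_3d[, , 1] is identical to train_y.
train_y_3d <- array(as.numeric(train_y), dim = c(346, 82, 1))

storage.mode(train_y_3d)  # "double"
dim(train_y_3d)           # 346 82 1
```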

Since my original method didn't work, I attempted to use a 2D matrix representation as described in the documentation. But when I do this, I get an error when adding the LSTM layer to the model:

> dim(train_x)
[1]  346 3198
> class(train_x)
[1]  "matrix"
> length(train_y)
[1]  28372
> class(train_y)
[1]  "numeric"

> model <- Sequential()
> model$add(LSTM(25, input_shape = c(3198), return_sequences = TRUE))

Error in py_call_impl(callable, dots$args, dots$keywords) : 
  ValueError: Input 0 is incompatible with layer lstm_8: expected ndim=3, found ndim=2

Detailed traceback: 
  File "/usr/lib64/python2.7/site-packages/keras/models.py", line 464, in add
    layer(x)
  File "/usr/lib64/python2.7/site-packages/keras/layers/recurrent.py", line 482, in __call__
    return super(RNN, self).__call__(inputs, **kwargs)
  File "/usr/lib64/python2.7/site-packages/keras/engine/topology.py", line 559, in __call__
    self.assert_input_compatibility(inputs)
  File "/usr/lib64/python2.7/site-packages/keras/engine/topology.py", line 458, in assert_input_compatibility
    str(K.ndim(x)))
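From the traceback it looks like the LSTM layer always expects a 3D (samples, timesteps, features) input, so presumably the 2D matrix would need reshaping back to 3D before fitting. Here is a small base-R sketch of that round trip, using toy data and assuming the 3198 columns came from R's own column-major flattening (which is what `matrix()` does):

```r
# Toy dimensions mirroring the real data (fewer samples for brevity)
n_samples   <- 4
n_timesteps <- 82
n_features  <- 39

# Original 3D data, then the 2D flattening described above
x3d <- array(rnorm(n_samples * n_timesteps * n_features),
             dim = c(n_samples, n_timesteps, n_features))
x2d <- matrix(x3d, nrow = n_samples)  # column-major flatten: 4 x 3198

# Inverse: array() refills in the same column-major order,
# so the round trip recovers the original layout exactly.
x3d_back <- array(x2d, dim = c(n_samples, n_timesteps, n_features))

identical(x3d_back, x3d)  # TRUE
```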

The vignette examples use an Embedding layer, and I'm not quite clear on how/why I would need to use one for my application. Thanks for your time & help!

taranraj123 · Dec 07 '17 21:12