Dense layer name incorrectly assigned as "GRU"
Hi, I'm working with hyperas and found something weird. I have a Dense layer, but the hyperas temp file incorrectly assigned a "GRU" name to it. Below is the code extracted from the temp_model.py file.
```python
model = Sequential()
model.add(GRU(space['GRU']))
model.add(Dense(space['GRU_1']))  # Dense layer, but its hyperparameter is named 'GRU_1'
model.add(Activation(space['Activation']))
model.add(Dropout(space['Dropout']))
model.add(Dense(space['GRU_2'], name='last_dense'))  # also mis-named 'GRU_2'
model.add(Activation(space['Activation_1']))
model.add(Dropout(space['Dropout_1']))
model.add(Dense(2))
model.add(Activation('softmax'))
print(model.summary())

parallel_model = multi_gpu_model(model, gpus=2)
parallel_model.compile(loss='categorical_crossentropy',
                       metrics=['accuracy', auc, binary_FP, binary_FN],
                       optimizer=space['optimizer'])
parallel_model.fit_generator(BatchGenerator(XX_train, fre_train, YY_train, batch_size=bs),
                             steps_per_epoch=steps, epochs=1, use_multiprocessing=True)
score, acc, Auc, fp, fn = parallel_model.evaluate_generator(
    BatchGenerator(XX_test, fre_test, YY_test, batch_size=1), verbose=1)
err = fp + fn
return {'loss': err, 'status': STATUS_OK, 'model': parallel_model}
```
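For reference, the hyperas template that produces a file like this presumably looked roughly as follows (a reconstruction inferred from temp_model.py and get_space() below, not the exact source). The second and fifth `{{choice(...)}}` markers belong to `Dense` layers, yet hyperas labeled them `GRU_1` and `GRU_2`:

```python
# Assumed reconstruction of the original hyperas template, for illustration only
from hyperas.distributions import choice, uniform

model = Sequential()
model.add(GRU({{choice([64, 128, 256])}}))    # labeled 'GRU' (correct)
model.add(Dense({{choice([64, 128, 256])}}))  # labeled 'GRU_1' (expected 'Dense')
model.add(Activation({{choice(['tanh', 'selu', 'sigmoid'])}}))
model.add(Dropout({{uniform(0, 1)}}))
model.add(Dense({{choice([64, 128, 256])}}, name='last_dense'))  # labeled 'GRU_2'
model.add(Activation({{choice(['tanh', 'selu', 'sigmoid'])}}))
model.add(Dropout({{uniform(0, 1)}}))
model.add(Dense(2))
model.add(Activation('softmax'))
```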
```python
def get_space():
    return {
        'GRU': hp.choice('GRU', [64, 128, 256]),
        'GRU_1': hp.choice('GRU_1', [64, 128, 256]),
        'Activation': hp.choice('Activation', ['tanh', 'selu', 'sigmoid']),
        'Dropout': hp.uniform('Dropout', 0, 1),
        'GRU_2': hp.choice('GRU_2', [64, 128, 256]),
        'Activation_1': hp.choice('Activation_1', ['tanh', 'selu', 'sigmoid']),
        'Dropout_1': hp.uniform('Dropout_1', 0, 1),
        'optimizer': hp.choice('optimizer', ['rmsprop', 'adam', 'adagrad', 'adadelta']),
    }
```
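(Side note: `hp.choice` records the *index* of the selected option, not the value itself, so a results dict reports e.g. `'GRU': 2` for 256 units. hyperopt's `space_eval` can map it back to actual values; a minimal sketch using the space above, with a made-up `best` dict for illustration:)

```python
from hyperopt import space_eval

# 'best' is the kind of dict fmin / hyperas returns; the values here are illustrative
best = {'GRU': 2, 'GRU_1': 1, 'Activation': 0, 'Dropout': 0.3,
        'GRU_2': 0, 'Activation_1': 2, 'Dropout_1': 0.5, 'optimizer': 1}
print(space_eval(get_space(), best))
# -> {'GRU': 256, 'GRU_1': 128, 'Activation': 'tanh', 'Dropout': 0.3,
#     'GRU_2': 64, 'Activation_1': 'sigmoid', 'Dropout_1': 0.5, 'optimizer': 'adam'}
```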
I wonder if this is a bug or expected behavior. Thanks.
Please post your full example as a gist!
This happens to me as well
https://gist.github.com/romanovzky/a7fd3a7ede09c8c81b06bc5965c34402
The best model:
```python
{'GRU': 2,
 'GRU_1': 1,
 'epochs': 1,
 'recurrent_dropout': 0.7371698374615214,
 'recurrent_dropout_1': 0.6517968154887782,
 'recurrent_dropout_2': 0.4371162594318422}
```
You can see that both the layer names and the attribute names come out wrong: my model has two plain Dropout layers and only one recurrent_dropout, yet the dict lists three recurrent_dropout entries.
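Until the naming is fixed, one possible workaround is to skip hyperas's template parsing and drive the search with hyperopt directly, where every label is explicit. A minimal sketch; the names (`gru_units`, `train_and_score`, ...) are illustrative placeholders, not hyperas API:

```python
from hyperopt import hp, fmin, tpe, Trials, STATUS_OK

# Explicitly named search space: no auto-generated labels involved
space = {
    'gru_units': hp.choice('gru_units', [64, 128, 256]),
    'dense_units': hp.choice('dense_units', [64, 128, 256]),
    'dropout': hp.uniform('dropout', 0, 1),
    'recurrent_dropout': hp.uniform('recurrent_dropout', 0, 1),
}

def objective(params):
    # build, train, and evaluate the Keras model using params['gru_units'],
    # params['recurrent_dropout'], etc.; train_and_score is a placeholder
    err = train_and_score(params)
    return {'loss': err, 'status': STATUS_OK}

best = fmin(objective, space, algo=tpe.suggest, max_evals=50, trials=Trials())
```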