MLHyperparameterTuning

Add control of iteration used in testing

Open · marabout2015 opened this issue on Oct 15, 2019 · 0 comments

Currently, the testing script uses the maximum number of iterations of the trained model to score the data. Add an "early_stopping_rounds" argument to the training script so that it records the best iteration found on validation data, and add an argument to the testing script that controls whether that best iteration is the one used in scoring.
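A minimal sketch of the requested behavior, assuming the scripts train a LightGBM model; the argparse flag names `--early_stopping_rounds` and `--use_best_iteration` are hypothetical illustrations, not the repo's actual interface.

```python
# Sketch only: assumes a LightGBM classifier; flag names are hypothetical.
import argparse

import lightgbm as lgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

parser = argparse.ArgumentParser()
parser.add_argument("--early_stopping_rounds", type=int, default=25)  # hypothetical flag
parser.add_argument("--use_best_iteration", action="store_true")      # hypothetical flag
args = parser.parse_args()

X, y = make_classification(n_samples=1000, random_state=0)
X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=0)

# Training: early stopping tracks the best iteration on the validation set
# and stores it on the fitted model as best_iteration_.
model = lgb.LGBMClassifier(n_estimators=500)
model.fit(
    X_train,
    y_train,
    eval_set=[(X_valid, y_valid)],
    callbacks=[lgb.early_stopping(args.early_stopping_rounds)],
)

# Testing: num_iteration=None scores with the best iteration recorded above;
# num_iteration=0 keeps the current behavior of using all trained trees.
num_iteration = None if args.use_best_iteration else 0
scores = model.predict_proba(X_valid, num_iteration=num_iteration)
print("best iteration:", model.best_iteration_, "scores shape:", scores.shape)
```

Note that passing `num_iteration=None` to `predict_proba` makes LightGBM default to the best iteration when early stopping was used, so the testing-script flag only needs to toggle between that default and an explicit "all iterations" value.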
