How to setLogPriors for Naive Bayes model during cross validation?
I am using cross-validation to estimate the performance of my model. Right now I call it like this:

```java
ClassificationMetrics vm = new Validator<>(ClassificationMetrics.class, configuration)
        .validate(new KFoldSplitter(10).split(trainingDataframe),
                  new MultinomialNaiveBayes.TrainingParameters());
```
In `com.datumbox.framework.core.machinelearning.common.abstracts.algorithms.AbstractNaiveBayes`, I see a `setLogPriors` method that could probably be used to tune the model. (I want to create a DET graph of the model's performance by varying the prior probabilities.) Is there a way to set the prior probability of the different labels during cross-validation? Thanks.
Not at the moment. Everything inside the `ModelParameters` classes is estimated during training. You can indirectly affect the prior probabilities by resampling the dataset, but you can't set them directly. This is a limitation that may be addressed in upcoming versions.
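To illustrate the resampling workaround: since the priors are estimated from class frequencies at training time, duplicating records of a class raises its estimated prior. The sketch below operates on a plain list of labels rather than a Datumbox `Dataframe` (the `PriorResampler` class and `oversampleToPrior` method are illustrative names, not part of the framework); you would apply the same idea by duplicating records in your training data before splitting.

```java
import java.util.ArrayList;
import java.util.List;

public class PriorResampler {
    // Sketch: duplicate records carrying `label` until that label's empirical
    // prior (its fraction of all records) reaches at least `targetPrior`.
    // A Naive Bayes model trained on the resampled data will then estimate
    // a correspondingly larger prior for that class.
    public static List<String> oversampleToPrior(List<String> records,
                                                 String label,
                                                 double targetPrior) {
        if (targetPrior >= 1.0) {
            throw new IllegalArgumentException("targetPrior must be < 1.0");
        }
        List<String> result = new ArrayList<>(records);
        List<String> positives = new ArrayList<>();
        for (String r : records) {
            if (r.equals(label)) {
                positives.add(r);
            }
        }
        if (positives.isEmpty()) {
            return result; // nothing to duplicate
        }
        int i = 0;
        while ((double) countLabel(result, label) / result.size() < targetPrior) {
            result.add(positives.get(i % positives.size()));
            i++;
        }
        return result;
    }

    static int countLabel(List<String> records, String label) {
        int c = 0;
        for (String r : records) {
            if (r.equals(label)) {
                c++;
            }
        }
        return c;
    }

    public static void main(String[] args) {
        // "spam" starts at a prior of 1/4 = 0.25; oversample it to >= 0.4.
        List<String> data = new ArrayList<>(List.of("spam", "ham", "ham", "ham"));
        List<String> resampled = oversampleToPrior(data, "spam", 0.4);
        double prior = (double) countLabel(resampled, "spam") / resampled.size();
        System.out.println(prior);
    }
}
```

For a DET graph you would repeat the cross-validation once per target prior, resampling the training folds each time, rather than varying `setLogPriors` directly.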