Is cross-validation 'valid' after a hyperparameter search has been done? #576
-
Hi, I want to do k-fold cross-validation of my NN model (only for my NN model, not to compare it against other ML methods). Is it valid in theory to do a random search with keras_tuner and then do k-fold cross-validation separately with the optimal parameters (not necessarily with keras_tuner, just generally speaking)? I don't see why the two would have to be incorporated, unless there are hyperparameter settings that are relatively close: both give good model performance, but one might give better k-fold cross-validation test predictions. Thank you
-
To do this, you will have to override the `Tuner.run_trial` function to do the k-fold yourself. It is currently not supported in the codebase.
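A minimal sketch of what that override could look like, assuming keras-tuner 1.1+ (where `run_trial` may return the trial's objective value as a float), a user-supplied `build_model(hp)` function that returns a compiled model, and NumPy arrays `x_train`, `y_train`; the fold count, epoch count, and directory names are illustrative:

```python
import numpy as np
import keras_tuner
from sklearn.model_selection import KFold

class CVTuner(keras_tuner.RandomSearch):
    """Random search in which every trial is scored by k-fold CV."""

    def run_trial(self, trial, x, y, epochs=10, n_splits=5, **kwargs):
        val_losses = []
        kfold = KFold(n_splits=n_splits, shuffle=True, random_state=42)
        for train_idx, val_idx in kfold.split(x):
            # Build a fresh model for each fold from this trial's hyperparameters.
            model = self.hypermodel.build(trial.hyperparameters)
            model.fit(x[train_idx], y[train_idx], epochs=epochs, verbose=0)
            # Assumes the model is compiled with a single loss, so
            # evaluate() returns one scalar.
            val_losses.append(model.evaluate(x[val_idx], y[val_idx], verbose=0))
        # Returning a float reports the mean fold loss as the trial's
        # objective, which keras-tuner minimizes by default.
        return float(np.mean(val_losses))

tuner = CVTuner(
    hypermodel=build_model,  # your function: hp -> compiled Keras model
    max_trials=20,
    directory="kt_cv",
    project_name="cv_search",
)
tuner.search(x_train, y_train, epochs=10, n_splits=5)
```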
-
Hi @haifeng-jin, thank you. Suppose I don't want to do a 10-fold cross-validation for every hyperparameter configuration but only for a single model: is it a valid approach to take the best hyperparameters and then do a 10-fold cross-validation to measure out-of-sample performance across the folds? Thank you
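For the decoupled workflow described here (search first, then k-fold only on the winning configuration), a minimal sketch, assuming the fitted `tuner` from above and a model compiled with a single loss; the epoch count and seed are illustrative:

```python
import numpy as np
from sklearn.model_selection import KFold

# Take the best hyperparameters found by the earlier random search.
best_hp = tuner.get_best_hyperparameters(num_trials=1)[0]

fold_scores = []
kfold = KFold(n_splits=10, shuffle=True, random_state=0)
for train_idx, val_idx in kfold.split(x_train):
    # Rebuild from scratch each fold so no weights leak between folds.
    model = tuner.hypermodel.build(best_hp)
    model.fit(x_train[train_idx], y_train[train_idx], epochs=10, verbose=0)
    fold_scores.append(
        model.evaluate(x_train[val_idx], y_train[val_idx], verbose=0)
    )

print(f"10-fold loss: {np.mean(fold_scores):.4f} +/- {np.std(fold_scores):.4f}")
```

Rebuilding via `tuner.hypermodel.build(best_hp)` gives fresh weights each fold, so the ten scores estimate the out-of-sample performance of the chosen configuration rather than of any single trained model.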