How do you do hyperparameter searches in ML?

post by lsusr · 2020-01-13T03:45:46.837Z · LW · GW · 1 comment

This is a question post.


I know how to do hyperparameter searches. ☺

This is a survey. What I want to know is how you do hyperparameter searches. It doesn't matter whether your system is good or bad; I won't judge you. I just want to know what systems other people are using in the real world right now.

Any information you're willing to share would help me out here, but there are two questions I'm especially interested in.

  1. What algorithm do you use? (Do you use random search, grid search or Bayes search? Do you do some iterative process? Do you do something else entirely?)
  2. Do you cache anything? If so, what's your process?
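For anyone unfamiliar with the terms in question 1, here is a minimal sketch of grid search versus random search over a toy search space. The objective function and the parameter names (`lr`, `depth`) are purely illustrative stand-ins for a real train-and-validate step.

```python
import itertools
import random

# Hypothetical objective standing in for validation loss; a real one
# would train a model with these hyperparameters and score it.
def objective(params):
    return (params["lr"] - 0.01) ** 2 + (params["depth"] - 5) ** 2

space = {"lr": [0.001, 0.01, 0.1], "depth": [3, 5, 7]}

# Grid search: evaluate every combination in the space.
grid = [dict(zip(space, vals)) for vals in itertools.product(*space.values())]
best_grid = min(grid, key=objective)

# Random search: sample a fixed budget of combinations instead.
rng = random.Random(0)
samples = [{k: rng.choice(v) for k, v in space.items()} for _ in range(5)]
best_random = min(samples, key=objective)
```

Grid search is exhaustive but its cost grows multiplicatively with each new hyperparameter; random search spends a fixed budget and tends to cover high-dimensional spaces better.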

I'm also curious what industry you're in, but if you're not comfortable sharing that, some information is better than none.
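On the caching question above, one common pattern is memoizing the expensive evaluation, keyed by the hyperparameter configuration, so repeated or resumed searches never retrain the same configuration twice. A minimal sketch, with an illustrative stand-in for the real training step:

```python
import functools

calls = 0

# Hypothetical expensive evaluation; a real one would train a model
# and return its validation score.
@functools.lru_cache(maxsize=None)
def cached_evaluate(lr, depth):
    global calls
    calls += 1
    return (lr - 0.01) ** 2 + (depth - 5) ** 2

cached_evaluate(0.01, 5)
cached_evaluate(0.01, 5)  # served from the cache; the body runs only once
```

For searches that span multiple processes or machines, the same idea is usually implemented with an on-disk store keyed by a hash of the configuration rather than an in-memory cache.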

Answers

answer by Jader Martins · 2020-01-14T01:55:19.404Z · LW(p) · GW(p)

Usually RandomizedSearchCV. I've tried the Bayesian optimization from skopt, but empirically I did not see an advantage over random search. Not sure if I used it wrong; has anyone gotten good results with it?
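For concreteness, a minimal sketch of the scikit-learn workflow this answer refers to, assuming a toy dataset; the estimator and parameter range are illustrative, not recommendations:

```python
from scipy.stats import loguniform
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import RandomizedSearchCV

X, y = load_iris(return_X_y=True)

search = RandomizedSearchCV(
    LogisticRegression(max_iter=1000),
    param_distributions={"C": loguniform(1e-3, 1e2)},
    n_iter=10,       # budget: number of sampled configurations
    cv=3,            # 3-fold cross-validation per configuration
    random_state=0,
)
search.fit(X, y)
best_C = search.best_params_["C"]  # best sampled regularization strength
```

skopt's `BayesSearchCV` is a drop-in replacement for `RandomizedSearchCV` in this snippet, which is likely what the comparison above was testing.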

comment by lsusr · 2020-01-13T23:31:54.127Z · LW(p) · GW(p)

I think it depends on your problem. If you have lots of compute power, high dimensionality, and strong higher-order interactions between hyperparameters, then Bayesian optimization makes sense. And vice versa.

1 comment


comment by Donald Hobson (donald-hobson) · 2020-11-28T14:59:58.452Z · LW(p) · GW(p)

I manually tweak the hyper-parameters until it seems to work. (That said, the ML systems are being trained on toy problems, and I don't care about squeezing out every drop of performance.)