Oh, awesome stuff. I never really managed to create my own executable, so I ran it through the Qt Creator interface.
Thank you very much.
CLOP for Noisy Black-Box Parameter Optimization
-
- Posts: 62
- Joined: Mon Oct 03, 2011 9:40 pm
-
- Posts: 750
- Joined: Mon Mar 27, 2006 7:45 pm
- Location: Finland
Re: CLOP for Noisy Black-Box Parameter Optimization
Rémi Coulom wrote:I uploaded a new version. I included a Windows version this time.
Big thanks. A couple of problems with compiling on Ubuntu 11.10, though:
- GCC says: "rclib/src/util/userflag.h:27:9: error: ‘size_t’ does not name a type". I fixed this by inserting "#include <cstring>" at the beginning of the file (first snippet below).
- In programs/_general/options.mk, only the Python 2.5 and 2.6 include paths are used; I had to add 2.7 to make it compile (second snippet below).
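A minimal sketch of the first fix (the surrounding file contents are assumed, not copied from the source):

Code: Select all

// rclib/src/util/userflag.h -- near the top, before size_t is first used
#include <cstring>  // declares size_t; <cstddef> would work just as well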
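And the second fix; the variable name below is a guess, only the added 2.7 path matters:

Code: Select all

# programs/_general/options.mk
PYTHON_INCLUDE = -I/usr/include/python2.5 \
                 -I/usr/include/python2.6 \
                 -I/usr/include/python2.7   # added for Ubuntu 11.10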
-
- Posts: 62
- Joined: Mon Oct 03, 2011 9:40 pm
Re: CLOP for Noisy Black-Box Parameter Optimization
Had an interesting "anomaly" happen. Not sure if this is a CLOP thing or general tuning (mis)behaviour.
I used one parameter to tune a number that was then used by a bunch of other parameters, i.e. something like:
p1 * a
p2 * p1 * b
p3 * p1 * c
p4 * p1 * d
etc.
CLOP rushed p1 to very near 0 (within a few hundred games it settled firmly on a minuscule value for p1 and didn't change for the next 20,000 games), and then tried to give huge numbers to p2, p3, etc.
Removing p1 (and setting it to some arbitrary positive constant) gave much more sensible numbers for the rest of the parameters.
I guess the end result was about the same. Maybe my use of p1 was so diverse that keeping it close to insignificant was the best way to go...
Just thinking aloud here, but this tuning business is really fascinating to me.

-
- Posts: 438
- Joined: Mon Apr 24, 2006 8:06 pm
Re: CLOP for Noisy Black-Box Parameter Optimization
Zlaire wrote:Had an interesting "anomaly" happen. Not sure if this is a CLOP thing or general tuning (mis)behaviour.
I used one parameter to tune a number that was then used by a bunch of other parameters, i.e. something like:
p1 * a
p2 * p1 * b
p3 * p1 * c
p4 * p1 * d
etc.
Note that if you are going to optimize positive parameters that multiply or divide each other, it might be a good idea to declare them as "GammaParameter", i.e. perform regression on their logarithm; the performance is then likely to be closer to quadratic. Here, it might be a good idea to declare at least p1 as a GammaParameter. Make sure the minimum is > 0 (0.00001, for instance).
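For example, a .clop fragment along these lines (the exact parameter syntax is guessed from the example later in this thread; check the distributed examples for the precise keyword form):

Code: Select all

GammaParameter p1 0.00001 10.0
GammaParameter p2 0.00001 10.0
GammaParameter p3 0.00001 10.0
GammaParameter p4 0.00001 10.0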
It might be an even better idea to not multiply p2, p3, and p4 by p1.
Having two parameters that multiply each other is likely to make the function Rosenbrock-ish, which is difficult to optimize with quadratic regression.
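A rough sketch of why the log transform helps, assuming strength depends mainly on the product of two parameters (the notation is only illustrative):

Code: Select all

% Level sets of f(p1, p2) = g(p1 * p2) are the hyperbolas p1 * p2 = c,
% i.e. a curved valley much like Rosenbrock's:
\[ f(p_1, p_2) = g(p_1 p_2) \quad\Rightarrow\quad p_1 p_2 = c \]
% After a log transform the same sets become straight lines, which a
% quadratic model fits easily:
\[ u = \log p_1,\ v = \log p_2 \quad\Rightarrow\quad u + v = \log c \]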
Rémi
-
- Posts: 2684
- Joined: Sat Jun 14, 2008 9:17 pm
Re: CLOP for Noisy Black-Box Parameter Optimization
Rémi Coulom wrote: Also: when using "Correlations none", you can try to de-correlate your variables. For instance, the values of Knight (N) and Bishop (B) are strongly correlated. Instead of optimizing N and B, you can optimize N+B and N-B: they are almost independent.
I thought about this statement for some time, and I would like to ask: if de-correlation is a good thing, why don't you de-correlate the variables yourself, inside the algorithm?
I mean, the user asks you to optimize p1 and p2. Instead of "playing" with p1 and p2 directly, you internally tune two other variables, c1 and c2, that are bound to p1 and p2 by a de-correlation function:
(p1, p2) = D(c1, c2)
Of course this is hidden from the user, who just sees the final values (p1, p2), but inside CLOP all the tuning is performed on (c1, c2).
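For the knight/bishop example quoted above, such a D could simply be (a sketch; the function and variable names are mine):

Code: Select all

// De-correlation map D for the knight/bishop example: CLOP tunes
// c1 = N + B and c2 = N - B; the engine recovers N and B from them.
#include <utility>

std::pair<double, double> D(double c1, double c2) {
    double N = (c1 + c2) / 2.0;
    double B = (c1 - c2) / 2.0;
    return {N, B};
}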
What do you think?
-
- Posts: 438
- Joined: Mon Apr 24, 2006 8:06 pm
Re: CLOP for Noisy Black-Box Parameter Optimization
mcostalba wrote:Rémi Coulom wrote: Also: when using "Correlations none", you can try to de-correlate your variables. For instance, the values of Knight (N) and Bishop (B) are strongly correlated. Instead of optimizing N and B, you can optimize N+B and N-B: they are almost independent.
I thought about this statement for some time, and I would like to ask: if de-correlation is a good thing, why don't you de-correlate the variables yourself, inside the algorithm?
I mean, the user asks you to optimize p1 and p2. Instead of "playing" with p1 and p2 directly, you internally tune two other variables, c1 and c2, that are bound to p1 and p2 by a de-correlation function:
(p1, p2) = D(c1, c2)
Of course this is hidden from the user, who just sees the final values (p1, p2), but inside CLOP all the tuning is performed on (c1, c2).
What do you think?
This is what CLOP does by default (with "Correlations all"). The problem is that the number of pairs of variables grows like the square of the number of variables, so it is costly when there are many variables.
Rémi
-
- Posts: 2684
- Joined: Sat Jun 14, 2008 9:17 pm
Re: CLOP for Noisy Black-Box Parameter Optimization
Thanks for your quick answer to the previous question. Here is the next one.
Rémi Coulom wrote: More generally, you can try to be creative to reduce the dimensionality of the optimization problem.

Why don't you reduce the dimensionality by yourself?
I mean, the user asks to tune p1, ..., p8 and also sets a new parameter, "dimensionality": given dimensionality = 2, you tune two derived values, c1 and c2, that are derived from p1, ..., p8, for instance (though here you are much more creative than me) by a linear combination of p1, ..., p8.
If you remember the ampli+bias idea that Joona and I reported, this is a kind of generalization of that idea.
What do you think?
P.S.: After many months of tuning SF, I have made up my mind that the secret of good tuning is the choice of the starting variables to tune. So mapping p1, ..., pn to c1, ..., ck and tuning the ck could, if done properly, yield a much faster and better tune.
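A sketch of how such a mapping might look on the engine side (everything here is hypothetical; the matrix M would be chosen by the user):

Code: Select all

// CLOP tunes K combined variables c1..cK; a fixed N-by-K matrix M maps
// them back to the N real engine parameters: p = p0 + M * c.
#include <array>

constexpr int N = 8;  // real engine parameters p1..p8
constexpr int K = 2;  // reduced tuning dimensionality

std::array<double, N> expand(const std::array<double, N>& p0,
                             const std::array<std::array<double, K>, N>& M,
                             const std::array<double, K>& c) {
    std::array<double, N> p = p0;
    for (int i = 0; i < N; ++i)
        for (int j = 0; j < K; ++j)
            p[i] += M[i][j] * c[j];  // add the tuned delta along column j
    return p;
}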
-
- Posts: 62
- Joined: Mon Oct 03, 2011 9:40 pm
Re: CLOP for Noisy Black-Box Parameter Optimization
Another question: how does a parameter that can't be tuned (say it doesn't affect the outcome of the game at all) interfere with other parameters in the same suite?
Would removing that ineffective parameter improve the result or is it disregarded anyway?
-
- Posts: 722
- Joined: Mon Apr 19, 2010 7:07 pm
- Location: Sweden
- Full name: Peter Osterlund
Re: CLOP for Noisy Black-Box Parameter Optimization
mcostalba wrote:Thanks for your quick answer to the previous question. Here is the next one.
Rémi Coulom wrote: More generally, you can try to be creative to reduce the dimensionality of the optimization problem.
Why don't you reduce the dimensionality by yourself?
I mean, the user asks to tune p1, ..., p8 and also sets a new parameter, "dimensionality": given dimensionality = 2, you tune two derived values, c1 and c2, that are derived from p1, ..., p8, for instance by a linear combination of p1, ..., p8.
It seems to me that you would get a similar effect if you let your CLOP parameters represent deltas from your current parameter values, and then project the solution vector onto the subspace corresponding to the k smallest eigenvalues of the Hessian.
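In code, that projection step might look something like this (a sketch only; it assumes the k eigenvectors have already been computed and are orthonormal):

Code: Select all

// Keep only the components of the tuned delta that lie in the subspace
// spanned by the k (orthonormal) Hessian eigenvectors with the smallest
// eigenvalues.
#include <cstddef>
#include <vector>

std::vector<double> project(const std::vector<double>& delta,
                            const std::vector<std::vector<double>>& evecs) {
    std::vector<double> out(delta.size(), 0.0);
    for (const auto& v : evecs) {
        double dot = 0.0;
        for (std::size_t i = 0; i < delta.size(); ++i)
            dot += delta[i] * v[i];          // component of delta along v
        for (std::size_t i = 0; i < delta.size(); ++i)
            out[i] += dot * v[i];            // accumulate that component
    }
    return out;
}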
-
- Posts: 2684
- Joined: Sat Jun 14, 2008 9:17 pm
Re: CLOP for Noisy Black-Box Parameter Optimization
petero2 wrote: It seems to me that you would get a similar effect if you let your CLOP parameters represent deltas from your current parameter values, and then project the solution vector onto the subspace corresponding to the k smallest eigenvalues of the Hessian.
My CLOP parameters already represent deltas because, to get a uniform tuning interface, I have defined in the clop file:
Code: Select all
parameter p0 -100 100
parameter p1 -100 100
parameter p2 -100 100
......
Regarding the second part of your sentence, I have not understood what, in practice, I should do.