xr_a_y wrote: ↑Sun Dec 09, 2018 9:26 am
xr_a_y wrote: ↑Sun Dec 09, 2018 9:22 am
hgm wrote: ↑Sun Dec 09, 2018 7:48 am
There also is little need to do it, because Larry Kaufman already did it for us, and shared the results.
OK, I get that, but why does Rofchade use piece values like those, then?
Code:
const eval pieceValue[2][6] =
{ {  82, 337, 365, 477, 1025, 12000 },
  {  94, 281, 297, 512,  936, 12000 } };
Why not use 100 as the MG pawn value?
What about this post?
http://talkchess.com/forum3/viewtopic.p ... es#p778420
OK, maybe it is because the PSTs used are not centered on zero ...
The values Larry Kaufman gives are just a guideline that most engines were using in the past. All values are relative to each other, so it really doesn't matter what value you assign to a pawn to begin with.
In the past I used hand-tuned values; I started with values like { 100, 325, 325, 500, 900 }, which are really the old-school values. After tuning them with Texel-style tuning, they now look like this:
Code:
{  85, 353, 352, 477,  913 },
{ 115, 285, 316, 556, 1185 };
Of course these values also depend on the values of the PSTs and the values you assign to the positional evaluation; my PSTs are scaled in such a way that the sum over all squares for one piece == 0. Personally I think the whole concept of PSTs is flawed; it is just a remnant of the past, when we used lazy evaluation with material and just a few positional terms to gain a few percent in speed.
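To illustrate that scaling (just a sketch, not my actual code; the centerPst name is made up here): shifting every entry of a piece-square table by its average makes the squares sum to (roughly) zero, so the table only redistributes value over the board instead of hiding extra material in it.

Code:
#include <array>
#include <numeric>

// Shift a 64-entry piece-square table so its entries sum to zero
// (up to integer truncation); the piece's material value then lives
// entirely in pieceValue[], not partly inside the PST.
void centerPst(std::array<int, 64>& pst) {
    int sum  = std::accumulate(pst.begin(), pst.end(), 0);
    int mean = sum / 64;            // integer average; a small remainder may be left
    for (int& sq : pst)
        sq -= mean;
}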
BTW, my K-factor is currently 1.68, but of course that also doesn't matter much.
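For reference, here is a minimal sketch of what Texel-style tuning typically minimizes: the mean squared difference between game results and a sigmoid of the static evaluation, scaled by the constant K. The Sample struct and texelError function are made up for illustration and need not match what Rofchade actually does.

Code:
#include <cmath>
#include <vector>

// One training position: static evaluation in centipawns (white's point of
// view) and the game result (1.0 = white win, 0.5 = draw, 0.0 = white loss).
struct Sample { double evalCp; double result; };

// Mean squared error between the game results and the predicted score,
// where the prediction maps the evaluation through a sigmoid scaled by K.
double texelError(const std::vector<Sample>& samples, double K) {
    double sum = 0.0;
    for (const Sample& s : samples) {
        double predicted = 1.0 / (1.0 + std::pow(10.0, -K * s.evalCp / 400.0));
        double diff = s.result - predicted;
        sum += diff * diff;
    }
    return sum / samples.size();
}

Roughly speaking, changing K mostly rescales the parameter values that minimize this error, which is presumably why it "doesn't matter much".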
My material evaluation is still in centipawns, but for the positional evaluation I use millipawns, because the centipawn resolution seemed a bit too low.
Over here gradient descent works fine and I don't see any problem with it, so if it goes weird in your case there must be a bug of some kind.
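For completeness, a bare-bones sketch of gradient descent with numerical (finite-difference) gradients over a parameter vector; 'error' would be something like the texelError above, closed over your training positions. The tune name, step size and learning rate are arbitrary choices for illustration, not values from my tuner.

Code:
#include <functional>
#include <vector>

// Finite-difference gradient descent over evaluation parameters
// (piece values, PST entries, ...). 'error' returns the tuning error
// for a given parameter vector.
void tune(std::vector<double>& params,
          const std::function<double(const std::vector<double>&)>& error,
          double step = 1.0, double lr = 100.0, int iterations = 100) {
    for (int it = 0; it < iterations; ++it) {
        double base = error(params);
        std::vector<double> grad(params.size());
        for (size_t i = 0; i < params.size(); ++i) {
            std::vector<double> bumped = params;
            bumped[i] += step;                          // forward difference
            grad[i] = (error(bumped) - base) / step;
        }
        for (size_t i = 0; i < params.size(); ++i)
            params[i] -= lr * grad[i];                  // step against the gradient
    }
}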