syzygy wrote:
> For best TB compression, just throw away as much information as you are willing to recompute during probing, then compress what is left with a suitable compressor. There is no way that an NN could "learn" that data more efficiently. An NN does not involve magic.

AlvaroBegue wrote:
> There are ways in which a NN can help in compressing EGTBs, without involving magic. For instance, try to train the NN to predict WDL. Then store in a table the list of positions that are misclassified by the NN. If the NN does a decent job, this could result in a much compressed description of the EGTB.

Yes, but I don't really believe in it, not even for WDL. I think by far the best predictor is the material balance combined with a low-depth search to weed out immediate tactics (*). The material-balance factor will be taken care of by the compression algorithm (if White wins almost all positions, then all those very long strings of White wins will compress to almost nothing). Immediate tactics can be recomputed cheaply during probing (compared to running an NN).
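A quick sketch of the point about long strings of wins. This is a toy illustration, not the real Syzygy format or data: a WDL array where White wins in almost every position, fed to a generic compressor.

```python
# Toy WDL table (NOT real tablebase data): 0 = loss, 1 = draw, 2 = win.
# White wins in ~99.9% of positions, with an occasional draw sprinkled in.
import zlib

N = 1_000_000
wdl = bytearray([2]) * N          # one byte per position, almost all wins
for i in range(0, N, 1000):       # a draw every 1000 positions
    wdl[i] = 1

compressed = zlib.compress(bytes(wdl), 9)
print(len(compressed), "bytes for", N, "positions")
```

The long runs of identical values collapse to a tiny fraction of the original size, which is why the material-balance "predictor" comes for free from the compressor.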
But even if an NN helped (e.g. by finding some structure in the data after the immediate tactics are removed), you would have to run it on every TB position, which is simply too expensive, and that ignores the time needed to train the NN in the first place. (The low-depth searches are much cheaper, and they don't even have to be run before compression if the TB generator marks the positions affected by low-depth tactics during generation.)
Feel free to prove me wrong on 5-piece tables.

(*) There are some other useful predictors, such as pawn advancement and bishop color. They can be dealt with by picking a good order of the pieces when calculating the index, or by dividing the table into separately compressible parts.
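The effect of a good index order can be mimicked in a toy way. Below, a hypothetical binary predictor (think "bishop color") almost fully determines the outcome; compressing the outcomes in an order that groups positions by the predictor beats compressing them interleaved. The predictor and data here are made up for illustration only.

```python
# Toy sketch: ordering positions so a strong predictor (here a fake
# "bishop color" bit) varies slowly mimics choosing a good piece order
# for the index, and helps a generic compressor.
import random
import zlib

random.seed(0)
N = 100_000
bits = [random.randrange(2) for _ in range(N)]          # fake predictor bit
outcomes = bytes(2 if b else 0 for b in bits)           # interleaved order
grouped = bytes(sorted(outcomes))                       # predictor-major order

print(len(zlib.compress(outcomes, 9)), "vs", len(zlib.compress(grouped, 9)))
```

In the grouped order the table is two long runs, so it compresses far better, which is the same gain a well-chosen index order (or splitting the table into separately compressible parts) buys for free.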