Isn't the NNUE board representation incomplete? It just states where pieces are, completely ignoring attack information. I know that it works; I have built my own on top of a new engine in Java, and it amazes me. Yes, I know the dilemma: sparse inputs mean speedy incremental updates, and not having that kills nps. But still I wonder: if a half board representation is already that good, what is the potential of a richer, more complete one? What about 2x768, the first 768 representing "square contains piece", the second 768 "square attacked by piece"? Forget incremental updates (because the input changes are not sparse anymore) and build the inputs from scratch, with AVX. It would be slow, but it would not have crippled inputs. I'm probably going to try this at some point; seems like a nice experiment. Any thoughts?
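For concreteness, here is a minimal sketch of such a 2x768 encoding, built from scratch each evaluation. All names (`build_inputs`, `pieceBB`, `attackBB`) are illustrative, not from any engine; the attack bitboards are assumed to come from the engine's own attack generator:

```cpp
#include <array>
#include <cstdint>

using Bitboard = uint64_t;

// 12 piece kinds (6 types x 2 colors), 64 squares, two feature planes:
// plane 0 = "piece kind stands on square", plane 1 = "piece kind attacks square".
constexpr int kPieceKinds = 12;
constexpr int kInputs = 2 * kPieceKinds * 64; // 1536 = 2 x 768

// Fill the input vector from scratch. pieceBB[k] holds the squares occupied
// by piece kind k; attackBB[k] holds the union of squares attacked by kind k.
inline void build_inputs(const std::array<Bitboard, kPieceKinds>& pieceBB,
                         const std::array<Bitboard, kPieceKinds>& attackBB,
                         std::array<float, kInputs>& in) {
    in.fill(0.0f);
    for (int k = 0; k < kPieceKinds; ++k) {
        Bitboard b = pieceBB[k];
        while (b) {                      // one hot input per occupied square
            int sq = __builtin_ctzll(b); // index of lowest set bit
            b &= b - 1;                  // clear lowest set bit
            in[k * 64 + sq] = 1.0f;
        }
        b = attackBB[k];
        while (b) {                      // one hot input per attacked square
            int sq = __builtin_ctzll(b);
            b &= b - 1;
            in[768 + k * 64 + sq] = 1.0f;
        }
    }
}
```

Since a single move can change the attack plane for many pieces at once, the first layer would indeed have to be recomputed in full; that full recomputation is where the AVX batching would go.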
Bas
NNUE incomplete?
Moderator: Ras
-
ZirconiumX
- Posts: 1362
- Joined: Sun Jul 17, 2011 11:14 am
- Full name: Hannah Ravensloft
Re: NNUE incomplete?
The term used for this is "threat inputs", to help your googling.
tu ne cede malis, sed contra audentior ito
-
phhnguyen
- Posts: 1542
- Joined: Wed Apr 21, 2010 4:58 am
- Location: Australia
- Full name: Nguyen Hong Pham
Re: NNUE incomplete?
I’m an NNUE document reader, but have yet to implement any NNUE. Firstly, I am confused about what you mentioned, "a half board representation", especially since you gave the number 768. 768 = 6 piece types x 2 colors x 64 squares, which is a full board, not half. In HalfKP we have two halves, for the white and black kings, combined into a full board too.
On one hand, I agree that the NNUE board representation is incomplete. However, IMHO, the attack information, as you mentioned, as well as other information such as pin, threat… can actually be inferred from the locations of chess pieces. In other words, they are redundant in terms of encoding. Perhaps the NN itself can extract that information (the attack information) when being trained.
On the other hand, in the view of encoding, the input of NNUE is missing the following information:
* the side to move
* the 50-move rule counter
* en-passant
* castling rights
Even though it is missing the above information, and applies an incorrect board rotation (it should be a flip), the first architecture, SF's HalfKP, works well. It is really a big surprise!
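For reference, the 768 figure above corresponds to the usual (color, piece type, square) indexing; a sketch (the helper name is mine):

```cpp
// Map (color, piece type, square) to one of the 768 = 2 x 6 x 64 inputs.
// color: 0 = white, 1 = black; pieceType: 0..5 (pawn..king); square: 0..63.
constexpr int feature_index(int color, int pieceType, int square) {
    return (color * 6 + pieceType) * 64 + square;
}
```

HalfKP additionally conditions each perspective's features on the friendly king square, which is what multiplies the input count far beyond 768.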
https://banksiagui.com
The most feature-rich chess GUI, based on the open-source Banksia chess tournament manager
-
Bas
- Posts: 12
- Joined: Mon Jan 12, 2026 6:44 pm
- Full name: Bas Hamstra
Re: NNUE incomplete?
Perhaps, yes, perhaps it can derive mobility (for example) from piece locations. I would have thought that's impossible, but I do see it learns piece activity quite well. Weird; I would have thought it was a big hurdle to learn that from just piece locations. Still, I'm curious what attack info (OK, threat inputs) would bring to the table. There are people trying to substitute search at least partly with neural nets, apparently not without success. A super-slow, super-smart net seems like a very fun experiment to me, even if it's not going to be Elo king.
Bas
-
Steve Maughan
- Posts: 1321
- Joined: Wed Mar 08, 2006 8:28 pm
- Location: Florida, USA
Re: NNUE incomplete?
Maybe a compromise is to have squares attacked by a pawn as inputs to the net. This would add only 2 x 48 new inputs (6 ranks of 8). They would only change when pawns move, so they could be updated incrementally and efficiently.
Then again, pawn attacks are implicit in the location of a pawn, i.e. a white pawn on e4 always attacks d5 and f5. So maybe this won't add value. Maybe the attack inputs should be for the sliding pieces.
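A sketch of those 2 x 48 pawn-attack inputs, assuming a1 = bit 0 bitboards (helper names are mine): white pawns can only attack ranks 3-8 and black pawns ranks 1-6, hence 48 candidate squares per side.

```cpp
#include <cstdint>

using Bitboard = uint64_t;

constexpr Bitboard FILE_A = 0x0101010101010101ULL;
constexpr Bitboard FILE_H = 0x8080808080808080ULL;

// Union of squares attacked by white / black pawns, via bitboard shifts.
// The file masks stop captures from wrapping around the board edge.
constexpr Bitboard white_pawn_attacks(Bitboard pawns) {
    return ((pawns & ~FILE_A) << 7) | ((pawns & ~FILE_H) << 9);
}
constexpr Bitboard black_pawn_attacks(Bitboard pawns) {
    return ((pawns & ~FILE_H) >> 7) | ((pawns & ~FILE_A) >> 9);
}

// White pawns attack only ranks 3..8: squares 16..63 map to inputs 0..47.
// (Black would use squares 0..47 for its own 48 inputs.)
constexpr int white_attack_input(int square) { return square - 16; }
```

Because only pawn moves, captures of pawns, and promotions change these bitboards, the two attack sets can be diffed and updated incrementally exactly like piece-square features.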
— Steve
http://www.chessprogramming.net - Juggernaut & Maverick Chess Engine
-
Bas
- Posts: 12
- Joined: Mon Jan 12, 2026 6:44 pm
- Full name: Bas Hamstra
Re: NNUE incomplete?
Yes... another compromise is attacks on the 8 squares around the kings. But I am probably going to try a side experiment with just everything: 2 x 768 inputs. Incremental updates are probably not doable, so it will be ultra-slow, but AVX wins a bit back. I am just curious about the potential, and whether it would play differently. How slow would it be, and how much of the speed loss is compensated? It's interesting...
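For the from-scratch rebuild, the dominant cost is summing one first-layer weight column per active input into the accumulator. A scalar sketch of that refresh (names and the hidden width are illustrative); a real engine would vectorize the inner loop with AVX, adding 16 int16 lanes per instruction:

```cpp
#include <cstdint>
#include <vector>

constexpr int kHidden = 256; // hidden layer width, illustrative

// Rebuild the accumulator from scratch: bias plus one weight column per
// active feature. This inner loop is what AVX would batch.
inline void refresh_accumulator(const std::vector<int16_t>& weights, // [feature][kHidden]
                                const std::vector<int16_t>& bias,    // [kHidden]
                                const std::vector<int>& activeFeatures,
                                std::vector<int16_t>& acc) {
    acc = bias;
    for (int f : activeFeatures)
        for (int j = 0; j < kHidden; ++j)
            acc[j] += weights[f * kHidden + j];
}
```

With ~30 occupancy features plus potentially a few hundred attack features active, that is a few hundred 256-wide int16 column adds per evaluation, a fixed cost that incremental update normally avoids almost entirely.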
Bas
-
evangambit
- Posts: 1
- Joined: Sun Feb 22, 2026 4:05 am
- Full name: Morgan Redding
Re: NNUE incomplete?
IMO, anything that rarely changes, but is semantically meaningful, is an interesting candidate for incremental updates. Threats are a good example, and (as mentioned above) have already been used in successful engines.
My current implementation of a NNUE actually computes diffs "from scratch" -- this is technically slower than incrementally updating "the right thing(s)" after each move, but compared to the actual vector operations, computing these diffs is *nearly* free.
The upside is that writing the code this way makes it much easier and less error-prone to try out new features (e.g. writing threat-diffing during make_move sounds awful), though I have a long TODO list of unrelated changes, so I haven't gotten around to experimenting.
The code is basically
Code: Select all
Bitboard nnue_feature_to_bitboard(NnueFeatureBitmapType feature, const Position& pos) {
  switch (feature) {
    case NF_WHITE_PAWN:
      return pos.pieceBitboards_[WHITE_PAWN];
    case NF_WHITE_KNIGHT:
      return pos.pieceBitboards_[WHITE_KNIGHT];
    case NF_WHITE_BISHOP:
      return pos.pieceBitboards_[WHITE_BISHOP];
    ...
  }
}

Evaluation evaluate(const Position& pos) {
  for (NnueFeatureBitmapType i = static_cast<NnueFeatureBitmapType>(0); i < NF_COUNT;
       i = static_cast<NnueFeatureBitmapType>(i + 1)) {
    const Bitboard oldBitboard = lastFrame->pieceBitboards[i];
    const Bitboard newBitboard = nnue_feature_to_bitboard(i, pos);
    // Features that switched off since the last frame.
    Bitboard diff = oldBitboard & ~newBitboard;
    while (diff) {
      const SafeSquare sq = pop_lsb(diff);
      nnue_model->decrement(
          &currentFrame->whiteAcc, &currentFrame->blackAcc, feature_index(i, sq));
    }
    // Features that switched on since the last frame.
    diff = newBitboard & ~oldBitboard;
    while (diff) {
      const SafeSquare sq = pop_lsb(diff);
      nnue_model->increment(
          &currentFrame->whiteAcc, &currentFrame->blackAcc, feature_index(i, sq));
    }
    currentFrame->pieceBitboards[i] = newBitboard;
  }
  // the rest of the forward pass
}
Top of my list of things that seem like a (nearly) "free win" is different encodings for different flavors of the same piece type.
As a simple example, one could use {isolated pawns, connected pawns, other pawns} and learn embeddings for each (king location x pawn square x pawn type). This saves the NNUE from having to waste capacity figuring this out itself, and shouldn't have much overhead (the only additional NNUE increments are when a pawn's type changes, but the pawn itself didn't move). Other examples are "outposted knight vs normal knight", "rook on an open file vs rook on half-open file vs rook on closed file", etc.
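A sketch of that pawn-flavor split, assuming a1 = bit 0 bitboards. The classification rules and helper names here are my own simplification: isolated = no friendly pawn on an adjacent file; connected = a friendly pawn beside it (phalanx) or defending it; everything else is "other".

```cpp
#include <cstdint>

using Bitboard = uint64_t;

constexpr Bitboard FILE_A = 0x0101010101010101ULL;
constexpr Bitboard FILE_H = 0x8080808080808080ULL;

enum PawnFlavor { ISOLATED, CONNECTED, OTHER_PAWN };

// Classify one (white-perspective) pawn given all friendly pawns.
inline PawnFlavor classify_pawn(int square, Bitboard friendlyPawns) {
    Bitboard file = FILE_A << (square % 8);
    Bitboard adjacentFiles = ((file & ~FILE_A) >> 1) | ((file & ~FILE_H) << 1);
    if (!(friendlyPawns & adjacentFiles))
        return ISOLATED;             // no friendly pawn on a neighboring file
    Bitboard sq = 1ULL << square;
    Bitboard beside    = ((sq & ~FILE_A) >> 1) | ((sq & ~FILE_H) << 1); // phalanx
    Bitboard defenders = ((sq & ~FILE_A) >> 9) | ((sq & ~FILE_H) >> 7); // mirror for black
    if (friendlyPawns & (beside | defenders))
        return CONNECTED;
    return OTHER_PAWN;
}
```

The resulting flavor replaces the plain "pawn" piece kind in the feature index, and the accumulator only needs extra increments when a pawn's flavor changes without the pawn itself moving, exactly as described above.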
-
Wisely
- Posts: 10
- Joined: Sat Jul 06, 2024 7:39 am
- Full name: Ron Luke
Re: NNUE incomplete?
Sources say Komodo Dragon is AlphaZero in disguise. Simply Check "Use MCTS" then the Classical is now magically made stronger. No need to combine MCTS + NNUE. Ground-breaking technology. Google will have to work on MCTS for the True NNUE to be released to other engine authors, like me.
Kudos for testing this -5 Atk Chessmaster personality (called Trixie & Poplex). Roya!bee 10x8 Princess & Prince was a success.