Code: Select all
# (low, high, label): evaluation range in centipawns (from the player's point of view) and its class name
ranges = [
(-2000, -401, '1_clearly_losing'),
(-400, -151, '2_slightly_losing'),
(-150, -101, '3_clearly_worse'),
(-100, -51, '4_slightly_worse'),
(-50, 50, '5_equal'),
(51, 100, '6_slightly_better'),
(101, 150, '7_clearly_better'),
(151, 400, '8_slightly_winning'),
(401, 2000, '9_clearly_winning')
]
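As a minimal sketch of how a single evaluation is mapped to one of these classes, reusing the ranges list above (the scores are assumed to be centipawn values, and the helper name classify is just for illustration):

Code: Select all
def classify(score_cp):
    """Return the class label for a centipawn score, or None if it falls outside all ranges."""
    for low, high, label in ranges:
        if low <= score_cp <= high:
            return label
    return None

# classify(-30)  -> '5_equal'
# classify(120)  -> '7_clearly_better'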
Here is an example classification.
In round 2 of the Qatar Masters, Alisher, playing White, defeated Magnus. After analyzing and classifying each position, a confusion matrix is generated for each player.
A. Alisher

We read this matrix as follows.
* The y-axis contains the Sf16 class.
* The x-axis contains the player's class.
* At the top there is a row 5_equal, and the next cell contains 16: Sf16 considers 16 positions equal, and all of them were also found equal by Alisher.
* The next row is 6_slightly_better: 1 position, also found by Alisher.
* Next is 8_slightly_winning: 5 positions, all found by Alisher.
* The last row is 9_clearly_winning: 9 positions, all found by Alisher.
B. Magnus

* Let's start from the bottom, at 5_equal. There are 16 equal positions; Magnus held 14 of them, but 2 slipped to slightly worse, meaning there are 2 equal positions that Magnus failed to solve.
* There is 1 slightly worse position. Magnus failed to find the best move and ended up in a slightly losing position.
* There are 5 slightly losing positions. Magnus handled 3 of them correctly but misplayed the other 2, which became clearly losing (a code sketch of this tallying follows after this list).
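To make the bookkeeping concrete, here is one way such counts could be tallied in Python, assuming each analyzed position yields a pair (Sf16 class, player class). The function name confusion_matrix and the pairs variable are hypothetical illustrations, not the actual game data:

Code: Select all
from collections import Counter

def confusion_matrix(pairs):
    """Count (sf16_class, player_class) pairs.
    The first key is the Sf16 class (row), the second the player's class (column)."""
    matrix = Counter()
    for sf16_class, player_class in pairs:
        matrix[(sf16_class, player_class)] += 1
    return matrix

# Hypothetical sample: two equal positions held, one drifted to slightly worse.
pairs = [('5_equal', '5_equal'),
         ('5_equal', '5_equal'),
         ('5_equal', '4_slightly_worse')]
print(confusion_matrix(pairs))
# Counter({('5_equal', '5_equal'): 2, ('5_equal', '4_slightly_worse'): 1})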
C. Accuracy calculation
To calculate the overall accuracy of Magnus, add all the numbers on the diagonal, then divide by the total number of positions. That would be:
Code: Select all
accuracy = (8+3+0+14) / 30 = 25/30 = 0.8333 or 83.33%
For Alisher, every position stayed in the Sf16 class, so the diagonal sum equals the total:
Code: Select all
accuracy = (16+1+5+9) / 31 = 31/31 = 1 or 100%
Per-class accuracy is computed the same way, diagonal cell divided by the row total. For the equal class:
Code: Select all
Magnus = (14) / (2+14) = 14/16 = 0.875 or 87.5%
Alisher = 16/16 = 1 or 100%
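Both calculations follow directly from the matrix. The sketch below reuses the Counter layout from the earlier snippet; the function names are illustrative, and the magnus literal is reconstructed from the walkthrough above rather than taken from the original analysis files:

Code: Select all
from collections import Counter

def overall_accuracy(matrix):
    """Diagonal sum divided by the total number of positions."""
    diagonal = sum(n for (sf16, player), n in matrix.items() if sf16 == player)
    total = sum(matrix.values())
    return diagonal / total

def class_accuracy(matrix, cls):
    """Correctly held positions of one Sf16 class divided by that class's row total."""
    row_total = sum(n for (sf16, _), n in matrix.items() if sf16 == cls)
    return matrix[(cls, cls)] / row_total

magnus = Counter({
    ('5_equal', '5_equal'): 14, ('5_equal', '4_slightly_worse'): 2,
    ('4_slightly_worse', '2_slightly_losing'): 1,
    ('2_slightly_losing', '2_slightly_losing'): 3,
    ('2_slightly_losing', '1_clearly_losing'): 2,
    ('1_clearly_losing', '1_clearly_losing'): 8,
})

print(overall_accuracy(magnus))            # 0.8333...
print(class_accuracy(magnus, '5_equal'))   # 0.875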