A common piece of advice for improving an engine is to remove "unusual" code. I am now scanning through my search code and have found a couple of points I would like to clarify. Although I currently don't think they are really "unusual", I would like to get some opinions about them.
1. Many people seem to use either PVS or plain AlphaBeta. I use FAB (fail-soft AlphaBeta) because I have read that it is sometimes superior to plain AlphaBeta (it returns tighter bounds when an iteration fails high or low), it always searches the same nodes as AlphaBeta when given the same window, and - unfortunately - I have been too lazy so far to rewrite my search using PVS.
Is anyone else using FAB, or does anyone have severe objections against it?
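For concreteness, here is a minimal fail-soft negamax sketch (the `Node` tree type is purely illustrative, not engine code). The key difference from fail-hard is that `best` is never clamped to the window, so the returned value can lie outside [alpha, beta] and thus carries a tighter bound on a fail high or fail low:

Code:
```cpp
#include <vector>

// Hypothetical toy tree for illustration: leaves carry a static value
// from the side-to-move-at-that-leaf's perspective.
struct Node {
    int value = 0;                 // used only at leaves
    std::vector<Node> children;
};

// Fail-soft alpha-beta (negamax form): 'best' starts below alpha and is
// never clamped to the window, so the return value may exceed beta on a
// fail high (or stay below alpha on a fail low), giving a tighter bound.
int falphabeta(const Node& n, int alpha, int beta) {
    if (n.children.empty()) return n.value;
    int best = -1000000;           // NOT initialized to alpha (fail-hard would)
    for (const Node& c : n.children) {
        int v = -falphabeta(c, -beta, -alpha);
        if (v > best) best = v;
        if (best > alpha) alpha = best;
        if (alpha >= beta) break;  // cutoff; 'best' may already exceed beta
    }
    return best;
}
```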
2. I use a small optimization (at least I think it is one) when defining the alpha-beta window and also when deciding on the conditions for a re-search after a fail high/fail low. I determine the minimum and maximum possible values of a position p at iteration number i as follows:
Code:
int minPossibleValue = hasMatingMaterial(p.opponent) ? -(MAX_VALUE-i) : 0;
int maxPossibleValue = hasMatingMaterial(p.side) ? MAX_VALUE-i : 0;
// in reality it's even more precise, e.g. 'side' can only mate in an odd number of plies,
// but I omit this here for simplicity
ASSERT(maxPossibleValue >= minPossibleValue); // of course
I do this mainly to save a little effort when a forced mate is found. The 'hasMatingMaterial' stuff seems to help in some special endgames where one side can't win anymore but the opponent struggles to win the game because the score often drops close to a draw.
The basic assumption is that a mate in fewer than i plies should (indeed, must) have been found during one of the previous iterations, so the current iteration cannot return a mate shorter than i. I know this might be broken by some aggressive pruning methods, but so far I don't use any of them.
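To illustrate the bound (with an assumed encoding of MAX_VALUE = 32000, which is not from the post): a mate delivered in 'plies' plies scores MAX_VALUE - plies, so any mate shorter than the current iteration number i scores strictly above MAX_VALUE - i and would already have been seen by an earlier iteration:

Code:
```cpp
// Illustrative mate-score encoding; MAX_VALUE = 32000 is an assumption.
const int MAX_VALUE = 32000;

// A mate delivered in 'plies' plies from the root.
int mateScore(int plies) { return MAX_VALUE - plies; }

// At iteration i, every mate in fewer than i plies should already have
// been found, so the tightest upper bound for the root value is:
int maxPossibleValue(int i, bool sideHasMatingMaterial) {
    return sideHasMatingMaterial ? MAX_VALUE - i : 0;
}
```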
I initialize the window with
Code:
int const windowSize = 30; // for example; signed, so the subtraction below stays in signed arithmetic
int alpha = max(minPossibleValue, bestValueOfLastIteration-windowSize);
int beta = min(maxPossibleValue, bestValueOfLastIteration+windowSize);
Finally, when an iteration fails low or high, I apply the following modified technique:
Code:
int v = falphabeta(p, alpha, beta, maxDepth);
if (v >= beta && v < maxPossibleValue) v = falphabeta(p, v, maxPossibleValue, maxDepth);
else if (v <= alpha && v > minPossibleValue) v = falphabeta(p, minPossibleValue, v, maxDepth);
This also allows stopping the whole search when (v == minPossibleValue && v < 0) or (v == maxPossibleValue && v > 0), because either condition indicates that the shortest possible mate has been found and no further iteration is necessary.
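Putting the pieces together, the per-iteration logic can be sketched as a small driver. This is my own assembly of the steps described above, not code from the post; the real `falphabeta(p, alpha, beta, maxDepth)` is stood in for by a `search(alpha, beta)` callback so the sketch is self-contained:

Code:
```cpp
#include <algorithm>
#include <functional>

struct IterResult {
    int value;
    bool stopSearch;   // shortest possible mate found, no further iteration needed
};

// One iteration of the scheme from the post: clamp the aspiration window
// to [minPossibleValue, maxPossibleValue], re-search on fail high/low with
// the fail-soft bound as the new window edge, then test for early stop.
IterResult iterate(const std::function<int(int, int)>& search,
                   int lastBest, int windowSize,
                   int minPossibleValue, int maxPossibleValue) {
    int alpha = std::max(minPossibleValue, lastBest - windowSize);
    int beta  = std::min(maxPossibleValue, lastBest + windowSize);
    int v = search(alpha, beta);
    if (v >= beta && v < maxPossibleValue)
        v = search(v, maxPossibleValue);        // fail high: widen upward only
    else if (v <= alpha && v > minPossibleValue)
        v = search(minPossibleValue, v);        // fail low: widen downward only
    bool stop = (v == minPossibleValue && v < 0) ||
                (v == maxPossibleValue && v > 0);
    return {v, stop};
}
```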
My construction implies a slightly extended understanding of a beta cutoff during search. The standard case is that at any node deep in the tree, v >= beta means that the current node's value will have no impact on the value of the root node. My extension is that beta may also be maxPossibleValue (of the root node, or even of the current node). In that case v >= beta can also mean that no further search is necessary because no remaining move can achieve a better value than v, even though v is not necessarily a refutation of the opponent's previous move.
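If I understand my own construction correctly, the in-tree form of it resembles what is usually called mate-distance pruning: at a node 'ply' plies from the root, no score can exceed a mate delivered on the next ply, nor fall below being mated right here, so the window can be clamped before searching. A sketch under an assumed mate encoding (MATE = 32000, not from the post):

Code:
```cpp
#include <algorithm>

// Assumed encoding: a mate delivered in 'plies' plies scores MATE - plies.
const int MATE = 32000;

// Clamp the window at a node 'ply' plies from the root. Returns true if
// the clamped window is empty, i.e. the node can be cut off immediately
// without that cutoff being a refutation of the opponent's last move.
bool clampWindow(int ply, int& alpha, int& beta) {
    alpha = std::max(alpha, -(MATE - ply));      // worst case: we are mated right here
    beta  = std::min(beta, MATE - (ply + 1));    // best case: we mate on the next ply
    return alpha >= beta;
}
```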
Now, while writing this description, I have the feeling that there may be some important points which I have missed and which might classify my modifications as "unusual". But I can't see them yet.
Any opinions about this?
Thanks in advance,
Sven