Roman Hartmann wrote: Hi,
Finally I implemented a transposition table in my engine. It works and helps quite a bit in the endgame. There is one little problem, though, which I'm not sure how to handle.
If I have a position like Fine #70, the key move is shown after about 1s at depth 25 with a convincing score. Everything is as expected, but as soon as I make a move I get a lot of cut-offs due to now-outdated data in the hash table and the search is messed up. This is easy to solve by clearing the hash table after a move is made and a new search is started, of course, but then I also lose the advantage of a transposition table in the middle game.
Actually, I have already fixed this problem by clearing the transposition table only in the endgame phase. This works, but it looks like an ugly hack to me.
So what's the best solution for an always overwrite scheme?
best regards
Roman
Ron Murawski wrote: Many engines have several overwrite entries and one depth-based entry for each hash table slot. The hash table needs all its depth-based entries decremented by 2 after the engine decides on its move. (Because when it is the engine's turn again, the depth-based entries in the hash are from the last search, which was 2 plies earlier.)
Why would you need to do that? Hash entries are stored with the depth they were searched with, which is independent of the root position. Don't decrement by 2; it's a waste of time and efficiency.
Roman Hartmann wrote: If I have a position like Fine #70, the key move is shown after about 1s at depth 25 with a convincing score. Everything is as expected, but as soon as I make a move I get a lot of cut-offs due to now-outdated data in the hash table and the search is messed up. This is easy to solve by clearing the hash table after a move is made and a new search is started, of course, but then I also lose the advantage of a transposition table in the middle game. Actually, I have already fixed this problem by clearing the transposition table only in the endgame phase. This works, but it looks like an ugly hack to me.
Would it be possible to post the Fine #70 position? I use depth+age to decide replacement in the TT. I am interested to see how this scheme fares in this position.
Roman Hartmann wrote:
If I have a position like Fine #70, the key move is shown after about 1s at depth 25 with a convincing score. Everything is as expected, but as soon as I make a move I get a lot of cut-offs due to now-outdated data in the hash table and the search is messed up.
This is easy to solve by clearing the hash table after a move is made and a new search is started, of course, but then I also lose the advantage of a transposition table in the middle game.
Actually, I have already fixed this problem by clearing the transposition table only in the endgame phase. This works, but it looks like an ugly hack to me.
So what's the best solution for an always overwrite scheme?
The best solution is to always overwrite!
How many entries do you have for each hash slot?
If your search gets messed up, then you have a bug. The worst thing that should ever happen is for the search to run slower, and since the new root is simply a position further along from the previous one, we should not see even that happen.
Do you have a cross-check piece of the hash key in your hash table element?
It sounds like you don't update your hash key properly at the root (when you make the move at game level).
Potential bug: you are supposed to store the search depth in the entries, i.e. the depth of the subtree growing from this position. You must not use the distance from the root of your search.
I don't know what end game or middle game have to do with it.
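To make that concrete, here is a minimal, self-contained sketch of an always-overwrite table that stores the depth of the subtree searched below the position (not the distance from the root) and keeps the full key as a cross-check. The names, sizes and entry layout are only illustrative, not taken from any engine discussed in this thread.

#include <stddef.h>
#include <stdint.h>

enum { BOUND_EXACT, BOUND_LOWER, BOUND_UPPER };   /* type of stored score */

typedef struct {
    uint64_t key;     /* full hash key, cross-checked on probe             */
    int16_t  score;
    uint8_t  depth;   /* depth of the subtree searched below this position */
    uint8_t  bound;
    uint16_t move;    /* best move found, if any                           */
} TTEntry;

#define TT_SIZE (1u << 20)                /* number of entries, power of two */
static TTEntry table[TT_SIZE];

/* Always overwrite: whatever sits in the slot is replaced unconditionally. */
static void tt_store(uint64_t key, int depth, int score, int bound, int move)
{
    TTEntry *e = &table[key & (TT_SIZE - 1)];
    e->key   = key;
    e->depth = (uint8_t)depth;
    e->score = (int16_t)score;
    e->bound = (uint8_t)bound;
    e->move  = (uint16_t)move;
}

/* Probe: usable only if the full keys match and the stored subtree depth
 * covers the depth still needed here.  Because that depth is relative to
 * the entry's own position, it stays valid after moves are played at game
 * level, so nothing has to be cleared between searches. */
static const TTEntry *tt_probe(uint64_t key, int depth)
{
    const TTEntry *e = &table[key & (TT_SIZE - 1)];
    if (e->key == key && e->depth >= depth)
        return e;
    return NULL;
}

The stored bound still has to be checked against the current alpha/beta window before the score is used for a cutoff; only the depth condition is shown here.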
Zach Wegner wrote: OK, that's something different from what I thought you were saying. I got the impression that you only stored the depth in the hash table, and that you ran through the entire table after each move and decremented each entry by 2.
Zach Wegner wrote: What you say is quite different. What I, and I think most others, do is to have a separate field in the hash table of just "age", which would be something like the move number. Then you base replacement decisions on age and depth, but you base cutoff decisions on depth only. This is cleaner IMO and involves minimal loss of information.
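For reference, a minimal sketch of what such a separate age field can look like (the names and layout are hypothetical, not Zach's actual code): the table keeps a generation counter that is bumped once per search, every entry remembers the generation that wrote it, and only the replacement code ever looks at that field.

#include <stdint.h>

typedef struct {
    uint64_t key;
    int16_t  score;
    uint8_t  depth;        /* used for cutoff decisions                    */
    uint8_t  generation;   /* search that wrote the entry; used only when
                              deciding which slot to overwrite             */
} AgedEntry;

static uint8_t tt_generation;        /* bumped at the start of every search */

static void tt_new_search(void)
{
    tt_generation++;                 /* existing entries become one search older */
}

/* Replacement preference: the lower this value, the more willing we are to
 * overwrite the entry.  Old and shallow entries go first; a cutoff test
 * elsewhere compares depth only and ignores the generation entirely. */
static int tt_keep_value(const AgedEntry *e)
{
    return (int)e->depth - 2 * (uint8_t)(tt_generation - e->generation);
}

Because the cutoff test never sees the generation, no depth information is lost; the age only decides which entry gets sacrificed when a slot is full.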
Roman Hartmann wrote: I guess I found the problem. As usual, it is something other than what I expected.
My 3-fold repetition detection is polluting my hash table with a lot of 0-values. If I disable any cut-off when the stored value is 0, the hash table works as expected.
So it seems I have to solve another question now: how do I prevent the 3-fold repetition detection from polluting my hash table with all those zeros?
I'm using a hash stack for repetition detection, and if there are two identical positions in the tree or in the game history, a draw score (0) is returned. A draw score is also returned if a position occurs in the tree that is already in the game history.
Not storing a 0 value in the hash table looks like a simple solution to my problem, but if there are better solutions I would like to know about them.
best regards
Roman
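One way to keep those path-dependent zeros out of the table is to run the repetition check before the hash probe and return immediately, and to be careful about storing scores that came from repetitions deeper in the tree. Here is a sketch of that ordering; Position, is_repetition, hash_probe, hash_store and search_moves are placeholder names of my own, not Roman's code.

#include <stdbool.h>

#define DRAW_SCORE 0

typedef struct Position Position;   /* the engine's position type (opaque here) */

/* Placeholders for the engine's own routines. */
bool is_repetition(const Position *pos);
bool hash_probe(const Position *pos, int depth, int alpha, int beta, int *score);
void hash_store(const Position *pos, int depth, int score);
int  search_moves(Position *pos, int alpha, int beta, int depth, int ply);

int search(Position *pos, int alpha, int beta, int depth, int ply)
{
    /* 1. Repetition check first: the draw score depends on the path that
     *    led here, not on the position itself, so return it directly and
     *    never let it near the hash table. */
    if (ply > 0 && is_repetition(pos))
        return DRAW_SCORE;

    /* 2. Only now probe the transposition table. */
    int tt_score;
    if (hash_probe(pos, depth, alpha, beta, &tt_score))
        return tt_score;

    int best = search_moves(pos, alpha, beta, depth, ply);

    /* 3. Store as usual.  Scores that bubbled up from a repetition deeper
     *    in the tree are still path-dependent; skipping the store for
     *    those (Roman's "don't store the 0") is one pragmatic answer, at
     *    the cost of also dropping genuine zero scores. */
    hash_store(pos, depth, best);
    return best;
}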
Ron Murawski wrote: There's no loss of information in combining these terms. For any depth-based replacement decision using an aging scheme, you need to calculate a "relative depth" from the depth and the move number. The choice is whether to store two numbers in the hash (to make the calculation) or simply store the calculated result in the table. It's important to realize that the real depth doesn't matter; it's the calculated "relative depth" that determines the replacement decision.
No, there's a loss of information when you make the cutoff decision. Imagine a 12-ply search from the starting position. After the e4 e5 Nf3 Nc6 subtree is searched (to 8 ply), you store the result in the hash table with a draft of 0 + (8 * 8). Then, after e4 e5 are played, you are doing a 10-ply search, and you hit the same position with a depth of 8 left. You're only going to accept hash entries with a draft of 2 + (8 * 8) or more, so the old entry won't cause a cutoff when it should.
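To put numbers on that, here is a small self-contained comparison of the two bookkeeping styles using the figures from the example above (PLY = 8 fractional units per ply and an aging offset of one unit per game ply, as in the example; the function names are made up): the combined number 0 + 64 fails the later test against 2 + 64, while a pure depth comparison on separate fields still succeeds.

#include <stdio.h>

#define PLY 8   /* fractional units per ply, as in the example above */

/* Style 1: one combined "relative depth" with the aging offset folded in.
 * stored = offset_at_store + draft * PLY; accept if it is at least
 * offset_now + needed * PLY. */
static int combined_cutoff(int stored, int offset_now, int needed_plies)
{
    return stored >= offset_now + needed_plies * PLY;
}

/* Style 2: depth and age kept in separate fields; the cutoff test looks
 * at the stored draft only, the age is used elsewhere for replacement. */
static int separate_cutoff(int stored_draft, int needed_draft)
{
    return stored_draft >= needed_draft;
}

int main(void)
{
    /* e4 e5 Nf3 Nc6 searched to 8 plies during a search from the initial
     * position (aging offset 0). */
    int stored_combined = 0 + 8 * PLY;   /* 64 */
    int stored_draft    = 8 * PLY;       /* 64, plus a separate age field */

    /* Two game plies later the same position is reached with 8 plies of
     * search still required. */
    printf("combined: %s\n",
           combined_cutoff(stored_combined, 2, 8) ? "cutoff"
                                                  : "no cutoff (64 < 66)");
    printf("separate: %s\n",
           separate_cutoff(stored_draft, 8 * PLY) ? "cutoff" : "no cutoff");
    return 0;
}

With separate fields the age influences only which entry gets overwritten, so no draft is ever rounded away at probe time.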
Ron Murawski wrote: You are in error when you assume I would only accept hash entries with a draft of 2 + (8 * 8) or more. Again, this is my fault because I'm not being clear enough. The aging is 1/4 ply per game ply, or 1/2 ply per chess move. The integer depth is unaffected until the 1/4 plies add up to a whole ply. (For pruning decisions I ignore fractional plies.)
OK, I see. There is a loss of information here, though. When you ignore the fractional plies, that means you're rounding down at some point. If the move numbers were not 0 and 2 but 3 and 4, then you would lose a full ply of useful hash information. This is also dangerous in that fractional plies can make a difference in the search, and you can accept cutoffs from entries with too low a draft. This last part I haven't tested personally, but I was told it by Bob Hyatt and Tord Romstad a few years ago on CCC, and it makes sense to me. A quarter or an eighth of a ply here and there ends up allowing a full ply later down the tree, which leads to seeing a checkmate earlier, and so on. It's also just a philosophical thing for me: I would rather not lose some good cutoffs and make some bad ones just to save a few bits in the hash table.
Ron Murawski wrote: In my testing it was quite rare to find missed pruning decisions. After playing several thousand test games I decided that the method worked fine in practice. Maybe someday I'll re-try your method and see if it makes any difference.
Zach, you said you base replacement decisions on age and depth --
What is your formula?
Ron
That is of course true. It's quite possible that there's no real measurable difference between the methods. If it works fine for you, then there's no reason to mess with it. But in this case I think it would be worth testing; it might bring you some precious Elo.
/* Pick a slot in this bucket to (possibly) overwrite. */
best_slot = 0;
best_depth = MAX_PLY * PLY + 1024;
for (x = 0; x < HASH_SLOT_COUNT; x++)
{
    hashkey = entry->entry[x].hashkey;
    data = entry->entry[x].data;
    if (hashkeys_match())
    {
        /* Same position already stored: keep the deeper entry,
           otherwise reuse this slot. */
        if (HASH_DEPTH(data) > sb->depth)
            return;
        else
        {
            best_slot = x;
            break;
        }
    }
    /* Otherwise prefer the slot whose entry is shallowest once an
       aging penalty of 2 plies per age step has been subtracted. */
    else if (HASH_DEPTH(data) - age_difference(HASH_AGE(data)) * 2 * PLY < best_depth)
    {
        best_slot = x;
        best_depth = HASH_DEPTH(data) - age_difference(HASH_AGE(data)) * 2 * PLY;
    }
}
/* Check to see if we should overwrite the entry: never replace an exact
   entry from the current search with a bound-only entry. */
if (HASH_TYPE(entry->entry[best_slot].data) != HASH_EXACT_BOUND || type == HASH_EXACT_BOUND ||
    age_difference(HASH_AGE(entry->entry[best_slot].data)) > 0)
{
    store_position();
}
}   /* end of the enclosing store routine (header not shown) */
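The age_difference() used above is not shown in the snippet; a plausible reading (an assumption on my part, not necessarily Zach's implementation) is a wrap-around difference against a global counter that is bumped once per search or per game move:

/* Hypothetical companion to the snippet above: hash_age is a global
 * counter bumped once per search (or per game move) and stored in each
 * entry's age bits; the difference is taken modulo the width of the age
 * field so the counter may wrap. */
#define HASH_AGE_BITS 6
#define HASH_AGE_MASK ((1 << HASH_AGE_BITS) - 1)

static unsigned int hash_age;    /* current search generation */

static int age_difference(unsigned int entry_age)
{
    return (int)((hash_age - entry_age) & HASH_AGE_MASK);
}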