Daniel Shawul wrote:...
To Jim:
I have sent the source code to the email address provided on your website.
Daniel
I can't see it anywhere. Did you already send me the source code? Wouldn't it be possible to host it at sourceforge.net?
Volker
Volker wrote:I can't see it anywhere. Did you already send me the source code? Wouldn't it be possible to host it at sourceforge.net?
I did send a second email with the attachment. Maybe it is in your bulk folder?
Daniel Shawul wrote:Hi Andreas
Thanks for the suggestions! I have now modified the source code to handle 64-bit integers; defining ARC_64BIT at the beginning of scorpio.h will fix these problems. In eval it uses 64-bit bitboards for addressing the attack tables, so there won't be a problem there. I am surprised I hadn't noticed this while trying to compile 64-bit Scorpio. Surely earlier versions of 64-bit Scorpio never worked properly!
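For reference, here is a minimal sketch of how such ARC_64BIT-guarded typedefs might look. The ARC_64BIT macro and the BMP/UBMP type names appear elsewhere in this thread; the definitions below are an assumption, not Scorpio's actual scorpio.h:

/* Hypothetical fixed-width typedefs; pick the 64-bit type per data model. */
#ifdef ARC_64BIT
    typedef long               BMP64;   /* LP64: long is 64-bit         */
    typedef unsigned long      UBMP64;
#else
    typedef long long          BMP64;   /* ILP32/LLP64: use long long   */
    typedef unsigned long long UBMP64;
#endif
typedef int            BMP32;   /* int is 32-bit in all of these models */
typedef unsigned int   UBMP32;
typedef short          BMP16;
typedef unsigned short UBMP16;
typedef signed char    BMP8;
typedef unsigned char  UBMP8;
typedef UBMP64         HASHKEY; /* 64-bit hash key */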
Daniel Shawul wrote:As to the book, I don't know why it doesn't work. I am not an expert in big endian/little endian stuff, but will there be a problem as long as I read/write in words?
regards
Daniel
PS: Scorpio 1.9 will be released today, where I tried to fix the problem.
Not necessarily. Because of the different data models of 64-bit systems, it can run fine on some systems/compilers and not on others. I'm not sure about Windows, but my 64-bit Mac setup behaves like LLP64 (long is still 32-bit and only long long is 64-bit), whereas 64-bit Unix/Linux uses LP64 and has a 64-bit long. Thus your source ran fine on my Mac even when compiled in 64-bit mode, but not on Unix. Therefore it's good to just define int32, int64, etc. at the beginning and use them in the code (as you did almost everywhere). I found this page useful for reading about the different data type models:
http://www.unix.org/whitepapers/64bit.html
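A quick way to see which data model a given compiler/platform uses is to print the sizes of the basic types. This small check program is only an illustration, not part of Scorpio:

#include <stdio.h>

int main(void) {
    printf("int:       %u bytes\n", (unsigned)sizeof(int));
    printf("long:      %u bytes\n", (unsigned)sizeof(long));
    printf("long long: %u bytes\n", (unsigned)sizeof(long long));
    printf("void*:     %u bytes\n", (unsigned)sizeof(void*));
    /* ILP32 (32-bit systems):       4 / 4 / 8 / 4
       LP64  (64-bit Unix/Linux):    4 / 8 / 8 / 8
       LLP64 (64-bit Windows):       4 / 4 / 8 / 8 */
    return 0;
}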
Normally, the endianness problems go away if the book is created on the same architecture it is used on. So if your book author gets access to a Mac sometime, he could create a book for the Mac.
By the way, word size on 32-bit and 64-bit systems may differ, so is the old book still compatible in 64-bit?
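One way to make a book file independent of both endianness and word size is to read each stored word byte by byte and assemble it in a fixed byte order. This is only a sketch of the technique, not Scorpio's actual book code:

#include <stdio.h>

typedef unsigned long long UBMP64;   /* assumed 64-bit unsigned type */

/* Read a 64-bit word stored in little-endian byte order, regardless
 * of the byte order of the machine doing the reading. */
static UBMP64 read_le64(FILE* f) {
    unsigned char b[8];
    UBMP64 v = 0;
    int i;
    if (fread(b, 1, 8, f) != 8) return 0;   /* caller should check for EOF */
    for (i = 0; i < 8; i++)
        v |= (UBMP64)b[i] << (8 * i);
    return v;
}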
Daniel Shawul wrote:Yes, the old code just assumed long to be 32 bits.
But I am confused about something else. In some places, the hash table entry for example, I want a certain known size:
typedef struct tagHASHREC {
    HASHKEY hash_key;
    BMP32 move;
    BMP16 score;
    UBMP8 depth;
    UBMP8 flags;
} HASHREC, *PHASHREC;
Now I am assuming that the total size is 128 bits per entry. But BMP32 could actually be 64 bits and BMP16 could be 32 bits on some systems, so I would end up with different hash table entry sizes. Using the word size of the specific processor is good for making fast lookups, but I am not sure whether that can compensate for the loss in performance due to the decreased number of hash table entries!?
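One way to pin the entry down to exactly 16 bytes (128 bits) on every platform, so the number of entries per megabyte stays the same, is to use fixed-width typedefs plus a compile-time size check. The sketch below follows the type names in the post, but the definitions and the check are assumptions, not Scorpio's actual code:

typedef unsigned long long HASHKEY;  /* 64 bits */
typedef int                BMP32;    /* 32 bits under ILP32/LP64/LLP64 */
typedef short              BMP16;    /* 16 bits */
typedef unsigned char      UBMP8;    /*  8 bits */

typedef struct tagHASHREC {
    HASHKEY hash_key;   /* 8 bytes */
    BMP32   move;       /* 4 bytes */
    BMP16   score;      /* 2 bytes */
    UBMP8   depth;      /* 1 byte  */
    UBMP8   flags;      /* 1 byte  */
} HASHREC, *PHASHREC;   /* 16 bytes total, no padding needed */

/* Compile-time size check: the array size becomes negative (a build
 * error) if padding or a wrong typedef changes the entry size. */
typedef char hashrec_size_check[sizeof(HASHREC) == 16 ? 1 : -1];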
regards
Daniel