bob wrote: Did you _read_ my last comment? 64-bit Windows Vista + 64-bit gcc gives the expected sizeof(long) = 8. So I _did_ test Windows. I don't use the Microsoft C++ compiler, so I have no idea what they do. But anyone that makes sizeof(long) anything other than 8 on a 64-bit architecture is certainly brain-dead. But MS has never let that stop them in the past...
BTW the Crafty code you quoted has been there since 1995 or so. More recent versions use long on HAS_64BITS architectures under Linux. I've never changed the Windows stuff, since __int64 has been around since 1995 as well...
You might want to read the C standard documents sometime. Especially the C99 one, which actually talks about 64-bit integers.
I actually have looked at this. Many times. And yes, I am aware that the standard does not say exactly what "long" means, just that it must be at least as large as int in all cases.
However, for as long as I can remember, "long" has always been "the longest integer data type the hardware supports." Which is both logical and rational. Cray has always had int = long, since it is a 64-bit architecture. With the quirk that pointers were always 32 bits, because the A-registers (used for address computations) were 32 bits (eventually; early on they were 24 bits).
But for "long" on true 64-bit systems, I have yet to find a single case where long is < 64 bits, except, apparently, some version of the Microsoft C compiler (I have not verified which versions, if any, or if all, follow this rather odd practice). I did, long ago, verify that at least for Linux, on any 64-bit platform I have tried, long = 64 bits. For x86_64, both gcc and icc follow this. And on my home Windows Vista 64 box, gcc does the same.
Not much more to say on the subject. Yes, the "standard" is lousy in many ways. For example, why is an int always signed by default, while a char may be signed or unsigned at the whim of the compiler writer? "long long" was a non-standard extension that I used in 1995 in the first versions of Crafty, running on a 64-bit Sun SPARC as well as on 32-bit x86/Linux boxes.
short is defined as at least 16 bits.
int is defined as the natural word size, and at least 16 bits. It is *not* required to be 32 or 64 bits. It can be 16 bits, even. It's up to the compiler writer. Implementation defined. On a 32-bit system it should be 32 bits, of course. On a 64-bit system, a 64-bit int would be the more appropriate size.
long is defined as at least 32 bits. On a 64-bit system, it can be either 32 or 64 bits.
long long is defined as at least 64 bits.
So it is entirely possible for 'int' to be 16 bits, 'long' 32 bits, and 'long long' 64 bits. Not likely, of course. But possible and standard-conforming.
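Those minimums can be verified at compile time from the limits.h constants. A minimal sketch using C11's _Static_assert (on older compilers the same checks can be done with #if on the same constants):

```c
#include <limits.h>

/* The standard guarantees only these minimum ranges:
 *   short     : at least 16 bits
 *   int       : at least 16 bits
 *   long      : at least 32 bits
 *   long long : at least 64 bits (C99)
 * These checks pass on any conforming implementation, regardless
 * of the actual sizes the compiler writer chose. */
_Static_assert(SHRT_MAX  >= 32767,                 "short must hold 16 bits");
_Static_assert(INT_MAX   >= 32767,                 "int must hold 16 bits");
_Static_assert(LONG_MAX  >= 2147483647L,           "long must hold 32 bits");
_Static_assert(LLONG_MAX >= 9223372036854775807LL, "long long must hold 64 bits");
```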
That's why the C99 standard introduced the stdint.h header: to give the programmer a portable way to be guaranteed of having the right size integers when they need them.
C89 doesn't even know about 64-bit integers, so the compiler writer is on his own and the programmer should actually check. In this case, 'int' is often 32 bits and 'long' 64 bits. But that violates the C convention that 'int' is the machine word size.
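Pre-C99 code that needs a 64-bit type usually has to probe limits.h itself at preprocessing time. A sketch of the usual approach (my_int64 is a made-up name for illustration, not a standard type):

```c
#include <limits.h>

/* C89 has no stdint.h, so portable code picks its own 64-bit type.
 * LONG_MAX > 2^31 - 1 implies long is wider than 32 bits. */
#if LONG_MAX > 2147483647L
typedef long my_int64;        /* long already holds 64 bits here        */
#elif defined(_MSC_VER)
typedef __int64 my_int64;     /* Microsoft's non-standard 64-bit type   */
#else
typedef long long my_int64;   /* common compiler extension before C99   */
#endif
```

This is essentially the dance that stdint.h now does for you.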
Of course, we've had this discussion before, and I'm well aware you don't like the C99 standard because not all compilers (<cough> Microsoft <cough>) support it.
But don't go saying that 'long' *must* be 64 bits, because it doesn't have to be.
I do not believe I said "it must be 64 bits". I rather said that it has been 64 bits on _every_ 64 bit architecture and compiler combination I have tried. You can look at the Crafty Makefile to see just how comprehensive that list of architectures and O/S choices is. Apparently there is / was / could-be some MS compiler that is/was brain-dead...
And that would even violate tradition and break a large number of programs that expect it at 32 bits. (Yes yes, I know programmers shouldn't write a program that expects 'int' or 'long' to be a certain size without first checking, but many programs have done so in the past 25 years. Very few people these days actually bother to check to make sure a var type is the right size. They just assume 'int' is 32 bits and 'long' is probably 32 bits too.)
I'm not saying you should change your code because it's wrong or ugly or doesn't work or anything else. I'm just saying that 'long' does *NOT* have to be 64 bits. The C standard is pretty clear about that.
If you want to say that 'long' is 64 bits on the compilers you use and so that's what you program for, then that's a perfectly fine statement.
Actually, I think that _is_ what I said.