Actually, it looks like it comes from a cast of the defined TIOCMBIS/TIOCMBIC values to signed int and then back to unsigned long. The compiler believes we are asking it to preserve the integer value, not the bit pattern:
printf("%ld %ld\n", sizeof(int), sizeof(unsigned long));
4 8
printf("%lx %lx\n", TIOCMBIS, TIOCMBIC);
8004746c 8004746b
unsigned long a = TIOCMBIS; unsigned long b = TIOCMBIC; printf("%lx %lx\n", a, b);
8004746c 8004746b
int c = TIOCMBIS; int d = TIOCMBIC; printf("%x %x\n", c, d); printf("%lx %lx\n", (unsigned long)c, (unsigned long)d);
8004746c 8004746b ffffffff8004746c ffffffff8004746b
unsigned int e = TIOCMBIS; unsigned int f = TIOCMBIC; printf("%x %x\n", e, f); printf("%lx %lx\n", (unsigned long)e, (unsigned long)f);
8004746c 8004746b 8004746c 8004746b
Linux uses much smaller values for these constants (0x5416 and 0x5417, which fit in 2 bytes), so this unwanted sign extension cannot happen there, and we should not see the same issue on 64-bit Linux.
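For illustration, here is a self-contained version of the experiment above, as a minimal sketch; the 0x8004746c value is hard-coded so it compiles anywhere and stands in for the Mac OS X definition of TIOCMBIS:

#include <stdio.h>

/* Hard-coded stand-in for the Mac OS X value of TIOCMBIS quoted above
 * (on Linux the constant is only 0x5416, so the problem never appears). */
#define REQUEST 0x8004746cUL

int main(void)
{
    /* Going through a signed 32-bit int: the value exceeds INT_MAX, so it
     * typically wraps to a negative number ... */
    int via_int = (int) REQUEST;
    /* ... and converting back to unsigned long sign-extends it. */
    printf("%lx\n", (unsigned long) via_int);   /* ffffffff8004746c on LP64 */

    /* Going through an unsigned int keeps the bit pattern intact. */
    unsigned int via_uint = (unsigned int) REQUEST;
    printf("%lx\n", (unsigned long) via_uint);  /* 8004746c */

    return 0;
}

So the fix is simply to keep the request code in an unsigned type (unsigned int or unsigned long) so that no sign extension takes place.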
Pascal
On 07 Jul 2012, at 11:09, Henrik wrote:
On 07.07.12 10:59, Jef Driesen wrote:
I'm not seeing any problems on 64bit linux, but maybe the actual value fits into an int, and on macosx it does not? Can someone check the actual values for TIOCMBIC and TIOCMBIS on 32 and 64bit macosx (using unsigned long of course).
Is this the way to do it?
printf("%ld %ld\n", TIOCMBIS, TIOCMBIC);
In that case, I've got this for 32bit:
-2147191700 -2147191701
... and this for 64bit:
2147775596 2147775595
Henrik
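For reference, a minimal sketch of the check Jef asked for, printing the constants as unsigned long so they are never interpreted as negative numbers (this assumes <sys/ioctl.h> provides TIOCMBIS/TIOCMBIC, which it does on both Linux and Mac OS X):

#include <stdio.h>
#include <sys/ioctl.h>  /* pulls in TIOCMBIS / TIOCMBIC on Linux and Mac OS X */

int main(void)
{
    /* Cast to unsigned long before printing, as suggested, so the output
     * is the same hexadecimal value on 32-bit and 64-bit builds. */
    printf("%lx %lx\n", (unsigned long) TIOCMBIS, (unsigned long) TIOCMBIC);
    return 0;
}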