Re: Bit Flags

From: Mike Breuer (mbreuer@new.rr.com)
Date: 06/05/01


----- Original Message -----
From: "Patrick M. O'Laughlin" <pmolaughlin@HOME.COM>
Sent: Wednesday, June 06, 2001 12:30 AM


> I went ahead and changed the bitvectors to unsigned long long.  However,
> whenever I set one of the "over 31" flags, it also sets the flag at
> (flag - 32).  What else do I need to do besides just changing the
> variable definition?
>
> Thanks for your help thus far.

I'm already breaking my self-imposed "after midnight" rule, but here goes...

You are probably having a problem with integer constants not being
promoted to long long.  For example, in the expression:

(1 << 32)

1 is taken by the compiler to be a plain int, so the shift is done at
32-bit width.  Shifting a 32-bit int by 32 or more is undefined behavior;
on x86 the shift count is effectively taken modulo 32, so (1 << 32)
behaves like (1 << 0).  That is exactly why setting flag N also sets
flag N - 32.  To get around this, I created a macro:

#define ULL(x) ((unsigned long long)(x)) /* or ((bitvector_t)(x)) */

And then:

#define MY_FLAG (ULL(1) << 32)

There were also some spots in the OLC code, and a few other places,
where I had to add the macro.
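
If you want a quick sanity check, here is a small standalone test
program (the typedef and the flag name are just for illustration; in
the MUD itself bitvector_t comes from the headers):

#include <stdio.h>

typedef unsigned long long bitvector_t;

#define ULL(x)   ((bitvector_t)(x))
#define MY_FLAG  (ULL(1) << 32)  /* bit 32, computed at 64-bit width */

int main(void)
{
  bitvector_t flags = 0;

  flags |= MY_FLAG;

  /* Bit 32 is set; bit 0 (the "flag minus 32" ghost) is not. */
  printf("bit 32: %d\n", (flags & (ULL(1) << 32)) != 0);
  printf("bit 0:  %d\n", (flags & ULL(1)) != 0);
  return 0;
}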

Mike



