Extra hex codes for colors? - How do six bytes make a color?


Post by Baak »

Hi All,

Just curious: in the networking file your primary/secondary player colors appear to be coded in bytes 0x374-0x379 / 0x37c-0x381 respectively.

What I don't understand is why each one appears to have three extra bytes. As in:

RR ?? GG ?? BB ??

What's up with the extra bytes? Do they mean anything? Is this The Secret Key to the Mac?!? :;):

Post by CIK »

Here is the color struct used in the networking prefs

typedef unsigned short word;	/* 16-bit integer (see the note below) */

struct rgb_color
{
	word red, green, blue;

	word flags;
};

struct rgb_color primary_color, secondary_color;

NOTE: word is a 16-bit integer, so each one uses 2 bytes.
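
To tie that back to the bytes you listed: here's a rough sketch (my own, not Myth's actual code) of how those six bytes map onto the struct, assuming the prefs file stores each word big-endian (high byte first) as classic Mac OS does. The RR byte is the high byte of red and the "extra" byte after it is the low byte; the two bytes sitting between your two ranges would be the flags word.

#include <stdint.h>

/* Sketch only: read one 16-bit word stored high byte first. */
static uint16_t read_word_be(const uint8_t *p)
{
	return (uint16_t)((p[0] << 8) | p[1]);
}

/* Sketch only: pull the three components out of the six bytes RR ?? GG ?? BB ?? */
static void read_player_color(const uint8_t *p,
                              uint16_t *red, uint16_t *green, uint16_t *blue)
{
	*red   = read_word_be(p + 0);	/* RR is the high byte, ?? the low byte */
	*green = read_word_be(p + 2);
	*blue  = read_word_be(p + 4);
	/* the flags word would follow at p + 6 */
}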

Hope this helps you better understand what you're looking at.

Post by Baak »

Thanks CIK! :D

I had a feeling it was doing that; I've just never seen 16 bits for each color component (R/G/B). I'll have to investigate how to display this properly in another app I am making.

Is there a special name for this type of color coding? A search for "16-bit RGB" will probably turn up formats that use 16 bits total, not 16 bits per component.

Thanks for any insights! :D

Post by CIK »

Here is how Bungie turns the 3 components into a 32-bit color for bitmap drawing. a is the alpha channel, which is hardcoded in Myth at various places.

/* Packs the high byte of each 16-bit component into a 32-bit 0xAARRGGBB pixel. */
#define RGBCOLOR_TO_PIXEL32(a,r,g,b) \
	(((((pixel32)(a))<<16)&0xff000000) | \
	 ((((pixel32)(r))<<8)&0x00ff0000) | \
	 ((((pixel32)(g)))&0x0000ff00) | \
	 ((((pixel32)(b))>>8)&0x000000ff))
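
For example (a sketch of my own, assuming pixel32 is a 32-bit unsigned integer), packing the primary color with a fully opaque alpha of 0xffff would look like this:

#include <stdint.h>

typedef uint32_t pixel32;	/* assumption: pixel32 is a 32-bit unsigned integer */

/* Sketch: build an 0xAARRGGBB pixel from the 16-bit words in the prefs;
   the macro keeps only the high byte of each component. */
pixel32 pack_primary(struct rgb_color c)
{
	return RGBCOLOR_TO_PIXEL32(0xffff, c.red, c.green, c.blue);
}

In other words, for an ordinary 8-bit-per-channel display you can just take the high byte of each word (e.g. red >> 8).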