
Posted: Mon Sep 27, 2004 3:36 am
by Baak
Hi All,

Just curious: in the networking file your primary/secondary player colors appear to be coded in bytes 0x374-0x379 / 0x37c-0x381 respectively.

What I don't understand is why they appear to have three extra bytes in each? As in:

RR ?? GG ?? BB ??

What's up with the extra bytes? Do they mean anything? Is this The Secret Key to the Mac?!? ;)

Posted: Mon Sep 27, 2004 5:44 am
by CIK
Here is the color struct used in the networking prefs

struct rgb_color
{
    word red, green, blue;
    word flags;
};

struct rgb_color primary_color, secondary_color;

NOTE: word is a 16-bit integer, so it uses 2 bytes.

Hope this helps you better understand what you're looking at.
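
If you want to pull the colors out yourself, here is a rough sketch (untested, the helper names are just for illustration, and it assumes the prefs data is big-endian like the rest of the Mac data) of how one of those 8-byte blocks could be read back into the struct:

#include <stdint.h>

typedef uint16_t word;   /* 16-bit integer, as noted above */

struct rgb_color
{
    word red, green, blue;
    word flags;
};

/* Read one big-endian 16-bit word from a raw byte buffer. */
static word read_word_be(const unsigned char *p)
{
    return (word)((p[0] << 8) | p[1]);
}

/* Parse an rgb_color from 8 consecutive bytes of the prefs file,
   e.g. parse_color(buf + 0x374) for the primary color if that
   offset is right. */
static struct rgb_color parse_color(const unsigned char *p)
{
    struct rgb_color c;
    c.red   = read_word_be(p + 0);
    c.green = read_word_be(p + 2);
    c.blue  = read_word_be(p + 4);
    c.flags = read_word_be(p + 6);
    return c;
}

The ?? bytes you saw would just be the low byte of each 16-bit component.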

Posted: Mon Sep 27, 2004 12:18 pm
by Baak
Thanks CIK! :D

I had a feeling it was doing something like that - I've just never seen 16 bits for each color (R/G/B). I'll have to investigate how to display this properly in another app I am making.

Is there a special name for this type of color encoding? A search for "16-bit RGB" will probably turn up formats that use 16 bits total rather than 16 bits for each color.

Thanks for any insights! :D

Posted: Tue Sep 28, 2004 6:20 am
by CIK
Here is how Bungie turns the 3 components into a 32-bit color for bitmap drawing. a is the alpha channel, which is hardcoded in Myth at various places.

#define RGBCOLOR_TO_PIXEL32(a,r,g,b) (((((pixel32)(a))<<16)&0xff000000) | ((((pixel32)®)<<8)&0x00ff0000) | ((((pixel32)(g)))&0x0000ff00) | ((((pixel32)(b))>>8)&0x000000ff))