Wow - what a heap of half-truths...
>>>Preacher: Here's a thought: Why would you wanna run in 32-bit color? The human eye is only capable of distinguishing in the neighborhood of 1 million colors. Both 24 and 32 bit color go WAY beyond this.<<<
ROFLMAO. I cannot distinguish 24 from 32 bit color as there is _NO_ difference between the two. 32 bit color only uses 24 bits; the remaining 8 are completely wasted space. 32 bit is just faster because it is machine-word sized, unlike that odd 24-bit format.
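To illustrate what I mean (a rough sketch, assuming the common X8R8G8B8 layout - the function names here are just made up for the example):

```python
# Sketch: how a 24-bit color sits inside a 32-bit pixel (X8R8G8B8).
# The top byte is pure padding so each pixel fills one machine word.
def pack_xrgb8888(r, g, b):
    return (r << 16) | (g << 8) | b  # bits 24-31 stay zero: wasted space

def unpack_xrgb8888(pixel):
    return ((pixel >> 16) & 0xFF, (pixel >> 8) & 0xFF, pixel & 0xFF)

pixel = pack_xrgb8888(200, 100, 50)
assert unpack_xrgb8888(pixel) == (200, 100, 50)
assert (pixel >> 24) == 0  # the "extra" 8 bits carry no color information
```

Reading and writing whole 32-bit words is what makes it faster than packing three bytes per pixel.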
I can easily distinguish between 16 and 24 bit, however. About everyone can, I'd dare to say. You actually say it yourself when you claim that you need 1 million colors, while 16 bit gives only 65,536 different ones. You can see this easily when you try to do a smooth transition from one color to another ('rainbow'): you'll see 'massive' banding effects then. I agree that you won't notice it on pictures without smooth transitions, however.
>>>Wedge009: That may be true, Preacher, but I'm sure the extra bits are used for things like transparancies. Similarly, the Human ear cannot distinguish wave resolutions of more than 16-bits, but recording studios still use 24-bit and 32-bit waves and dither back to 16-bit when they're done.<<<
No, they aren't used for transparencies. At least not on the PC. While there is theoretical stuff like alpha channels, it is not used in this encoding.
As far as this "dither back to 16-bit" goes - well, that is obvious. You only reduce precision at the last possible step to avoid accumulating rounding errors.
>>>Wedge009:Too, the 32-bit extension is not just about colour depth, I think it's supposed to do other things like increase the viewing distance of asteroids.<<<
I don't think it increases any distances, AFAIK.
But as Quatro mentions, some artefacts disappear, and there is a reason for that as well. IIRC our honored engine hacker once said that he could not change the z-buffer to use 32 bit while keeping the rendering at 16 bit. So this is actually a different thing than color depth: for UE, 32 bit color depth is tied to 32 bit "3D-depth" precision, which reduces clipping errors a good deal.
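A hedged sketch of why the depth precision matters (this is not actual engine code, just an illustration with made-up values):

```python
# With a 16-bit z-buffer, two surfaces at almost the same depth can land
# in the same quantized bucket and "fight"; with 32 bits they stay distinct.
def depth_to_buffer(z, bits):
    max_val = (1 << bits) - 1
    return int(z * max_val)  # z assumed normalized to 0.0..1.0

z_wall, z_decal = 0.500000, 0.500004  # two nearly coplanar surfaces

print(depth_to_buffer(z_wall, 16) == depth_to_buffer(z_decal, 16))  # True: same bucket -> clipping artefact
print(depth_to_buffer(z_wall, 32) == depth_to_buffer(z_decal, 32))  # False: resolved cleanly
```

That collision at 16 bit is the kind of clipping/z-fighting artefact that disappears in 32-bit mode.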
Finally, as far as the original poster goes, unfortunately I don't know what goes wrong there either. It might be that 32 bit only works on Nvidia cards?