Can't run in 32-bit color


Unregistered

Guest
For all of the resolutions in 32-bit color, I get this error message:

GL_d3d_5::createZBuffer attach failed 87


This is with a Radeon 9700 Pro. Tried it with and without AA & AF.

Any other thoughts??
 
I don't think the 32-bit extension works for everyone. Maybe it has something to do with having Glide working as well; I don't know.
 
Originally posted by Unregistered
For all of the resolutions in 32-bit color, I get this error message:

GL_d3d_5::createZBuffer attach failed 87


This is with a Radeon 9700 Pro. Tried it with and without AA & AF.

Any other thoughts??
Here's a thought: Why would you wanna run in 32-bit color? The human eye is only capable of distinguishing in the neighborhood of 1 million colors. Both 24- and 32-bit color go WAY beyond this.

In other words, 16-bit color is the highest anyone needs to go. Anything beyond that is wasted, not to mention the fact that higher color depths suck more of your GPU/CPU's resources, and (as you're finding out) can be more problematic as well.
 
That may be true, Preacher, but I'm sure the extra bits are used for things like transparencies. Similarly, the human ear cannot distinguish wave resolutions of more than 16 bits, but recording studios still use 24-bit and 32-bit waves and dither back to 16-bit when they're done.

Also, the 32-bit extension is not just about colour depth; I think it's supposed to do other things, like increase the viewing distance of asteroids.
 
Re: Re: Can't run in 32-bit color

Originally posted by Preacher
In other words, 16-bit color is the highest anyone needs to go. Anything beyond that is wasted, not to mention the fact that higher color depths suck more of your GPU/CPU's resources, and (as you're finding out) can be more problematic as well.
Nonetheless, 32-bit mode does look better. In 16-bit mode, parts of objects are sometimes drawn incorrectly in WCP - so, for example, a turret might appear to be partially inside the capship; an edge at an angle to the screen might appear to be a zigzag instead of a straight line. In 32-bit mode, these sorts of problems disappear.
 
Wow - what a heap of half-truths...

>>>Preacher: Here's a thought: Why would you wanna run in 32-bit color? The human eye is only capable of distinguishing in the neighborhood of 1 million colors. Both 24- and 32-bit color go WAY beyond this.<<<

ROFLMAO. I cannot distinguish 24 from 32-bit color, as there is _NO_ difference between the two. 32-bit color only uses 24 bits; the remaining 8 are completely wasted space. It is just that 32-bit is faster, as it is machine-sized compared to that odd 24-bit format.
I can easily distinguish between 16 and 24 bit, however - just about everyone can, I'd dare to say. You actually say it yourself when you claim that you need 1 million colors, while 16-bit only gives 65,536 different ones. This can be seen easily when you try to do a smooth transition from one color to another ('rainbow') - you'll see 'massive' banding effects then. I agree that you won't notice it on pictures without smooth transitions, however.
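
To illustrate both points, here's a rough C sketch - nothing from the actual patch code, just the standard pixel packings, so take it as an illustration only. A 32-bit pixel carries the same 24 bits of colour as a 24-bit one plus 8 bits of padding, while a 16-bit pixel squeezes the channels into 5-6-5 bits, so a smooth 256-step grey ramp collapses into far fewer distinct values - that's the banding.

Code:
#include <stdio.h>
#include <stdint.h>

/* 32-bit "XRGB8888": 8 bits per channel, the top byte is unused padding,
 * so 24-bit and 32-bit colour carry exactly the same information; the
 * extra 8 bits only keep each pixel word-aligned. */
static uint32_t pack_xrgb8888(uint8_t r, uint8_t g, uint8_t b)
{
    return ((uint32_t)r << 16) | ((uint32_t)g << 8) | (uint32_t)b;
}

/* 16-bit "RGB565": 5 bits red, 6 bits green, 5 bits blue. */
static uint16_t pack_rgb565(uint8_t r, uint8_t g, uint8_t b)
{
    return (uint16_t)(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

int main(void)
{
    /* A grey ramp has 256 distinct shades in 24/32-bit colour; count how
     * many distinct pixel values survive the trip through 16-bit. */
    int distinct = 0;
    uint16_t prev = 0xFFFF;
    for (int v = 0; v < 256; v++) {
        uint16_t p = pack_rgb565((uint8_t)v, (uint8_t)v, (uint8_t)v);
        if (p != prev) { distinct++; prev = p; }
    }
    printf("distinct 16-bit values in a 256-step grey ramp: %d\n", distinct); /* prints 64 */
    printf("32-bit pixel for mid grey: 0x%08X\n",
           (unsigned)pack_xrgb8888(128, 128, 128));
    return 0;
}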

>>>Wedge009: That may be true, Preacher, but I'm sure the extra bits are used for things like transparencies. Similarly, the human ear cannot distinguish wave resolutions of more than 16 bits, but recording studios still use 24-bit and 32-bit waves and dither back to 16-bit when they're done.<<<

No, they aren't used for transparencies - at least not on the PC. While there is theoretical stuff like alpha channels, that is not used in this encoding.
As far as this "dither back to 16-bit" goes - well, that is obvious. You only reduce the data at the last possible step to avoid accumulating rounding errors.
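
Roughly what that last-step reduction looks like, as a toy C sketch (assuming plain signed integer samples; real studio dither is more sophisticated, but the principle - keep the extra precision until the very end, then add a little noise before truncating - is the same):

Code:
#include <stdint.h>
#include <stdlib.h>

/* Reduce one 24-bit sample (stored in an int32_t, roughly +/- 2^23) to 16 bits.
 * TPDF dither: add two uniform random values spanning +/- one 16-bit step
 * before truncating, so the rounding error turns into low-level noise
 * instead of correlated distortion. Toy sketch only. */
static int16_t to_16bit_dithered(int32_t sample24)
{
    int32_t step = 1 << 8;                    /* one 16-bit LSB in 24-bit units */
    int32_t dither = (rand() % step) + (rand() % step) - step;
    int32_t v = sample24 + dither;
    if (v >  0x7FFFFF) v =  0x7FFFFF;         /* clamp to the 24-bit range */
    if (v < -0x800000) v = -0x800000;
    return (int16_t)(v >> 8);                 /* drop the low 8 bits */
}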

>>>Wedge009: Also, the 32-bit extension is not just about colour depth; I think it's supposed to do other things, like increase the viewing distance of asteroids.<<<

I don't think it increases any distances, AFAIK.
But as Quarto mentions, some artefacts disappear. There is a reason for that as well. IIRC, our honored engine hacker ;) once said that he could not change the z-buffer to use 32 bit while keeping the rendering at 16 bit. So actually this is a different thing than color depth. It is just that for UE, 32-bit color depth is connected to 32-bit "3D depth" precision, which reduces clipping errors a good deal.
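
To put rough numbers on the precision difference, here's a quick C sketch. It assumes a simple linear depth mapping, which is not exactly what the hardware does (real z-buffers store something like 1/z, which makes long-range precision even worse), so treat it as an illustration only:

Code:
#include <stdio.h>
#include <stdint.h>

/* Quantise a depth in [znear, zfar] to an n-bit depth-buffer value.
 * Linear mapping for simplicity; the point is just how few distinct
 * depth values 16 bits gives you across a big scene. */
static uint32_t quantize_depth(double z, double znear, double zfar, int bits)
{
    double t = (z - znear) / (zfar - znear);   /* 0..1 across the view range */
    uint32_t levels = (bits >= 32) ? 0xFFFFFFFFu : ((1u << bits) - 1u);
    return (uint32_t)(t * levels + 0.5);
}

int main(void)
{
    double znear = 1.0, zfar = 50000.0;        /* a big space scene */
    double hull = 40000.0, turret = 40000.3;   /* two surfaces 0.3 units apart */

    for (int bits = 16; bits <= 32; bits += 8) {
        uint32_t a = quantize_depth(hull,   znear, zfar, bits);
        uint32_t b = quantize_depth(turret, znear, zfar, bits);
        printf("%2d-bit z-buffer: hull=%u turret=%u -> %s\n", bits,
               (unsigned)a, (unsigned)b,
               (a == b) ? "same depth value (z-fighting)" : "kept apart");
    }
    return 0;
}

With the 16-bit buffer the hull and the turret land on the same depth value, so whichever polygon happens to be drawn later wins - that's the turret-poking-through-the-capship look described above; at 24 or 32 bits they stay apart.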

Finally, as far as the original poster goes, unfortunately I don't know what goes wrong there either. It might be that 32-bit only works on Nvidia cards?
 
Originally posted by cff
You actually say it yourself when you claim that you need 1 million colors, while 16-bit only gives 65,536 different ones.
Wow, can't believe I missed that. There are 16.7 million colours with 24 bits.

Originally posted by cff
No, they aren't used for transparencies - at least not on the PC. While there is theoretical stuff like alpha channels, that is not used in this encoding.
Whoops. I just meant that I thought the extra colours would be used for transparency effects, not the extra bits. I was thinking about the SNES - it was capable of 16-bit colour depth, but usually only used 8 bits - 256 colours. The extra colours were needed for translucency effects.
 