What graphics card do you use then?
An nVidia GeForce 9300M GS. It's a bit old, but I'm sure it can support at least 32-bit output ^^
Also, the Mac on which I tested the program (where it printed 0) had a GeForce 9400M or something like that.
I get 0 as well.
Thanks for testing, glad to know I'm not the only one with this problem.
OK, I just put some debug output in GlxContext::createContext, which calls evaluateFormat (a sketch of that kind of debug print follows the snippet below).
Here is the interesting part:
// Get the attributes of the target window
XWindowAttributes windowAttributes;
if (XGetWindowAttributes(m_display, m_window, &windowAttributes) == 0)
{
    err() << "Failed to get the window attributes" << std::endl;
    return;
}

// Setup the visual infos to match
XVisualInfo tpl;
tpl.depth    = windowAttributes.depth;
tpl.visualid = XVisualIDFromVisual(windowAttributes.visual);
tpl.screen   = DefaultScreen(m_display);

// Get all the visuals matching the template
int nbVisuals = 0;
XVisualInfo* visuals = XGetVisualInfo(m_display, VisualDepthMask | VisualIDMask | VisualScreenMask, &tpl, &nbVisuals);
I don't know exactly how this works, but it turns out that windowAttributes.depth (and thus tpl.depth) is 24. XGetVisualInfo then returns only one matching format, which is 24 bits.
Thanks binary1248 for mentioning evaluateFormat, which led me to this function.
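For anyone who wants to see what the X server actually offers, here is a minimal standalone sketch (my own example, not SFML code; it assumes a reachable X display) that lists every visual of the default screen by matching on the screen only, without the depth and visual ID constraints used above:

#include <X11/Xlib.h>
#include <X11/Xutil.h>
#include <cstdio>

int main()
{
    Display* display = XOpenDisplay(NULL);
    if (!display)
        return 1;

    // Match on the screen only, so every available visual is returned
    XVisualInfo tpl;
    tpl.screen = DefaultScreen(display);

    int nbVisuals = 0;
    XVisualInfo* visuals = XGetVisualInfo(display, VisualScreenMask, &tpl, &nbVisuals);

    // Print the ID and color depth of each visual
    for (int i = 0; i < nbVisuals; ++i)
        std::printf("visual 0x%lx: depth %d\n", visuals[i].visualid, visuals[i].depth);

    XFree(visuals);
    XCloseDisplay(display);
    return 0;
}

Build it with something like g++ list_visuals.cpp -lX11.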