


Messages - gravgun

1
Window / Re: OpenGL 1/2 context parameters aren't passed on Linux
« on: September 15, 2013, 02:32:23 pm »
The problem is that the way the best visual gets selected isn't suited to this case. GlContext::evaluateFormat picks the best-fitting context, but it can produce unwanted results; and my Intel chipset can do antialiasing, so that's not the problem. Example:
Good case:
Want: 32bpp, 16-bit depth, 8-bit stencil, 4x antialias
Have: 32bpp, 16-bit depth, 8-bit stencil, 4x antialias
Score: 0 + 0 + 0 + 0 = 0, the best possible; everything is OK


Bad case:
Want:        32bpp, 8-bit depth,  8-bit stencil, 2x antialias
Candidate A: 32bpp, 8-bit depth,  8-bit stencil, 0x antialias
Score:       0 + 0 + 0 + 2 = 2, gets chosen, but I want antialiasing
Candidate B: 32bpp, 16-bit depth, 8-bit stencil, 2x antialias
Score:       0 + 8 + 0 + 0 = 8, rejected even though it has the antialiasing I asked for

Even if these values seem illogical to you, I appear to be in the bad case (I don't have the actual values, though).
This behavior comes from the std::abs calls in GlContext::evaluateFormat, which make the rating worse even when an available setting is actually better than requested. That is both a good and a bad thing: it finds the closest context settings, but better than requested isn't necessarily worse.

As I said, using the OpenGL 3/4 init code does work for 1.x/2.x; and even if it fails, the fallback is still available to get the closest settings possible. So I think it would be better to try creating a context with the exact requested options first and then fall back, regardless of the requested OpenGL version.

2
Hello everyone,
when I try to request antialiasing through an sf::ContextSettings and pass it to sf::RenderWindow, it doesn't work; I just get a default OpenGL 1.x/2.x context.
I don't think this is the expected behavior. How can I enable antialiasing, then?

The culprit is src/SFML/Window/Linux/GlxContext.cpp, where a test is done (introduced by Laurent when he fixed bug #258):
// Create the OpenGL context -- first try context versions >= 3.0 if it is requested (they require special code)
if (m_settings.majorVersion >= 3)
{
The initialization code passes everything to glXCreateContextAttribsARB, but this test restricts all of that to OpenGL 3+.
And what does OpenGL 1.x/2.x get? Nothing. I simply removed this test (and the corresponding && (m_settings.majorVersion >= 3) as well), and the whole thing worked: initialization for OpenGL 1.x/2.x is then done the same way as for 3+...

Could this be fixed, please?
