I recently started trying to port my game to iOS. At first I couldn't even get SFML to run (I made a forum post about that issue here: http://en.sfml-dev.org/forums/index.php?topic=20273.0), but I managed to get it to work. Now I've run into an odd issue: my game opens and runs the first time with the following errors:
An internal OpenGL call failed in Texture.cpp (64) : GL_INVALID_ENUM, an unacceptable value has been specified for an enumerated argument
An internal OpenGL call failed in Texture.cpp (556) : GL_INVALID_ENUM, an unacceptable value has been specified for an enumerated argument
An internal OpenGL call failed in Texture.cpp (556) : GL_INVALID_ENUM, an unacceptable value has been specified for an enumerated argument
These errors are somehow related to the GL context creation, but I can't figure out where or how. Even though I get the errors, the game runs fine.
After I close the game and open it again, it crashes after loading and starts printing these errors after the first two:
An internal OpenGL call failed in RenderTarget.cpp (98) : GL_INVALID_FRAMEBUFFER_OPERATION, the object bound to FRAMEBUFFER_BINDING is not "framebuffer complete"
An internal OpenGL call failed in RenderTarget.cpp (278) : GL_INVALID_FRAMEBUFFER_OPERATION, the object bound to FRAMEBUFFER_BINDING is not "framebuffer complete"
An internal OpenGL call failed in RenderTarget.cpp (278) : GL_INVALID_FRAMEBUFFER_OPERATION, the object bound to FRAMEBUFFER_BINDING is not "framebuffer complete"
After this, the RenderTarget.cpp (278) error repeats. After I terminate the program, my phone becomes unresponsive and I need to restart it; if I don't, it gets very, very hot and I can't run the program again.
I should note that the game works fine on PC and Mac. Any help would be appreciated.
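For reference, nothing game-specific seems to be involved; a minimal SFML 2.x app along these lines (a sketch, not my real code) should be enough to reproduce the errors on the device, since they show up during window/context creation:

#include <SFML/Graphics.hpp>

int main()
{
    // Creating the window creates the GL context, which should be when the
    // Texture.cpp errors start appearing in the console
    sf::RenderWindow window(sf::VideoMode(640, 480), "iOS test");

    while (window.isOpen())
    {
        sf::Event event;
        while (window.pollEvent(event))
        {
            if (event.type == sf::Event::Closed)
                window.close();
        }

        window.clear();
        window.display();
    }

    return 0;
}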
EDIT: I've tracked down the first OpenGL issues; they occur the instant an sf::Context is created and arise in GlContext::initialize() (I've marked the failing lines with NOTE comments):
void GlContext::initialize()
{
    // Activate the context
    setActive(true);

    // Retrieve the context version number
    int majorVersion = 0;
    int minorVersion = 0;

    // Try the new way first
    glGetIntegerv(GL_MAJOR_VERSION, &majorVersion);
    glGetIntegerv(GL_MINOR_VERSION, &minorVersion);
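    // NOTE: on iOS both of these queries leave the values at 0 (see issue 1 below)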
    if (glGetError() != GL_INVALID_ENUM)
    {
        m_settings.majorVersion = static_cast<unsigned int>(majorVersion);
        m_settings.minorVersion = static_cast<unsigned int>(minorVersion);
    }
    else
    {
        // Try the old way
        const GLubyte* version = glGetString(GL_VERSION);
        if (version)
        {
            // The beginning of the returned string is "major.minor" (this is standard)
            m_settings.majorVersion = version[0] - '0';
            m_settings.minorVersion = version[2] - '0';
        }
        else
        {
            // Can't get the version number, assume 1.1
            m_settings.majorVersion = 1;
            m_settings.minorVersion = 1;
        }
    }

    // 3.0 contexts only deprecate features, but do not remove them yet
    // 3.1 contexts remove features if ARB_compatibility is not present
    // 3.2+ contexts remove features only if a core profile is requested
    // If the context was created with wglCreateContext, it is guaranteed to be compatibility.
    // If a 3.0 context was created with wglCreateContextAttribsARB, it is guaranteed to be compatibility.
    // If a 3.1 context was created with wglCreateContextAttribsARB, the compatibility flag
    // is set only if ARB_compatibility is present
    // If a 3.2+ context was created with wglCreateContextAttribsARB, the compatibility flag
    // would have been set correctly already depending on whether ARB_create_context_profile is supported.
    // If the user requests a 3.0 context, it will be a compatibility context regardless of the requested profile.
    // If the user requests a 3.1 context and its creation was successful, the specification
    // states that it will not be a compatibility profile context regardless of the requested
    // profile unless ARB_compatibility is present.
    m_settings.attributeFlags = ContextSettings::Default;

    if (m_settings.majorVersion >= 3)
    {
        // Retrieve the context flags
        int flags = 0;
        glGetIntegerv(GL_CONTEXT_FLAGS, &flags);
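        // NOTE: on iOS this query fails with GL_INVALID_ENUM (see issue 2 below)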
        if (flags & GL_CONTEXT_FLAG_DEBUG_BIT)
            m_settings.attributeFlags |= ContextSettings::Debug;

        if ((m_settings.majorVersion == 3) && (m_settings.minorVersion == 1))
        {
            m_settings.attributeFlags |= ContextSettings::Core;

            glGetStringiFuncType glGetStringiFunc = reinterpret_cast<glGetStringiFuncType>(getFunction("glGetStringi"));

            if (glGetStringiFunc)
            {
                int numExtensions = 0;
                glGetIntegerv(GL_NUM_EXTENSIONS, &numExtensions);

                for (unsigned int i = 0; i < static_cast<unsigned int>(numExtensions); ++i)
                {
                    const char* extensionString = reinterpret_cast<const char*>(glGetStringiFunc(GL_EXTENSIONS, i));

                    if (std::strstr(extensionString, "GL_ARB_compatibility"))
                    {
                        m_settings.attributeFlags &= ~static_cast<Uint32>(ContextSettings::Core);
                        break;
                    }
                }
            }
        }
        else if ((m_settings.majorVersion > 3) || (m_settings.minorVersion >= 2))
        {
            // Retrieve the context profile
            int profile = 0;
            glGetIntegerv(GL_CONTEXT_PROFILE_MASK, &profile);

            if (profile & GL_CONTEXT_CORE_PROFILE_BIT)
                m_settings.attributeFlags |= ContextSettings::Core;
        }
    }

    // Enable antialiasing if needed
    if (m_settings.antialiasingLevel > 0)
        glEnable(GL_MULTISAMPLE);
}
The issues I've tracked down are:
1. glGetIntegerv(GL_MAJOR_VERSION, &majorVersion) and glGetIntegerv(GL_MINOR_VERSION, &minorVersion) leave both values at 0, which means glGetString(GL_VERSION) is called instead. The string received is "OpenGL ES 2.0 Apple A9 GPU - 77.14", so the fallback parse reads version[0] ('O') and version[2] ('e') and sets the major and minor versions to 31 and 53 respectively ('O' - '0' = 31, 'e' - '0' = 53); see the sketch after this list.
2. glGetIntegerv(GL_CONTEXT_FLAGS, &flags) fails with GL_INVALID_ENUM: the bogus major version of 31 sends the code into the majorVersion >= 3 branch, and GL_CONTEXT_FLAGS is not a valid query in OpenGL ES 2.0.
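If it helps, here's a minimal sketch of how the fallback parse could cope with ES version strings by skipping the "OpenGL ES" prefix before reading "major.minor". The helper name parseGlVersion is mine, not SFML's; this just illustrates the idea, it isn't a patch:

#include <cstddef>
#include <cstdio>
#include <cstring>

// Hypothetical helper: skip any "OpenGL ES" style prefix, then read "major.minor"
static void parseGlVersion(const char* version, unsigned int& major, unsigned int& minor)
{
    // ES version strings look like "OpenGL ES 2.0 Apple A9 GPU - 77.14",
    // so the digits don't start at version[0] like they do on desktop GL
    const char* prefixes[] = { "OpenGL ES-CM ", "OpenGL ES-CL ", "OpenGL ES " };
    for (const char* prefix : prefixes)
    {
        const std::size_t length = std::strlen(prefix);
        if (std::strncmp(version, prefix, length) == 0)
        {
            version += length;
            break;
        }
    }

    if (std::sscanf(version, "%u.%u", &major, &minor) != 2)
    {
        // Can't parse the string, fall back to 1.1 like SFML already does
        major = 1;
        minor = 1;
    }
}

With something like this, the A9's string would parse as 2.0 instead of 31.53, which would also keep initialize() out of the majorVersion >= 3 branch and so avoid the GL_CONTEXT_FLAGS query that doesn't exist under ES 2.0.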