SFML community forums
Help => Window => Topic started by: mercurio7891 on March 31, 2011, 09:03:54 pm
-
Hi, on OS X with SFML 2, how do I create a window, close it, and then recreate it?
I did something like this:
////////////////////////////////////////////////////////////
// Headers
////////////////////////////////////////////////////////////
#include <cstdlib>
#include <cstdio>
#include <iostream>
#include <SFML/System.hpp>
#include <SFML/Window.hpp>
#include <SFML/OpenGL.hpp>
////////////////////////////////////////////////////////////
/// Create a window, run its event loop, and destroy the
/// window when the function returns
////////////////////////////////////////////////////////////
void WindowFunction()
{
    // Create the main window
    sf::ContextSettings Settings;
    Settings.DepthBits         = 24; // Request a 24-bit depth buffer
    Settings.StencilBits       = 8;  // Request an 8-bit stencil buffer
    Settings.AntialiasingLevel = 0;  // Request no antialiasing
    Settings.MajorVersion      = 3;
    Settings.MinorVersion      = 3;
    sf::Window App(sf::VideoMode(1024, 768, 32), "SFML OpenGL", sf::Style::Close, Settings);

    // Print the settings actually obtained
    Settings = App.GetSettings();
    std::cout << "DepthBits " << Settings.DepthBits << "\n";
    std::cout << "StencilBits " << Settings.StencilBits << "\n";
    std::cout << "Antialiasing Level " << Settings.AntialiasingLevel << "\n";
    std::cout << "Opengl Version " << Settings.MajorVersion << "." << Settings.MinorVersion << "\n";

    // Set color and depth clear values
    glClearDepth(1.f);
    glClearColor(0.176f, 0.196f, 0.667f, 0.0f);

    // Enable Z-buffer read and write
    glEnable(GL_DEPTH_TEST);
    glDepthMask(GL_TRUE);

    // Setup a perspective projection
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(90.f, 1.f, 1.f, 500.f);

    // Start game loop
    while (App.IsOpened())
    {
        // Process events
        sf::Event Event;
        while (App.GetEvent(Event))
        {
            // Close window : exit
            if (Event.Type == sf::Event::Closed)
                App.Close();

            // Escape key : exit
            if ((Event.Type == sf::Event::KeyPressed) && (Event.Key.Code == sf::Key::Escape))
                App.Close();

            // Resize event : adjust viewport
            if (Event.Type == sf::Event::Resized)
                glViewport(0, 0, Event.Size.Width, Event.Size.Height);
        }

        // Set the active window before using OpenGL commands.
        // It's useless here because the active window is always the same,
        // but don't forget it if you use multiple windows or controls.
        //App.SetActive(true);

        // Clear color and depth buffers
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        // Finally, display the rendered frame on screen
        App.Display();
        //App.SetActive(false);
    }
}

int main()
{
    WindowFunction();
    WindowFunction();
    return EXIT_SUCCESS;
}
If I press Esc to close the first window, the program aborts with a segfault. If I instead click the close ('cross') button at the top-left corner of the first window, it aborts with a "pure virtual function call" error.
regards
-
You shouldn't be using those OpenGL calls directly. The glClear call can be done with sf::RenderWindow::Clear(), and the glViewport call can be handled using sf::View. I'm thinking the error with Escape happens because you're calling glClear after the window is closed. Not sure about the other one.
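For reference, here is a rough, untested sketch of that advice, using the same pre-2.0 capitalized SFML API as the code above (it requires linking against sfml-graphics; the exact method names changed over SFML 2's development):

```cpp
// Rough sketch (untested): the same loop using sf::RenderWindow,
// so Clear() replaces glClearColor/glClear and sf::View replaces
// the manual glViewport call on resize.
#include <SFML/Graphics.hpp>

void WindowFunction()
{
    sf::RenderWindow App(sf::VideoMode(1024, 768, 32), "SFML");

    while (App.IsOpened())
    {
        sf::Event Event;
        while (App.GetEvent(Event))
        {
            if (Event.Type == sf::Event::Closed)
                App.Close();

            // Adjust the view instead of calling glViewport directly
            if (Event.Type == sf::Event::Resized)
                App.SetView(sf::View(sf::FloatRect(0, 0, Event.Size.Width, Event.Size.Height)));
        }

        App.Clear(sf::Color(45, 50, 170)); // replaces glClearColor + glClear
        App.Display();
    }
}
```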
-
I tried removing all the GL calls as suggested and it still causes the same error. I don't really think glClear causes the error; OpenGL works as a state machine, so several consecutive glClear calls might well be collapsed into a single clear by the driver. Not sure though; Hiura might have an answer :)
-
I think it's completely unrelated to any OpenGL stuff. When I run your code (by the way, thank you for the minimal and complete code!) I get a "pure virtual method called" abort.
Here is the backtrace (second call of WindowFunction):
(gdb) backtrace
#0 0x00007fff859c25d6 in __kill ()
#1 0x00007fff85a62cd6 in abort ()
#2 0x00007fff83ba35d2 in __gnu_cxx::__verbose_terminate_handler ()
#3 0x00007fff83ba1ae1 in __cxxabiv1::__terminate ()
#4 0x00007fff83ba1b16 in std::terminate ()
#5 0x00007fff83ba1fd6 in __cxa_pure_virtual ()
#6 0x0000000100023835 in sf::priv::GlContext::SetActive (this=0x100112ad0, active=true) at sfml_git/src/SFML/Window/GlContext.cpp:176
#7 0x000000010002388d in sf::priv::GlContext::SetActive (this=0x10030ebd0, active=false) at sfml_git/src/SFML/Window/GlContext.cpp:199
#8 0x000000010002399a in sf::priv::GlContext::Initialize () at sfml_git/src/SFML/Window/GlContext.cpp:101
#9 0x0000000100024aec in sf::GlResource::GlResource (this=0x7fff5fbff1f0) at sfml_git/src/SFML/Window/GlResource.cpp:57
#10 0x0000000100027fa9 in sf::Window::Window (this=0x7fff5fbff1f0, mode={Width = 1024, Height = 768, BitsPerPixel = 32}, title=@0x7fff5fbff590, style=4, settings=@0x7fff5fbff550) at sfml_git/src/SFML/Window/Window.cpp:66
#11 0x000000010000167e in WindowFunction () at SFML2_tests/openWindow/main.cpp:26
#12 0x00000001000019dd in main () at SFML2_tests/openWindow/main.cpp:89
(Does this code run fine on Windows and Linux?)
According to Meyers, we should "Never call virtual functions during construction or destruction." So I'm thinking it's the design of the GlContext (sub)classes that causes this crash. What do you think, Laurent?
-
It works on Windows; I haven't tested it on Linux as I don't have a Linux machine on hand. It might take a few days for me to set one up.
By the way, on OS X the OpenGL version reported by glGetString(GL_VERSION) keeps differing from the one in the sf::ContextSettings returned by GetSettings().
Here is the code I use to test GL_VERSION:
////////////////////////////////////////////////////////////
// Headers
////////////////////////////////////////////////////////////
#include <cstdlib>
#include <cstdio>
#include <iostream>
#include <SFML/System.hpp>
#include <SFML/Window.hpp>
#include <SFML/OpenGL.hpp>
////////////////////////////////////////////////////////////
/// Create a window, print its context settings, and run the
/// event loop until the window is closed
////////////////////////////////////////////////////////////
void WindowFunction()
{
    // Create the main window
    sf::ContextSettings Settings;
    Settings.DepthBits         = 24; // Request a 24-bit depth buffer
    Settings.StencilBits       = 8;  // Request an 8-bit stencil buffer
    Settings.AntialiasingLevel = 0;  // Request no antialiasing
    Settings.MajorVersion      = 3;
    Settings.MinorVersion      = 3;
    sf::Window App(sf::VideoMode(1024, 768, 32), "SFML OpenGL", sf::Style::Close, Settings);

    // Compare the settings actually obtained with the version the driver reports
    Settings = App.GetSettings();
    std::cout << "DepthBits " << Settings.DepthBits << "\n";
    std::cout << "StencilBits " << Settings.StencilBits << "\n";
    std::cout << "Antialiasing Level " << Settings.AntialiasingLevel << "\n";
    std::cout << "Opengl Version " << Settings.MajorVersion << "." << Settings.MinorVersion << "\n";
    std::cout << "Actual Opengl Version " << glGetString(GL_VERSION) << "\n";

    // Set color and depth clear values
    glClearDepth(1.f);
    glClearColor(0.176f, 0.196f, 0.667f, 0.0f);

    // Enable Z-buffer read and write
    glEnable(GL_DEPTH_TEST);
    glDepthMask(GL_TRUE);

    // Setup a perspective projection
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(90.f, 1.f, 1.f, 500.f);

    // Start game loop
    while (App.IsOpened())
    {
        // Process events
        sf::Event Event;
        while (App.GetEvent(Event))
        {
            // Close window : exit
            if (Event.Type == sf::Event::Closed)
                App.Close();

            // Escape key : exit
            if ((Event.Type == sf::Event::KeyPressed) && (Event.Key.Code == sf::Key::Escape))
                App.Close();

            // Resize event : adjust viewport
            //if (Event.Type == sf::Event::Resized)
            //    glViewport(0, 0, Event.Size.Width, Event.Size.Height);
        }

        // Set the active window before using OpenGL commands.
        // It's useless here because the active window is always the same,
        // but don't forget it if you use multiple windows or controls.
        //App.SetActive(true);

        // Clear color and depth buffers
        //glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

        // Finally, display the rendered frame on screen
        App.Display();
        //App.SetActive(false);
    }
}

int main()
{
    WindowFunction();
    std::cin.get();
    //WindowFunction();
    return EXIT_SUCCESS;
}
and the output I get is something like:
DepthBits 24
StencilBits 8
Antialiasing Level 0
Opengl Version 3.3
Actual Opengl Version 2.1 NVIDIA-1.6.26
2011-04-01 17:31:37.329 sfml2-test[1036:903] *** attempt to pop an unknown autorelease pool (0x10104fc00)
The versions don't seem to match.
regards
-
The versions don't seem to match.
This is a known "todo" ( https://github.com/SFML/SFML/issues#issue/9 ) :wink:
-
Following the call stack we end up in a virtual function call indeed, but it's not called from the constructor or destructor of the object itself.
And it's working on Windows, which is weird (it would crash on every platform if it were an obvious pure virtual function call).
-
Following the call stack we end up in a virtual function call indeed, but it's not called from the constructor or destructor of the object itself.
But frame #9 is a constructor, no? So in frame #6, when SetActive is called, we call a pure virtual method.
And it's working on Windows, which is weird
Yes, that bug puzzles me a lot. The debugger says nothing really helpful... =/
-
Hiura:
This is a known "todo" ( https://github.com/SFML/SFML/issues#issue/9 )
This is nice :). I have decided to pause my porting to OS X until Lion comes out, or at least until OS X properly supports OpenGL 3.2 and the corresponding GLSL version 150. :wink:
By the way, in case it helps: on Windows the code is compiled with MSVC 2010. I'm not sure whether MSVC compiles differently from GCC 4.6, but in my experience of porting code to the Mac, GCC 4.6 has always been a lot stricter than MSVC 2010.
regards
-
But frame #9 is a constructor, no? So in frame #6, when SetActive is called, we call a pure virtual method.
It's the constructor of another object ;)
A pure virtual function call happens only if you call the function from the constructor (or destructor) of the base class of the object that overrides it.
-
Oh, you're right! I didn't see the "priv::" part... And now my only hypothesis is gone... :?
If any of you has an idea...
-
Oh, you're right! I didn't see the "priv::" part
And it's not even the same class name: there's GlContext, and GlResource is a new class ;)
I guess we should wait for the Linux test; if it works there, then it's probably an OS X specific problem :P
-
Now I feel really dumb :lol:
I'll take a nap, I think...
-
Maybe it's not very relevant because I ran the program in VirtualBox on OS X, but I get this error when the second window is created:
X Error of failed request: GLXBadContext
Major opcode of failed request: 155 (GLX)
Minor opcode of failed request: 3 (X_GLXCreateContext)
Serial number of failed request: 47
Current serial number in output stream: 52
Does anyone get something similar?
I tried to get something more precise from DDD but didn't succeed.
-
Linux tester here! It works flawlessly on the latest SVN build of SFML 2.
-
OniLink10, that's a good hint! I've just tested the 'f3d212f' Git revision (before the ATI fix) and it worked fine.
I'll investigate this later, hopefully next week.
-
Tested on the latest Git version and it works flawlessly as well.
-
Hi, I was just wondering: is there a fix or a workaround for this problem?
regards
-
This was fixed weeks ago. ;-)
-
Cool, thanks :)