-
Hi to the SFML community. Thanks, Laurent, for giving us this invaluable library ;D
Having recently decided to learn OpenGL, I have been attempting to use SFML as a windowing system for 3D rendering. So far I have used windowing libraries like GLFW3 and freeglut, which work fine for simple projects but lack SFML's extra functionality.
I have followed the tutorial: http://www.sfml-dev.org/tutorials/2.2/window-opengl.php. After that I reviewed the example OpenGL + SFML project found in the SFML source. As mentioned, I am new to OpenGL, so I was surprised to find calls like:
// Enable position and texture coordinates vertex components
glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);
glVertexPointer(3, GL_FLOAT, 5 * sizeof(GLfloat), cube);
glTexCoordPointer(2, GL_FLOAT, 5 * sizeof(GLfloat), cube + 3);
glMatrixMode(GL_MODELVIEW);
glLoadIdentity();
glTranslatef(x, y, -100.f);
glRotatef(clock.getElapsedTime().asSeconds() * 50.f, 1.f, 0.f, 0.f);
glRotatef(clock.getElapsedTime().asSeconds() * 30.f, 0.f, 1.f, 0.f);
glRotatef(clock.getElapsedTime().asSeconds() * 90.f, 0.f, 0.f, 1.f);
These don't seem to match the function calls used in the book I am reading, The OpenGL Programming Guide, 8th ed., which covers OpenGL 4.3. I assume these functions belong to older versions of OpenGL? Additionally, the sample code doesn't, to my eyes, make use of shaders. From my understanding these are core to how modern OpenGL works?
Anyway here is my code so far:
////////////////////////////////////////////////////////////
// Headers
////////////////////////////////////////////////////////////
#include <iostream>
#include <cstdio>
#include <GL/glew.h>
#include <GL/glu.h>
#include <glm/glm.hpp>
#include <SFML/Graphics.hpp>
#include <SFML/OpenGL.hpp>
#include "LoadShaders.h"
////////////////////////////////////////////////////////////
/// Entry point of application
///
/// \return Application exit code
///
////////////////////////////////////////////////////////////
int main()
{
// Request a 32-bit depth buffer when creating the window
sf::ContextSettings contextSettings;
contextSettings.depthBits = 32;
contextSettings.minorVersion = 3;
contextSettings.majorVersion = 3;
// Create the main window
sf::RenderWindow window(sf::VideoMode(800, 600), "SFML graphics with OpenGL", sf::Style::Default, contextSettings);
window.setVerticalSyncEnabled(true);
window.setActive();
// Initialize GLEW
glewExperimental = true; // Needed for core profile
if (glewInit() != GLEW_OK) {
fprintf(stderr, "Failed to initialize GLEW\n");
return -1;
}
// Configure the viewport (the same size as the window)
glViewport(0, 0, window.getSize().x, window.getSize().y);
// Dark blue background
glClearColor(0.0f, 0.0f, 0.4f, 0.0f);
GLuint VertexArrayID;
glGenVertexArrays(1, &VertexArrayID);
glBindVertexArray(VertexArrayID);
ShaderInfo shader_info[] =
{
{ GL_VERTEX_SHADER, "shaders/SimpleVertexShader.vertexshader" },
{ GL_FRAGMENT_SHADER, "shaders/SimpleFragmentShader.fragmentshader" },
{ GL_NONE, NULL }
};
// Create and compile our GLSL program from the shaders
GLuint programID = LoadShaders(shader_info);
static const GLfloat g_vertex_buffer_data[] =
{
-1.0f, -1.0f, 0.0f,
1.0f, -1.0f, 0.0f,
0.0f, 1.0f, 0.0f,
};
GLuint vertexbuffer;
glGenBuffers(1, &vertexbuffer);
glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(g_vertex_buffer_data), g_vertex_buffer_data, GL_STATIC_DRAW);
// Start game loop
while (window.isOpen())
{
// Clear the color buffer
glClear(GL_COLOR_BUFFER_BIT);
// Use our shader
glUseProgram(programID);
// First attribute buffer: vertices
glEnableVertexAttribArray(0);
glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
glVertexAttribPointer(
0, // attribute 0. No particular reason for 0, but must match the layout in the shader.
3, // size
GL_FLOAT, // type
GL_FALSE, // normalized?
0, // stride
(void*)0 // array buffer offset
);
// Draw the triangle !
glDrawArrays(GL_TRIANGLES, 0, 3); // 3 vertices starting at 0 -> 1 triangle
glDisableVertexAttribArray(0);
// Finally, display the rendered frame on screen
window.display();
// Process events
sf::Event event;
while (window.pollEvent(event))
{
// Close window: exit
if (event.type == sf::Event::Closed)
window.close();
// Escape key: exit
if ((event.type == sf::Event::KeyPressed) && (event.key.code == sf::Keyboard::Escape))
window.close();
// Adjust the viewport when the window is resized
if (event.type == sf::Event::Resized)
glViewport(0, 0, event.size.width, event.size.height);
}
}
glDeleteVertexArrays(1, &VertexArrayID);
glDeleteBuffers(1, &vertexbuffer);
glDeleteProgram(programID);
return EXIT_SUCCESS;
}
This code compiles and runs fine, but does not render the triangle as expected. When I swap the SFML window for a GLFW3 window, keeping essentially the same code apart from a few extra GLFW3 calls, the triangle is rendered. Can anyone see where I am going wrong? If you require more information, please let me know.
Thanks,
Haize
-
Hi, and welcome! :)
I don't suppose many people will be able to guess where you went wrong if they don't know what the symptom of the error is.
does not render the triangle as expected
What was the expected result and what was the actual result? Screenshots may help too.
-
I have followed the tutorial: http://www.sfml-dev.org/tutorials/2.2/window-opengl.php. After that I reviewed the example OpenGL + SFML project found in the SFML source. As mentioned, I am new to OpenGL, so I was surprised to find calls like:
Yes... this comes from SFML's long history. When Laurent first started developing SFML, "modern" OpenGL as we call it today was still in its infancy, probably still being referred to as "Longs Peak". A lot of hardware didn't support the newer API, and to appeal to a wider audience at the time, SFML had to be written on top of the legacy API. Times change, and now we struggle to find hardware that doesn't support the modern API. However, we still want to keep support for the older hardware that SFML has supported since the beginning, for as long as is deemed worthwhile.
SFML might one day support a backend that doesn't have to rely on the legacy API on capable hardware, and work is already under way to clean up a lot of the archaic stuff in the SFML code base.
These don't seem to match the function calls used in the book I am reading, The OpenGL Programming Guide, 8th ed., which covers OpenGL 4.3. I assume these functions belong to older versions of OpenGL? Additionally, the sample code doesn't, to my eyes, make use of shaders. From my understanding these are core to how modern OpenGL works?
Yes... those functions are from the legacy API. If you are developing for newer (or, let's say, non-ancient) hardware and don't have to worry about getting your application to run on systems that are more than 7 years old, then you can safely stick to the modern API as used in your book. The modern API, as you said, is all shader-based; the fixed-function pipeline was completely thrown out along with its horrible matrix stack ;).
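Since your code loads SimpleVertexShader/SimpleFragmentShader files that aren't shown here, this is roughly what a minimal pair for a single position attribute at location 0 looks like (a guess at their shape, not the actual tutorial files):

```glsl
// SimpleVertexShader.vertexshader (hypothetical contents)
#version 330 core
layout(location = 0) in vec3 vertexPosition; // matches glVertexAttribPointer(0, ...)
void main()
{
    gl_Position = vec4(vertexPosition, 1.0);
}

// SimpleFragmentShader.fragmentshader (hypothetical contents)
#version 330 core
out vec3 color;
void main()
{
    color = vec3(1.0, 0.0, 0.0); // solid red
}
```

Without a program like this bound via glUseProgram, a core-profile context draws nothing at all.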
You shouldn't let the SFML example code confuse you. Just create an SFML window, set it active and you are good to go just like you would with those other libraries you know.
If you don't intend to use any sfml-graphics features (sprites, shapes, etc.) then you can ignore that module altogether and use sfml-window by itself. sfml-window provides roughly the same functionality as GLFW (obviously with a different API) and is perfect if you just want to handle all the OpenGL stuff yourself.
To do this, simply replace your sf::RenderWindow with a simple sf::Window. It is a barebones target for OpenGL operations. You can also replace <SFML/Graphics.hpp> with <SFML/Window.hpp> in this case. Including <SFML/OpenGL.hpp> is also unnecessary since you already included the GLEW header before that.
This code compiles and runs fine, but does not render the triangle as expected. Now when I swap the SFML window for a glfw3 window keeping essentially the same code apart from a few extra calls to glfw3 the triangle is rendered. Can anyone see where I am going wrong? If you require more information please let me know.
Yeah... this is one of the things newcomers have to get to know about interaction between sfml-graphics and OpenGL ;). When you create an sf::RenderWindow, it does... certain things... that prepare it for sfml-graphics rendering. This interferes with your OpenGL code and causes your triangle to not be drawn.
Like I said, restrict yourself to the sfml-window module (sf::Window) and you should be good and all the stuff in the book should "just work".
-
You shouldn't let the SFML example code confuse you. Just create an SFML window, set it active and you are good to go just like you would with those other libraries you know.
If you don't intend to use any sfml-graphics features (sprites, shapes, etc.) then you can ignore that module altogether and use sfml-window by itself. sfml-window provides roughly the same functionality as GLFW (obviously with a different API) and is perfect if you just want to handle all the OpenGL stuff yourself.
To do this, simply replace your sf::RenderWindow with a simple sf::Window. It is a barebones target for OpenGL operations. You can also replace <SFML/Graphics.hpp> with <SFML/Window.hpp> in this case. Including <SFML/OpenGL.hpp> is also unnecessary since you already included the GLEW header before that.
This worked. Thanks! As you instructed, I removed the graphics library, replaced the header <SFML/Graphics.hpp> with <SFML/Window.hpp> and sf::RenderWindow with sf::Window, and now I get my triangle.
Like I said, restrict yourself to the sfml-window module (sf::Window) and you should be good and all the stuff in the book should "just work".
So currently, excluding the Graphics module is mandatory when using modern OpenGL. This is a pity, as having the classes sf::Texture, sf::Sprite and sf::Text would be beneficial. Guess I will just need to learn to load textures myself ;)
SFML might one day support a backend that doesn't have to rely on the legacy API on capable hardware, and work is already under way to clean up a lot of the archaic stuff in the SFML code base.
Watch this space I guess :D
Thanks again for the solution,
Haize
-
So currently, excluding the Graphics module is mandatory when using modern OpenGL. This is a pity, as having the classes sf::Texture, sf::Sprite and sf::Text would be beneficial. Guess I will just need to learn to load textures myself ;)
It's not mandatory, it will just make your life easier since you won't have to deal with SFML changing any states etc. ;)
-
And you will see that implementing sf::Texture isn't that hard. sf::Text, however, is tricky ;D (FreeType :-X)
AlexAUT
-
It's not mandatory, it will just make your life easier since you won't have to deal with SFML changing any states etc. ;)
So there is a way to render in an sf::Window with OpenGL 4+ while using the Graphics module? Is it more involved or something?
It's just that if I link the Graphics and System modules and recompile, the triangle no longer renders. I have to add the System module or I get this error: undefined reference to symbol '_ZN2sf6StringC1EPKcRKSt6locale'
So for me it follows that linking the Graphics module is not possible at this time when using the new OpenGL pipeline, unless I am missing something? Additionally, when I link the Window and System modules the triangle again doesn't render. Rendering only occurs when I link the Window module alone, so at the moment I am missing the majority of SFML's functionality. If I can only use the Window module, I may as well just use GLFW. Please enlighten me. :-[
Thanks,
Haize
-
I don't know what I've done, but it now won't compile unless I link the System module, or I get the error
undefined reference to symbol '_ZN2sf6StringC1EPKcRKSt6locale'
Which didn't seem to be the case before when I only linked the Window module and it was rendering. Additionally the triangle is no longer being rendered as the included screenshot shows.
Here is my code:
#include <iostream>
#include <cstdio>
#include <GL/glew.h>
#include <GL/glu.h>
#include <glm/glm.hpp>
#include <SFML/Window.hpp>
#include "LoadShaders.h"
////////////////////////////////////////////////////////////
/// Entry point of application
///
/// \return Application exit code
///
////////////////////////////////////////////////////////////
int main()
{
// Request a 32-bit depth buffer when creating the window
sf::ContextSettings contextSettings;
contextSettings.depthBits = 32;
contextSettings.minorVersion = 3;
contextSettings.majorVersion = 3;
// Create the main window
sf::Window window(sf::VideoMode(800, 600), "SFML graphics with OpenGL", sf::Style::Default, contextSettings);
window.setVerticalSyncEnabled(true);
window.setActive();
// Configure the viewport (the same size as the window)
glViewport(0, 0, window.getSize().x, window.getSize().y);
// Initialize GLEW
glewExperimental = true; // Needed for core profile
if (glewInit() != GLEW_OK) {
fprintf(stderr, "Failed to initialize GLEW\n");
return -1;
}
// Dark blue background
glClearColor(0.0f, 0.0f, 0.4f, 0.0f);
GLuint VertexArrayID;
glGenVertexArrays(1, &VertexArrayID);
glBindVertexArray(VertexArrayID);
ShaderInfo shader_info[] =
{
{ GL_VERTEX_SHADER, "shaders/SimpleVertexShader.vertexshader" },
{ GL_FRAGMENT_SHADER, "shaders/SimpleFragmentShader.fragmentshader" },
{ GL_NONE, NULL }
};
// Create and compile our GLSL program from the shaders
GLuint programID = LoadShaders(shader_info);
static const GLfloat g_vertex_buffer_data[] =
{
-1.0f, -1.0f, 0.0f,
1.0f, -1.0f, 0.0f,
0.0f, 1.0f, 0.0f,
};
GLuint vertexbuffer;
glGenBuffers(1, &vertexbuffer);
glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(g_vertex_buffer_data), g_vertex_buffer_data, GL_STATIC_DRAW);
// Start game loop
while (window.isOpen())
{
// Clear the color buffer
glClear(GL_COLOR_BUFFER_BIT);
// Use our shader
glUseProgram(programID);
// First attribute buffer: vertices
glEnableVertexAttribArray(0);
glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
glVertexAttribPointer(
0, // attribute 0. No particular reason for 0, but must match the layout in the shader.
3, // size
GL_FLOAT, // type
GL_FALSE, // normalized?
0, // stride
(void*)0 // array buffer offset
);
// Draw the triangle !
glDrawArrays(GL_TRIANGLES, 0, 3); // 3 vertices starting at 0 -> 1 triangle
glDisableVertexAttribArray(0);
// Finally, display the rendered frame on screen
window.display();
// Process events
sf::Event event;
while (window.pollEvent(event))
{
// Close window: exit
if (event.type == sf::Event::Closed)
window.close();
// Escape key: exit
if ((event.type == sf::Event::KeyPressed) && (event.key.code == sf::Keyboard::Escape))
window.close();
// Adjust the viewport when the window is resized
if (event.type == sf::Event::Resized)
glViewport(0, 0, event.size.width, event.size.height);
}
}
glDeleteVertexArrays(1, &VertexArrayID);
glDeleteBuffers(1, &vertexbuffer);
glDeleteProgram(programID);
return EXIT_SUCCESS;
}
And my linking in Code::Blocks
Global:
GLEW
GLU
GL
Debug:
sfml-window-d
sfml-system-d
Can anybody see any problems here? I should have saved that project when it was working :'(
-
I don't know what I've done, but it now won't compile unless I link the System module, or I get the error undefined reference to symbol '_ZN2sf6StringC1EPKcRKSt6locale'
You always have to link the system module. The graphics, window, network and audio module all depend on it. ;)
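With GCC-style linkers the order matters too: list dependants before their dependencies. A sketch of a full link line (library names assume the standard Linux release builds; use the -d suffixed ones for debug):

```sh
g++ main.cpp -o app -lsfml-window -lsfml-system -lGLEW -lGLU -lGL
```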
So there is a way to render in an sf::Window with OpenGL 4+ while using the Graphics module? Is it more involved or something?
You'll have to make sure to save all your OpenGL states, then draw with SFML's graphics module, and then restore them. To do this properly you'll have to know enough about OpenGL and SFML's inner workings. Oh, and you have to make sure the view is set properly as well.
I've never worked with OpenGL myself, so I can't give you any detailed description.
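That said, the rough shape of it, going by the sf::RenderTarget documentation, looks like this (a sketch assuming an sf::RenderWindow named window and some sfml-graphics drawable; note these helpers rely on legacy GL calls internally, so they may misbehave in a strict core-profile context):

```cpp
// Inside the render loop: raw OpenGL first, then sfml-graphics drawing.
glUseProgram(programID);
glDrawArrays(GL_TRIANGLES, 0, 3);   // your own rendering

window.pushGLStates();              // save the GL state SFML might clobber
window.draw(someSprite);            // any sfml-graphics drawing
window.popGLStates();               // restore the saved state for your code

window.display();
```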
-
You always have to link the system module. The graphics, window, network and audio module all depend on it. ;)
First of all Doh! :-[ I remember reading that some time ago but wasn't thinking, thanks.
Now, looking at the first screenshot I posted earlier, I realise I was being an idiot: the window's title, "Tutorial 02 - Red triangle", indicates that I had compiled the GLFW3 version and not the SFML version.
So basically when I posted:
This worked. Thanks! As you instructed, I removed the graphics library, replaced the header <SFML/Graphics.hpp> with <SFML/Window.hpp> and sf::RenderWindow with sf::Window, and now I get my triangle.
I was mistaken.
I am still able to compile the program, but the triangle still does not render. Again, when I use GLFW for the windowing calls, the triangle renders fine. I'm at a bit of a loss here. Starting to think it's best to just wait for updates to SFML before I can use it for OpenGL. Which is a shame. :'(
Anyone have any idea from my previous source post what I am doing wrong or is it just not possible at the moment to use the modern OpenGL (shaders etc.) in an sf::Window?
Thanks,
Haize
-
I am still able to compile the program, but the triangle still does not render. Again, when I use GLFW for the windowing calls, the triangle renders fine.
You could post the GLFW code here too so we can compare easier. If there is any difference in behaviour, it should come from the lines that you changed when going from GLFW to SFML.
I'm at a bit of a loss here. Starting to think it's best to just wait for updates to SFML before I can use it for OpenGL. Which is a shame. :'(
This is never a good idea. If people don't report issues so they can get fixed, you can't expect anything to change for the better ;). For all we know, you could still be misunderstanding something we all assume you understand.
Anyone have any idea from my previous source post what I am doing wrong or is it just not possible at the moment to use the modern OpenGL (shaders etc.) in an sf::Window?
If you can post everything that is required for me to reproduce your application (all source files and shaders should be enough) then I can try to see what the problem could be. I use SFML with modern OpenGL all the time, and I haven't had any major issues so far (well... there are issues, but I know how to handle them :P).
-
This is never a good idea. If people don't report issues so they can get fixed, you can't expect anything to change for the better ;). For all we know, you could still be misunderstanding something we all assume you understand.
Wise words ;D.
Here's the GLFW code that renders:
////////////////////////////////////////////////////////////
// Headers
////////////////////////////////////////////////////////////
#include <iostream>
#include <cstdio>
#include <GL/glew.h>
#include <GLFW/glfw3.h>
#include <GL/glu.h>
#include <glm/glm.hpp>
#include "LoadShaders.h"
GLFWwindow* window;
////////////////////////////////////////////////////////////
/// Entry point of application
///
/// \return Application exit code
///
////////////////////////////////////////////////////////////
int main()
{
// Initialise GLFW
if( !glfwInit() )
{
fprintf( stderr, "Failed to initialize GLFW\n" );
return -1;
}
glfwWindowHint(GLFW_SAMPLES, 4);
glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
// Open a window and create its OpenGL context
window = glfwCreateWindow( 1024, 768, "Tutorial 02 - Red triangle", NULL, NULL);
if( window == NULL ){
fprintf( stderr, "Failed to open GLFW window. If you have an Intel GPU, they are not 3.3 compatible. Try the 2.1 version of the tutorials.\n" );
glfwTerminate();
return -1;
}
glfwMakeContextCurrent(window);
// Initialize GLEW
glewExperimental = true; // Needed for core profile
if (glewInit() != GLEW_OK) {
fprintf(stderr, "Failed to initialize GLEW\n");
return -1;
}
// Dark blue background
glClearColor(0.0f, 0.0f, 0.4f, 0.0f);
GLuint VertexArrayID;
glGenVertexArrays(1, &VertexArrayID);
glBindVertexArray(VertexArrayID);
ShaderInfo shader_info[] =
{
{ GL_VERTEX_SHADER, "shaders/SimpleVertexShader.vertexshader" },
{ GL_FRAGMENT_SHADER, "shaders/SimpleFragmentShader.fragmentshader" },
{ GL_NONE, NULL }
};
// Create and compile our GLSL program from the shaders
GLuint programID = LoadShaders(shader_info);
static const GLfloat g_vertex_buffer_data[] =
{
-1.0f, -1.0f, 0.0f,
1.0f, -1.0f, 0.0f,
0.0f, 1.0f, 0.0f,
};
GLuint vertexbuffer;
glGenBuffers(1, &vertexbuffer);
glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
glBufferData(GL_ARRAY_BUFFER, sizeof(g_vertex_buffer_data), g_vertex_buffer_data, GL_STATIC_DRAW);
// Start game loop
do
{
// Clear the color buffer
glClear(GL_COLOR_BUFFER_BIT);
// Use our shader
glUseProgram(programID);
// First attribute buffer: vertices
glEnableVertexAttribArray(0);
glBindBuffer(GL_ARRAY_BUFFER, vertexbuffer);
glVertexAttribPointer(
0, // attribute 0. No particular reason for 0, but must match the layout in the shader.
3, // size
GL_FLOAT, // type
GL_FALSE, // normalized?
0, // stride
(void*)0 // array buffer offset
);
// Draw the triangle !
glDrawArrays(GL_TRIANGLES, 0, 3); // 3 vertices starting at 0 -> 1 triangle
glDisableVertexAttribArray(0);
glfwSwapBuffers(window);
glfwPollEvents();
} while(glfwGetKey(window, GLFW_KEY_ESCAPE ) != GLFW_PRESS &&
glfwWindowShouldClose(window) == 0);
glDeleteVertexArrays(1, &VertexArrayID);
glDeleteBuffers(1, &vertexbuffer);
glDeleteProgram(programID);
glfwTerminate();
return EXIT_SUCCESS;
}
Code::Blocks linking:
Global:
GLEW
glfw
GL
X11
Xxf86vm
Xi
pthread
Xrandr
Debug:
sfml-window-d
sfml-system-d
If you can post everything that is required for me to reproduce your application (all source files and shaders should be enough) then I can try to see what the problem could be. I use SFML with modern OpenGL all the time, and I haven't had any major issues so far (well... there are issues, but I know how to handle them :P).
I have attached two Code::Blocks projects: "OpenGL" uses GLFW and "OpenGL-SFML" uses SFML (obviously). Inside you will find the directories "src" and "shaders". I hope these help. :)
Thanks,
Haize
-
Maybe it's something to do with this code:
glfwWindowHint(GLFW_SAMPLES, 4);
glfwWindowHint(GLFW_CONTEXT_VERSION_MAJOR, 3);
glfwWindowHint(GLFW_CONTEXT_VERSION_MINOR, 3);
glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
At the moment I am initialising my SFML version with:
sf::ContextSettings contextSettings;
contextSettings.depthBits = 32;
contextSettings.stencilBits = 32;
contextSettings.antialiasingLevel = 1;
contextSettings.minorVersion = 3;
contextSettings.majorVersion = 3;
// Create the main window
sf::Window window(sf::VideoMode(800, 600)
, "SFML graphics with OpenGL"
, sf::Style::Default, contextSettings);
glfwWindowHint(GLFW_SAMPLES, 4);
glfwWindowHint(GLFW_OPENGL_PROFILE, GLFW_OPENGL_CORE_PROFILE);
Not sure what the SFML equivalents of those two hints are. :-\
-
What graphics hardware and driver are you using? Mesa3D has this weird thing about not wanting to support compatibility contexts, and currently (SFML 2.2) you can't force SFML to create a core context like you can in GLFW. This will change soon hopefully :P.
EDIT: On Windows, your code runs fine. So I am assuming it is a core context issue specific to Linux until gl_dev_new gets merged.
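One quick sanity check worth doing on any platform: the settings you request are only a hint, so print what you actually got right after creating the window. A sketch (needs <iostream>; getSettings() is a real sf::Window method):

```cpp
// sf::Window::getSettings() reports the settings actually obtained,
// which can differ from those requested.
sf::ContextSettings actual = window.getSettings();
std::cout << "Context version: " << actual.majorVersion << "."
          << actual.minorVersion << std::endl;
std::cout << "GL_VERSION: " << glGetString(GL_VERSION) << std::endl;
```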
-
What graphics hardware and driver are you using?
I have a combination of an integrated Intel chip and an Nvidia Optimus GT 620M.
This, I discovered, was my problem. Running the command optirun before starting Code::Blocks seems to be the solution, and the triangle is now being rendered. This command comes from the Bumblebee package (more at the bottom). I now get some error messages associated with the X server after the window closes, but that's a matter for another forum, as I assume it's an issue with my graphics drivers.
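For anyone else: the idea is to launch whatever should use the discrete GPU through optirun (the binary names below are just examples):

```sh
optirun codeblocks      # run the IDE (and anything it launches) on the Nvidia GPU
optirun ./OpenGL-SFML   # or run the compiled program directly
```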
you can't force SFML to create a core context like you can in GLFW. This will change soon hopefully :P.
That might explain why GLFW works without prime/primus whereas SFML requires it? (using OpenGL 3+)
EDIT: On Windows, your code runs fine. So I am assuming it is a core context issue specific to Linux until gl_dev_new gets merged.
I guess that's the price I will have to pay for developing on Linux. :(
Mesa3D has this weird thing about not wanting to support compatibility contexts..
I'll have a look into this. Might learn something ;D
Thanks for all the help,
Haize
EDIT: For other poor souls like me with an Nvidia Optimus card: please look into using the Bumblebee graphics package, as in my experience it is by far the most stable option. You do have the inconvenience of needing to set which programs use your Nvidia card and which do not, but this is minor compared to your desktop freezing every five minutes (in my experience, anyway).
Bumblebee website:
http://bumblebee-project.org/
Useful tutorial:
http://askubuntu.com/questions/452556/how-to-set-up-nvidia-optimus-bumblebee-in-14-04
-
Linux OpenGL development only gets annoying when you are using the "free" drivers. If you are using the vendor drivers, I find it can sometimes be easier to develop on Linux than on Windows. In this case you aren't using Mesa (the free driver), so compatibility contexts should work, and so should SFML 2.2, but only if you run it on the discrete GPU, as you found out. When using SFML it is always recommended to run on the discrete GPU; there have been many issues with IGPs in the past, but they are slowly being fixed.
-
Thought I should add that after my success with Bumblebee I attempted to use the Graphics module again.
Guess what it worked ;D.
So now I have all the functionality of SFML, which is amazing! 8)
I also removed GLU from the linking. The SFML OpenGL tutorial mentions you may need to link it, but everything seems to work without it.
-
I also removed GLU from the linking. The SFML OpenGL tutorial mentions you may need to link it, but everything seems to work without it.
Good... because it (GLU support) will probably get removed sometime soon ;).
-
EDIT: For other poor souls like me with an Nvidia Optimus card: please look into using the Bumblebee graphics package, as in my experience it is by far the most stable option. You do have the inconvenience of needing to set which programs use your Nvidia card and which do not, but this is minor compared to your desktop freezing every five minutes (in my experience, anyway).
Seems I was wrong again. On closer inspection I found that the Bumblebee drivers were causing errors with the X server. I decided to give the Nvidia drivers another go and found this solved the errors, with the scene rendered as expected and no need to manually launch programs through optirun.
I followed these steps:
*NOTE* Make sure you have the nouveau driver installed so you can use it after purging Nvidia drivers.
1. First select the nouveau driver from Software Settings -> Software and Updates -> Additional Drivers.
Then purge all Nvidia and Bumblebee drivers/packages:
sudo apt-get purge bumblebee*
sudo apt-get purge nvidia-*
2. Reboot
3. After rebooting, reinstall the nvidia-331, nvidia-prime and nvidia-settings packages.
On Ubuntu 14.04 you can do this easily using the top-right menu. Navigate to "Software Settings" -> "Software and Updates" -> "Additional Drivers" and select Nvidia 331-113. This will install the listed packages. Other distros can just use their repositories, I guess.
4. Reboot again.
5. After reboot you should be able to run the command:
nvidia-settings
This will bring up the settings for your Nvidia GPU. You may need to change the performance settings under "PRIME Profiles", which is where you select the GPU. You may have to log out and back in for the change to take effect.
6. From here everything should work fine. Well, it has for me so far... :-\
I found this guide useful:
https://afterhourscoding.wordpress.com/2014/04/30/dell-l502x-optimus-support-on-ubuntu-14-04/
Haize