
Author Topic: Applying a shader seems to cause a massive CPU jump  (Read 1653 times)


Sam42

Applying a shader seems to cause a massive CPU jump
« on: August 26, 2011, 12:01:32 am »
I'm afraid my attempts to reduce this to a complete and minimal example have produced nothing useful, as the CPU jump doesn't show up there. That is, I suppose, probably a sign of where the problem lies, but for now I remain ignorant despite my efforts :(

Essentially, I've been following the dynamic lighting tutorial found here: http://www.sfml-dev.org/wiki/fr/sources/lightmanager

I have a working implementation, and while the light is stationary the program uses maybe 1/2% CPU. If I apply the blur.sfx shader from the SFML examples folder, I get the desired effect, but my CPU usage jumps to almost an entire core.

The lights are drawn to a window-sized RenderTexture, and the draw function for the resulting RenderTexture looks like this:

Code:
void LightManager::Draw(sf::RenderWindow &window)
{
    // Re-bind the light map's texture and blend mode only when the
    // lights have moved (rTex is only re-rendered in that case).
    if (moved)
    {
        rSprite.SetTexture(rTex.GetTexture());
        rSprite.SetBlendMode(sf::Blend::Multiply);
    }

    // Draw the accumulated light map through the blur shader.
    window.Draw(rSprite, BlurEffect);
}


Here rTex is the RenderTexture and rSprite is the corresponding sprite used to draw it. The shader is loaded and configured once in an Init() function, and Draw is the only call it is involved in thereafter. The program is also frame-limited to 120 FPS.

Hardware-wise, I'm using an Intel T6500 CPU with an NVIDIA G105M, running 64-bit Ubuntu with an up-to-date proprietary NVIDIA driver. After testing, the problem is also apparent on the same computer under 64-bit Windows 7.

As I've not provided a minimal example I don't expect any concrete solutions, but I'm hoping for a few pointers as to why this might be happening. Any suggestions on where to look would be greatly appreciated.

Edit: I just tested the program on a friend's laptop with an ATI HD 4300 series card, and the CPU usage was exactly what I'd expect: the shader made no noticeable difference whether applied or not. I'm left fairly puzzled, as I'm fairly certain my computer's NVIDIA G105M has a hardware implementation of pixel shaders.