
Author Topic: Bug with textures from tileset


Doodlemeat

Bug with textures from tileset
« on: January 31, 2014, 08:44:18 pm »
Hello.

I have built a parser for Tiled to use in my SFML 2.1 Game.
All of the tiles used come from a tileset where every tile is 64x64 pixels.

I use sf::IntRect to get the correct tile position from the tileset.
This worked very well until I stumbled upon a problem where my texture is shifted down by 1 pixel, like this:

The blue one is the player, with its origin in the middle, and it is also in the middle of a viewport. Those orange lines show up now and then, and I don't really know what the problem is. The thing is, the orange tile lies directly below the blue tile on my spritesheet, so I guess it has something to do with that.

Has anyone had a similar problem? I really want this to work, since we have put, and will keep putting, a lot of effort into making this a cool game.

My code is here: http://pastebin.com/UXSGx52m
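
For reference only (this is not the pastebin code above): a minimal sketch of how a tile's source rectangle is typically computed from a 64x64 tileset with sf::IntRect, assuming tiles are numbered left to right, top to bottom.

#include <SFML/Graphics.hpp>

// Return the source rectangle of a tile, given its index in a tileset image
// that is tilesPerRow tiles wide and uses 64x64 tiles.
sf::IntRect tileSourceRect(int tileIndex, int tilesPerRow, int tileSize = 64)
{
    const int column = tileIndex % tilesPerRow;
    const int row    = tileIndex / tilesPerRow;
    return sf::IntRect(column * tileSize, row * tileSize, tileSize, tileSize);
}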

fallahn

Re: Bug with textures from tileset
« Reply #1 on: January 31, 2014, 09:20:19 pm »
This is a well-known artifact caused by the way OpenGL rounds coordinates to the nearest pixel. Check out my Tiled map loader to see how I handle it.

AncientGrief

  • Newbie
  • *
  • Posts: 32
    • Yahoo Instant Messenger - what?
    • View Profile
Re: Bug with textures from tileset
« Reply #2 on: January 31, 2014, 09:25:20 pm »
Quote from: fallahn on January 31, 2014, 09:20:19 pm
This is a well-known artifact caused by the way OpenGL rounds coordinates to the nearest pixel. Check out my Tiled map loader to see how I handle it.

Hi fallahn,

Can you explain it a little more? What conditions have to be met to trigger this bug? Or can you provide a link to an explanation?
(Does this happen when going beyond a certain coordinate? I understand the solution, but not the exact cause of the problem.)

fallahn

Re: Bug with textures from tileset
« Reply #3 on: January 31, 2014, 09:43:10 pm »
There are a few topics on the forum about this (which is where I picked up the fix myself) if you search for the half-pixel trick. Basically, when you use floating point positioning for a view, or a view which scales the world coordinates to anything other than 1 pixel per unit, OpenGL has to round the data to the nearest texel to rasterise it. This means that a coordinate of, say, 10.8 sometimes gets rounded to 11 rather than 10. On something like a texture sub-rect, either in a sprite or a vertex array, the coordinates then become 1 - 65 rather than 0 - 64, which leads to one pixel of the neighbouring tile being rendered when it shouldn't be. Adding or subtracting half a pixel to the value helps OpenGL round to the correct value 99.9% of the time (although there will still be edge cases).
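
For illustration, a minimal sketch of that half-pixel adjustment applied to the texture coordinates of a single vertex-array quad (the values and names are made up for the example, not taken from the map loader mentioned above):

#include <SFML/Graphics.hpp>

// Build one 64x64 tile quad whose texture coordinates are pulled in by half
// a texel, so rounding in the rasteriser cannot sample the neighbouring tile.
sf::VertexArray buildTileQuad()
{
    const float tileSize = 64.f;
    const float half = 0.5f;         // the "half-pixel trick"
    const float tx = 3.f, ty = 2.f;  // tile's grid position in the world (example values)
    const float u = 1.f, v = 0.f;    // tile's grid position in the tileset (example values)

    sf::VertexArray quad(sf::Quads, 4);
    quad[0].position = sf::Vector2f(tx * tileSize,       ty * tileSize);
    quad[1].position = sf::Vector2f((tx + 1) * tileSize, ty * tileSize);
    quad[2].position = sf::Vector2f((tx + 1) * tileSize, (ty + 1) * tileSize);
    quad[3].position = sf::Vector2f(tx * tileSize,       (ty + 1) * tileSize);

    quad[0].texCoords = sf::Vector2f(u * tileSize + half,       v * tileSize + half);
    quad[1].texCoords = sf::Vector2f((u + 1) * tileSize - half, v * tileSize + half);
    quad[2].texCoords = sf::Vector2f((u + 1) * tileSize - half, (v + 1) * tileSize - half);
    quad[3].texCoords = sf::Vector2f(u * tileSize + half,       (v + 1) * tileSize - half);
    return quad;
}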

AncientGrief

Re: Bug with textures from tileset
« Reply #4 on: January 31, 2014, 09:50:01 pm »
Quote from: fallahn on January 31, 2014, 09:43:10 pm
There are a few topics on the forum about this (which is where I picked up the fix myself) if you search for the half-pixel trick. Basically, when you use floating point positioning for a view, or a view which scales the world coordinates to anything other than 1 pixel per unit, OpenGL has to round the data to the nearest texel to rasterise it. This means that a coordinate of, say, 10.8 sometimes gets rounded to 11 rather than 10. On something like a texture sub-rect, either in a sprite or a vertex array, the coordinates then become 1 - 65 rather than 0 - 64, which leads to one pixel of the neighbouring tile being rendered when it shouldn't be. Adding or subtracting half a pixel to the value helps OpenGL round to the correct value 99.9% of the time (although there will still be edge cases).

Thanks for the info :) But I wonder why anyone would use coordinates with decimal places for tile positioning (based on vertex arrays)?! Tiles (images) have integer dimensions oO (especially in Doodlemeat's case).

fallahn

Re: Bug with textures from tileset
« Reply #5 on: January 31, 2014, 10:20:01 pm »
A good question; indeed, "traditional" 2D engines work on whole pixels (André LaMothe has written some interesting books on the subject), as that was all the technology allowed for. SFML, much like all other modern libraries, is based on an API designed for rendering 3D graphics, which takes advantage of modern hardware. The 2D effect is almost fake, if you like: it is in fact a 3D quad mapped so that it appears flat on the screen, with all the images as textures mapped onto it (I wrote a post which touches on this a while back). Because of this the underlying maths is floating point. While this has many advantages, this topic is a good example of one of the (few) drawbacks of using 3D for a 2D API.
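
As a tiny illustration of that point (the names here are arbitrary): even a plain sf::Sprite carries float positions and a float transform under the hood.

#include <SFML/Graphics.hpp>

// Everything in SFML's 2D API is float-based: a sprite's position, its scale
// and its transform matrix, just as a 3D pipeline expects.
void inspectSprite(const sf::Texture& texture)
{
    sf::Sprite player(texture);
    player.setPosition(10.8f, 42.f);           // fractional positions are perfectly legal
    sf::Vector2f pos = player.getPosition();   // sf::Vector2f holds two floats
    sf::Transform mat = player.getTransform(); // a float matrix, ready for the GPU
    (void)pos; (void)mat;                      // unused in this sketch
}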

AncientGrief

Re: Bug with textures from tileset
« Reply #6 on: January 31, 2014, 10:41:06 pm »
Quote from: fallahn on January 31, 2014, 10:20:01 pm
The 2D effect is almost fake, if you like: it is in fact a 3D quad mapped so that it appears flat on the screen, with all the images as textures mapped onto it.

Yeah, I read this somewhere :) But OpenGL shouldn't run into trouble calculating the tile positions if the float coordinates only hold integer values. There shouldn't be rounding issues?! Or am I wrong?! I don't know exactly how the GPU handles this.

And Doodlemeat: are you using coordinates containing decimal places?!

Thanks for the article link, I'll read it tomorrow when I am sober ;D

Doodlemeat

Re: Bug with textures from tileset
« Reply #7 on: January 31, 2014, 10:53:26 pm »
I'm using float coordinates for the tiles. They actually inherit from sf::Sprite, so they do that by default. I don't really know how I could change it.

fallahn I will take a look at your links. And I will post an update on the progress!

AncientGrief

Re: Bug with textures from tileset
« Reply #8 on: January 31, 2014, 10:57:22 pm »
Quote from: Doodlemeat on January 31, 2014, 10:53:26 pm
I'm using float coordinates for the tiles. They actually inherit from sf::Sprite, so they do that by default. I don't really know how I could change it.

Yeah, float is normal, but I meant: are you using coordinates like

float x = 1.1f;
float y = 2.2f;

Maybe some calculation error gives you decimal places?

fallahn

Re: Bug with textures from tileset
« Reply #9 on: January 31, 2014, 11:07:03 pm »
Quote from: AncientGrief on January 31, 2014, 10:41:06 pm
But OpenGL shouldn't run into trouble calculating the tile positions if the float coordinates only hold integer values.

Which is very true. Often rounding the position with floor() or ceil() will fix the artifacting, but calling these every frame has some (probably very small) impact on performance, which seems needless when OpenGL is going to do its own rounding regardless. It is ultimately the view which causes the problem, because when you move it you're unlikely to use an integral amount: moveDistance * deltaTime rarely comes out whole, because deltaTime is unlikely to be a whole number. The view is also often used to scale the image to the current screen resolution; for instance, an 800x600 scene scaled to 1440x1080 has a scale factor of 1.8 - not a whole number. So while you may use 32x32 or 64x64 tiles, once they are offset or scaled by a non-integer amount their coordinates become non-integer too, and subject to rounding errors.
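
For illustration, snapping an sf::View's centre to whole pixels with floor() might look like this (the function and variable names are made up for the example):

#include <SFML/Graphics.hpp>
#include <cmath>

// Snap the camera to whole pixels before drawing. This only helps when the
// view is not also scaling the scene, since a non-integer zoom re-introduces
// fractional coordinates anyway.
void applyPixelSnappedView(sf::RenderWindow& window, sf::Vector2f cameraCentre)
{
    sf::View view = window.getView();
    view.setCenter(std::floor(cameraCentre.x), std::floor(cameraCentre.y));
    window.setView(view);
}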

Doodlemeat

Re: Bug with textures from tileset
« Reply #10 on: January 31, 2014, 11:07:38 pm »
I don't know. I never change the position of my tiles; the viewport handles that. I just update the viewport. Should I use static_cast<int> or round the tiles' coordinates?

krzat

Re: Bug with textures from tileset
« Reply #11 on: February 01, 2014, 11:58:27 am »
Quote from: Doodlemeat on January 31, 2014, 11:07:38 pm
I don't know. I never change the position of my tiles; the viewport handles that. I just update the viewport. Should I use static_cast<int> or round the tiles' coordinates?
That won't work. The simplest solution would be to shrink your source rectangles, as fallahn suggested. For sprites you may write setTextureRect(sf::IntRect(1, 1, 62, 62)) instead of (0, 0, 64, 64).
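
For illustration, applied to a sprite that might look like the sketch below; the setScale line is just one optional way to keep the on-screen size at 64x64, not part of the suggestion above.

#include <SFML/Graphics.hpp>

// Shrink the source rectangle by one pixel on each side so the sampler can
// never read the neighbouring tile; the tile's outer pixel ring is lost.
void setupTile(sf::Sprite& tile, const sf::Texture& tileset)
{
    tile.setTexture(tileset);
    tile.setTextureRect(sf::IntRect(1, 1, 62, 62));
    tile.setScale(64.f / 62.f, 64.f / 62.f); // optional: stretch back to 64x64 on screen
}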

 
