
Author Topic: [SOLVED] Shader does not draw Tilemap Correctly (Read 4112 times)


Chain

  • Newbie
  • Posts: 5
[SOLVED] Shader does not draw Tilemap Correctly
« on: January 27, 2015, 10:00:31 pm »
I'm using a shader to draw my tilemap onto a texture the size of the camera (window), and I'm using a texture the size of the tilemap (in tiles) to tell the shader which tile to draw.
Each pixel in the tilemap data texture represents one tile, and I'm using the RGBA channels as layers (r = floor, g = walls) to store the TileID.
But somehow it does not draw the correct tiles. Can you please take a look at my shader? I've been trying to fix this for days but nothing works.

I am storing the TileIDs of the different tile types inside the color. For example: r = 0, g = 1. This corresponds to the 0th tile of my spritesheet as the ground layer, with the 1st tile of my spritesheet on top of it. My spritesheet starts with index 0.

Layer Info:
3: topmost layer -> A
2: -> B
1: -> G
0: bottommost layer -> R

I only use R and G at the moment; my draw order is from 0 to 3, where 0 is the bottommost layer.

The TileMapData, one pixel per tilemap position (ONLY R & G are used, the rest is ignored); color values 0-255 represent the TileTypeID:
RGBA(0,1,0,255);RGBA(2,3,0,255);RGBA(1,2,0,255)
RGBA(0,1,0,255);RGBA(2,3,0,255);RGBA(1,2,0,255)
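The packing scheme described above can be checked on the CPU side; here is a minimal sketch in plain C++ (the `TileTexel` struct and helper names are hypothetical, not from the actual code):

```cpp
#include <array>
#include <cstdint>

// One texel of the data texture: each channel holds one layer's tile ID (0-255).
struct TileTexel {
    std::uint8_t r, g, b, a; // r = layer 0 (floor), g = layer 1 (walls)
};

// Pack two layer IDs into a texel, as the scheme above describes.
TileTexel packTile(std::uint8_t floorId, std::uint8_t wallId) {
    return TileTexel{floorId, wallId, 0, 255};
}

// Unpack a layer's ID again (layer 0 = R, 1 = G, 2 = B, 3 = A).
std::uint8_t layerId(const TileTexel& t, int layer) {
    const std::array<std::uint8_t, 4> channels{t.r, t.g, t.b, t.a};
    return channels[layer];
}
```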

When I give the shader function "vec4 GetDrawColorFromSpritesheet(int tileType)" a TileTypeID manually, it grabs the correct tile from the tilesheet.
The error must be inside the shader function "int GetTileTypeFromMapData(int layerID)" or one of the helper functions that compute the coordinates, but I cannot figure out where it is.
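For reference, the two coordinate steps those helpers have to perform can be checked in isolation on the CPU. This is a hedged sketch with made-up function names, not the poster's code; note that sampling at the texel centre (the `+ 0.5`) avoids landing exactly on a texel boundary, a common cause of off-by-one tile lookups:

```cpp
#include <cmath>

const float TILE_SIZE = 32.f;

// World pixel -> integer tile coordinate (which tile the pixel lies in).
int pixelToTile(float worldPixel) {
    return static_cast<int>(std::floor(worldPixel / TILE_SIZE));
}

// Tile coordinate -> normalized texture coordinate that samples the
// CENTRE of the matching texel in the data texture
// (mapDimension = map size in tiles along that axis).
float tileToTexCoord(int tile, float mapDimension) {
    return (static_cast<float>(tile) + 0.5f) / mapDimension;
}
```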

I've provided images and code to help you visualize it. I'm very sorry for the wall of code :(.

Thank you for taking a look, it drives me mad.

My Spritesheet

How it looks when executed, with all the tileIDs in the console:

It should look like this:

My Shader:

My Main.cpp as reference:

It's still broken; I'll just draw the tiles separately. Thanks anyway.
« Last Edit: January 28, 2015, 11:56:30 am by Chain »

Gambit

  • Sr. Member
  • Posts: 283
Re: Shader does not draw Tilemap Correctly
« Reply #1 on: January 27, 2015, 11:09:31 pm »
What does "draws only nonsense" mean? What are you expecting to happen with your code? Also, in case you didn't notice, you have different-sized sprites in your sheet; you might want to scale them to the same size.

Chain

  • Newbie
  • Posts: 5
Re: Shader does not draw Tilemap Correctly
« Reply #2 on: January 27, 2015, 11:40:26 pm »
What does "draws only nonsense" mean? What are you expecting to happen with your code? Also, in case you didn't notice, you have different-sized sprites in your sheet; you might want to scale them to the same size.

"Draws only nonsense" was badly phrased; I meant it does not draw the correct tiles. Also, my spritesheet is 4 tiles of 32 px in X (128 px wide) and 32 px in Y. Each tile is 32x32. The first tile in the sheet is just a 32x32 transparent one.
I expected it to draw the 3x2-tile tilemap with the correct tiles.
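Given that layout (a single row of 32x32 tiles), the source rectangle for tile index i is straightforward; a small sketch for reference (the `Rect` struct and `tileRect` helper are illustrative, not from the poster's code):

```cpp
// Texture rect for tile index i in a one-row sheet of 32x32 tiles
// (index 0 = the transparent "air" tile described above).
struct Rect { int left, top, width, height; };

Rect tileRect(int index) {
    const int TILE = 32;
    return Rect{index * TILE, 0, TILE, TILE};
}
```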

It should look like this:


But it looks like this:
« Last Edit: January 27, 2015, 11:50:30 pm by Chain »

Hapax

  • Hero Member
  • Posts: 3379
  • My number of posts is shown in hexadecimal.
Re: Shader does not draw Tilemap Correctly
« Reply #3 on: January 27, 2015, 11:48:38 pm »
I expected it to draw the 3x2-tile tilemap with the correct tiles.
You have to understand that "correct tiles" means very little without fully studying your code.
At first glance at the screenshot, it looks like some colour represents some tile, but as far as we know, it's showing the right tile for the colours shown.
Selba Ward -SFML drawables
Cheese Map -Drawable Layered Tile Map
Kairos -Timing Library
Grambol
 *Hapaxia Links*

Chain

  • Newbie
  • Posts: 5
Re: Shader does not draw Tilemap Correctly
« Reply #4 on: January 27, 2015, 11:59:39 pm »
I expected it to draw the 3x2-tile tilemap with the correct tiles.
You have to understand that "correct tiles" means very little without fully studying your code.
At first glance at the screenshot, it looks like some colour represents some tile, but as far as we know, it's showing the right tile for the colours shown.

My bad, I'll try to be more precise.

I am storing the TileIDs of the different tile types inside the color. For example: r = 0, g = 1. This corresponds to the 0th tile of my spritesheet as the ground layer, with the 1st tile of my spritesheet on top of it. My spritesheet starts with index 0.

Layer Info:
3: topmost layer -> A
2: -> B
1: -> G
0: bottommost layer -> R

I only use R and G at the moment; my draw order is from 0 to 3, where 0 is the bottommost layer.

The TileMapData, one pixel per tilemap position (ONLY R & G are used, the rest is ignored); color values 0-255 represent the TileTypeID:
RGBA(0,1,0,255);RGBA(2,3,0,255);RGBA(1,2,0,255)
RGBA(0,1,0,255);RGBA(2,3,0,255);RGBA(1,2,0,255)

I hope this will be more helpful.
SFML really needs to allow passing an int[] to the shader; then this kind of nonsense could be avoided :/.
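Until then, integer data can only reach the shader through a texture. One way to build the raw RGBA8 buffer for the 3x2 example above is sketched below; `buildMapData` is a hypothetical helper, and the upload mentioned in the comment assumes SFML 2.x's `sf::Texture::update`, which takes a tightly packed RGBA8 buffer:

```cpp
#include <cstdint>
#include <vector>

// Build the raw RGBA byte buffer for a width x height data texture.
// The result could then be uploaded with sf::Texture::update(pixels.data()).
std::vector<std::uint8_t> buildMapData(int width, int height,
                                       const std::vector<std::uint8_t>& floorIds,
                                       const std::vector<std::uint8_t>& wallIds) {
    std::vector<std::uint8_t> pixels(width * height * 4, 0);
    for (int i = 0; i < width * height; ++i) {
        pixels[i * 4 + 0] = floorIds[i]; // R = layer 0 (floor)
        pixels[i * 4 + 1] = wallIds[i];  // G = layer 1 (walls)
        pixels[i * 4 + 3] = 255;         // A = opaque
    }
    return pixels;
}
```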
« Last Edit: January 28, 2015, 12:02:22 am by Chain »

Gambit

  • Sr. Member
  • Posts: 283
Re: Shader does not draw Tilemap Correctly
« Reply #5 on: January 28, 2015, 12:13:18 am »
I still don't get why you are using RGBA for your positioning (no offence, but it sounds silly), especially since SFML has sf::Vector3 (with specializations). Using it would allow you to use x and y as your x and y, and z as your layer number, where 0 is the bottom and n is the top.

Furthermore, is there any reason why you have to use a shader? Keeping it in C++ would probably be a good idea; not to mention you are kind of misusing the shader.

As for your problem, it's like Hapax said: without giving your code a thorough look over, it's hard to pinpoint exactly what the problem is, and given your RGBA system, it's still hard (for me at least) to understand how you are using it, so I can't give you any code solutions.

Chain

  • Newbie
  • Posts: 5
Re: Shader does not draw Tilemap Correctly
« Reply #6 on: January 28, 2015, 12:34:08 am »
I still don't get why you are using RGBA for your positioning (no offence, but it sounds silly), especially since SFML has sf::Vector3 (with specializations). Using it would allow you to use x and y as your x and y, and z as your layer number, where 0 is the bottom and n is the top.

Furthermore, is there any reason why you have to use a shader? Keeping it in C++ would probably be a good idea; not to mention you are kind of misusing the shader.

As for your problem, it's like Hapax said: without giving your code a thorough look over, it's hard to pinpoint exactly what the problem is, and given your RGBA system, it's still hard (for me at least) to understand how you are using it, so I can't give you any code solutions.

I'm doing the RGBA thing because I cannot give the shader my TileMapData directly. I use one quad the size of the camera, which is currently the window, to draw my map on. I could use single tiles and draw them; I thought I could save some processing time by doing it this way, and be able to color the tiles as well as use alpha blending. I thought it would be easier with a shader; sadly I have no experience with shaders or OpenGL, as you can probably see by the mess of a shader.
The single-tile version is my fallback if nothing works.

As for the problem, it is hard for me to explain. I am basically creating a texture which has the TileData as color information. Every pixel in the TileDataTexture corresponds to one tile in the tilemap at the same position. For example: if the tilemap is 3 tiles wide and 2 tiles high, then the TileDataTexture is 3 pixels wide and 2 pixels high. And the color of every pixel has the TileTypeID stored in 0-255 format.

As I said in the first post, I think the error is in the "int GetTileTypeFromMapData(int layerID)" function or one of the two "Coord" helper functions.
Because I have to calculate the tile the shader is currently trying to draw from the camera position and the pixel it is currently on. Then get the corresponding tile data from the "sampler2D tileMap", which has one pixel per tile with the tile information. And then get the pixel to draw from the "sampler2D tileSheet", which is my sprite sheet.
Getting the correct tile color from the spritesheet by SpriteSheetTileIndex (0 = air (transparent), 1 = wall, 2 = stone, 3 = water) works.
But it seems to fail to get the correct pixel from the "sampler2D tileMap", or it gets the wrong color information, which is converted back to a SpriteSheetTileIndex by multiplying the color with 255.
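One pitfall in that last step: the value the shader reads is the stored byte divided by 255, and multiplying back can come out fractionally low, in which case truncation yields the ID one below the stored one. Rounding is safer; a sketch (not the poster's actual code):

```cpp
// Recover a tile ID from a normalized channel value as read in the shader.
// Truncating int(c * 255.0) can land one below the stored ID when the
// sampled value is fractionally low; adding 0.5 before truncation rounds.
int decodeTileId(float channel) {
    return static_cast<int>(channel * 255.0f + 0.5f);
}
```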
« Last Edit: January 28, 2015, 12:40:07 am by Chain »

Gambit

  • Sr. Member
  • Posts: 283
Re: Shader does not draw Tilemap Correctly
« Reply #7 on: January 28, 2015, 12:41:21 am »
sadly I have no experience with shaders or OpenGL, as you can probably see by the mess of a shader.

Then why use shaders?

You don't "save" time by using a shader; it just defers the computations to the GPU. You should step through your code with a debugger. Unless you are compiling from the command line (which I highly doubt), almost every IDE (Xcode, Code::Blocks, Visual Studio) has a debugger. Step through your code and see what's happening.

Chain

  • Newbie
  • Posts: 5
Re: Shader does not draw Tilemap Correctly
« Reply #8 on: January 28, 2015, 12:46:54 am »
sadly I have no experience with shaders or OpenGL, as you can probably see by the mess of a shader.

Then why use shaders?

You don't "save" time by using a shader; it just defers the computations to the GPU. You should step through your code with a debugger. Unless you are compiling from the command line (which I highly doubt), almost every IDE (Xcode, Code::Blocks, Visual Studio) has a debugger. Step through your code and see what's happening.

Because without trying stuff and learning from mistakes, no one would know anything.

Regarding debugging, my C++ code is fine and works as expected. I am using Visual Studio 2013 and not a command line; you are correct.

Hapax

  • Hero Member
  • Posts: 3379
  • My number of posts is shown in hexadecimal.
Re: [SOLVED] Shader does not draw Tilemap Correctly
« Reply #9 on: January 29, 2015, 01:23:22 am »
Why is one subtracted here?
    vec2 textureCoords = vec2(tileMapTileCoord.x - 1 / tileMapDimensions.x, tileMapTileCoord.y - 1 / tileMapDimensions.y);
Does it work like this?
    vec2 textureCoords = vec2(tileMapTileCoord.x / tileMapDimensions.x, tileMapTileCoord.y / tileMapDimensions.y);
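The subtraction in the first line likely does something other than intended: in GLSL, as in C++, division binds tighter than subtraction, so it subtracts one texel width from the coordinate rather than shifting the tile coordinate by one before dividing. A tiny C++ check of the precedence (hypothetical values, width = 4):

```cpp
// `coord - 1 / dim` parses as `coord - (1 / dim)`: the coordinate minus
// one texel width, NOT `(coord - 1) / dim`.
float asWritten(float coord, float dim)     { return coord - 1.0f / dim; }
float probablyMeant(float coord, float dim) { return (coord - 1.0f) / dim; }
```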