
Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - DragonDePlatino

1
C / Re: error: invalid use of undefined type 'struct sfTexture'
« on: October 02, 2016, 03:43:10 pm »
Ah...pointers to pointers! I'm new to C so this is something I have not encountered yet. Thanks for the help.
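
For anyone who finds this thread later: the fix is to store an array of sfTexture* pointers, since sfTexture is an opaque type in CSFML that can only be handled through pointers. Here is a rough sketch of the corrected snippet, simplified and with the ERROR macro swapped for fprintf, rather than the project's exact code:

Code: [Select]
#include <SFML/Graphics.h>
#include <stdio.h>
#include <stdlib.h>

/* An array of pointers to opaque sfTexture objects -- note the extra '*'. */
sfTexture** textures;
int texturenum = 2;

void loadTextures(void)
{
    /* One zero-initialised slot per texture pointer. */
    textures = calloc(texturenum, sizeof(sfTexture*));

    const char* filepath = "../img/tiles.png";
    textures[0] = sfTexture_createFromFile(filepath, NULL);

    if (textures[0] == NULL)
        fprintf(stderr, "Failed to load texture: %s\n", filepath);
}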

2
C / [SOLVED] error: invalid use of undefined type 'struct sfTexture'
« on: October 02, 2016, 04:37:50 am »
Hi! I just picked up C and CSFML and I'm having some issues loading textures. My compiler is telling me that sfTexture is an incomplete type. The error in question:

Code: [Select]
||=== Build: Release in Crucis (compiler: GNU GCC Compiler) ===|
C:\Users\Platino\Documents\C\Crucis\src\window.c||In function 'loadTextures':|
C:\Users\Platino\Documents\C\Crucis\src\window.c|113|error: invalid use of undefined type 'struct sfTexture'|
C:\Users\Platino\Documents\C\Crucis\src\window.c|113|error: dereferencing pointer to incomplete type|
C:\Users\Platino\Documents\C\Crucis\src\window.c|114|error: invalid use of undefined type 'struct sfTexture'|
C:\Users\Platino\Documents\C\Crucis\src\window.c|114|error: dereferencing pointer to incomplete type|
||=== Build failed: 4 error(s), 0 warning(s) (0 minute(s), 1 second(s)) ===|

From what I understand, there is a declaration but no definition of sfTexture? I'm confused as to why this is an issue. I've been using sfRenderWindow, sfRectangleShape and sfVertexArray from the same header with no problems. Am I doing something wrong? Here is the simplified code from window.c:

Code: [Select]
sfTexture* textures;
int texturenum = 2;

void loadTexture(void)
{
    textures = malloc(sizeof(sfTexture*) * texturenum);
    memset(textures, 0, sizeof(sfTexture*) * texturenum);

    const char* filepath = "../img/tiles.png";
    textures[0] = sfTexture_createFromFile(filepath, NULL); // Line 113

    if(textures[0] == NULL) // Line 114
        ERROR("Failed to load texture: ", filepath);
}

There's a lot going on with loading filenames from JSON and such, but I've triple-checked that's working correctly. Have I not correctly installed CSFML?

3
Graphics / Re: Loading a shader: pre-mature EOF parse error
« on: July 27, 2016, 12:33:52 am »
Raw string literals? Hmm! I've never heard of those. And considering I'm learning C++ as I go, that's not a huge surprise.

Anyways, that works great! Now I can run the executable without an external fog.frag. Thank you for your help. Also, could the tutorial here be updated with this information?
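
For reference, here's roughly what the embedded shader looks like as a raw string literal (a sketch, assuming C++11 or newer). Because the raw string keeps real newlines, the // comments no longer run to the end of one enormous line, which is presumably why the concatenated version hit a premature EOF:

Code: [Select]
#include <string>

const std::string shaderdata = R"(
uniform sampler2D objtexture;
uniform sampler2D fogtexture;
uniform sampler2D lighttexture;

void main()
{
    // Load textures into pixels
    vec4 objpixel = texture2D(objtexture, gl_TexCoord[0].xy);
    vec4 fogpixel = texture2D(fogtexture, gl_TexCoord[0].xy);
    vec4 lightpixel = texture2D(lighttexture, gl_TexCoord[0].xy);

    // Draw objects if a lighttexture pixel is fully-transparent
    // Otherwise, hide objects behind fog
    bool changealpha = bool(ceil(lightpixel.a));
    objpixel = vec4(lightpixel.rgb * float(changealpha) + objpixel.rgb * float(!changealpha),
                    lightpixel.a * float(changealpha) + objpixel.a * float(!changealpha));
    objpixel = mix(objpixel, fogpixel, fogpixel.a);

    gl_FragColor = objpixel;
})";

The string can then be passed to sf::Shader::loadFromMemory(shaderdata, sf::Shader::Fragment) instead of loadFromFile.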

4
Graphics / Re: Spritesheets or individual images?
« on: July 12, 2016, 06:47:57 am »
Like eXpl0it3r said, big textures with many sprites are your best bet. As an artist, I also find them easier to maintain than many smaller images. If you're curious to see the largest texture size your computer supports, try running this:

Code: [Select]
#include <SFML/Graphics.hpp>
#include <cstdlib>
#include <iostream>

void getMaxTextureSize()
{
    int size = 1;
    sf::Texture testtexture;
    while(testtexture.create(size, size))
    {
        size *= 2;
        std::cout << "Created texture of size " << testtexture.getSize().x << ", " << testtexture.getSize().y << std::endl;
    }
    system("pause");
}

On both my brother's computer and mine, the maximum is a surprising 16384. I imagine mobile devices have a much smaller limit. If anyone knows, what's the expected texture size limit of a lower-end smartphone?
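
Side note: if you just want the number, SFML can also report the limit directly with sf::Texture::getMaximumSize(), which asks the graphics driver for its maximum texture dimension, so the create-until-failure loop isn't strictly needed. A minimal sketch:

Code: [Select]
#include <SFML/Graphics.hpp>
#include <iostream>

int main()
{
    // Asks the graphics driver for the largest supported texture dimension.
    std::cout << "Maximum texture size: " << sf::Texture::getMaximumSize() << std::endl;
    return 0;
}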

5
Graphics / [SOLVED] Loading a shader: pre-mature EOF parse error
« on: July 12, 2016, 02:53:55 am »
For my game project, I have a fog of war shader that hides out-of-view enemies:

Code: [Select]
uniform sampler2D objtexture;
uniform sampler2D fogtexture;
uniform sampler2D lighttexture;

void main()
{
    // Load textures into pixels
    vec4 objpixel = texture2D(objtexture, gl_TexCoord[0].xy);
    vec4 fogpixel = texture2D(fogtexture, gl_TexCoord[0].xy);
    vec4 lightpixel = texture2D(lighttexture, gl_TexCoord[0].xy);
   
    // Draw objects if a lighttexture pixel is fully-transparent
    // Otherwise, hide objects behind fog
    bool changealpha = bool(ceil(lightpixel.a));
    objpixel = vec4((lightpixel.rgb) * float(changealpha) + objpixel.rgb * float(!changealpha), lightpixel.a * float(changealpha) + objpixel.a * float(!changealpha));
    objpixel = mix(objpixel, fogpixel, fogpixel.a);
   
    gl_FragColor = objpixel;
}

When running the program from Code::Blocks, everything works fine. When I run the executable directly, though, it can't find the fog.frag file. I could include fog.frag in the download, but I'd rather not have users cheat by editing the shader. As a solution, I tried embedding the shader in my program like the example shown here.

After running it through a Notepad++ macro to eliminate any human error, my shader now looks like this:

Code: [Select]
const std::string shaderdata = "uniform sampler2D objtexture;" \
"uniform sampler2D fogtexture;" \
"uniform sampler2D lighttexture;" \
"" \
"void main()" \
"{" \
"    // Load textures into pixels" \
"    vec4 objpixel = texture2D(objtexture, gl_TexCoord[0].xy);" \
"    vec4 fogpixel = texture2D(fogtexture, gl_TexCoord[0].xy);" \
"    vec4 lightpixel = texture2D(lighttexture, gl_TexCoord[0].xy);" \
"    " \
"    // Draw objects if a lighttexture pixel is fully-transparent" \
"    // Otherwise, hide objects behind fog" \
"    bool changealpha = bool(ceil(lightpixel.a));" \
"    objpixel = vec4((lightpixel.rgb) * float(changealpha) + objpixel.rgb * float(!changealpha), lightpixel.a * float(changealpha) + objpixel.a * float(!changealpha));" \
"    objpixel = mix(objpixel, fogpixel, fogpixel.a);" \
"    " \
"    gl_FragColor = objpixel;" \
"}";

Unfortunately, when I compile, I get this error:

Code: [Select]
Failed to compile fragment shader:
Fragment shader failed to compile with the following errors:
ERROR: 0:1: error(#131) Syntax error: pre-mature EOF parse error
ERROR: error(#273) 1 compilation errors.  No code generated

If I print out my shaderdata string, it looks just fine. If I remove all of the comments, line breaks and empty lines, I get the same exact error. Looking at the special characters in Notepad++, all I see are line breaks, tabs and spaces. Could someone please explain what I'm doing wrong?

6
Graphics / Re: [SOLVED] Incorrect scaling of custom VertexArray
« on: July 01, 2016, 01:33:39 am »
Nah. I don't know a lick of OpenGL programming. I'm still pretty green when it comes to SFML. :P
Code: [Select]
for(int col = camera.mintile[y]; col <= camera.maxtile[y]; col++)
{
    for(int row = camera.mintile[x]; row <= camera.maxtile[x]; row++)
    {
        Tile* tile = level->getTile(row, col);
        sf::IntRect renderrect = { tile->drawclip.left - camera.camerarect.left,
                                   tile->drawclip.top - camera.camerarect.top,
                                   tile->drawclip.width,
                                   tile->drawclip.height };

        tilemap.addTile(&renderrect, &tile->imageclip);
    }
}

That's the main loop responsible for drawing just the midground tiles. Before rendering, I pass my map's size and the player's position to the camera class. The camera decides the range of tiles to draw (mintile and maxtile) as well as which part of the map is being drawn (camerarect). Each tile in the map stores its draw position (drawclip) and its location on the texture (imageclip). I pass both of those to my VertexArray, offsetting them with the camera.

I do the same for background tiles, monsters and objects. It might sound excessive having each tile store its own drawclip and imageclip, but I'm making a roguelike so I want things to be as flexible as possible. :D
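
To make that a bit more concrete, here is a rough sketch of how a camera like this could fill in mintile/maxtile and camerarect each frame. This is not the project's actual code; the 32x32 tile size, the map dimensions and the player position parameter are assumptions for illustration:

Code: [Select]
#include <SFML/Graphics.hpp>
#include <algorithm>

enum Axis { x = 0, y = 1 };
const int TILESIZE = 32; // assumed tile size in pixels

struct Camera
{
    sf::IntRect camerarect; // part of the map currently on screen, in pixels
    int mintile[2];         // first visible tile index on each axis
    int maxtile[2];         // last visible tile index on each axis

    void update(sf::Vector2i playerpos, int mapwidth, int mapheight)
    {
        // Centre the visible rectangle on the player.
        camerarect.left = playerpos.x - camerarect.width / 2;
        camerarect.top  = playerpos.y - camerarect.height / 2;

        // Turn the pixel rectangle into an inclusive range of tile indices,
        // clamped to the edges of the map.
        mintile[x] = std::max(0, camerarect.left / TILESIZE);
        mintile[y] = std::max(0, camerarect.top / TILESIZE);
        maxtile[x] = std::min(mapwidth - 1, (camerarect.left + camerarect.width) / TILESIZE);
        maxtile[y] = std::min(mapheight - 1, (camerarect.top + camerarect.height) / TILESIZE);
    }
};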

7
Graphics / Re: [SOLVED] Incorrect scaling of custom VertexArray
« on: June 30, 2016, 05:54:18 pm »
I'm not using sf::View. I have a custom camera class that handles the view and only draws tiles that are intersecting the window.

8
Graphics / Re: Incorrect scaling of custom VertexArray
« on: June 30, 2016, 05:38:21 pm »
OK! I found my solution. After experimenting a bit, it seems I have to resize the tile VertexArray to be smaller if I make my window size larger. Why? I have no idea.

Code: [Select]
sf::FloatRect initsize(0, 0, 640, 360);
sf::IntRect buffersize(0, 0, initsize.width, initsize.height);

... Do some window resizing and update buffersize ...

sf::Transform tiletransform;
tiletransform.scale(initsize.width / buffersize.width, initsize.height / buffersize.height);
sf::RenderStates tilerender;
tilerender.transform = tiletransform;
window.draw(tilemap, tilerender);

Here, initsize is the initial window size on startup and buffersize is the window size after resizing. Every time you draw a custom VertexArray directly to a resized window, you need to shrink it by the same amount the window was upscaled. That will produce the desired effect seen in the 2nd image of the OP.

9
Graphics / Re: Texture saved in Ram
« on: June 30, 2016, 05:23:33 am »
Let's say M3TJ20 has a texture that stores the Fog of War of a level on a per-pixel basis. That's something that would need to be saved and loaded again later. To do that, export your texture like so:

Code: [Select]
texturename.copyToImage().saveToFile("filename.png");
And then load it again with:

Code: [Select]
sf::Texture newtexture;
newtexture.loadFromFile("filename.png");

And like Limeoats said, do not save a texture in your database if it is going to be the same every time the user runs the program. This should only be done for user-made textures. Also, the user will be able to open the file and edit it. You would need to encrypt it somehow if you don't want the user doing this.

10
Graphics / Re: Incorrect scaling of custom VertexArray
« on: June 30, 2016, 05:13:38 am »
No, I cannot speak fluent Spanish. I would like to get an answer here so other people can learn from it.

It's OK if you can't solve the problem. I'll just wait and see if anyone else has an answer.

11
Graphics / Re: Incorrect scaling of custom VertexArray
« on: June 30, 2016, 03:27:54 am »
And that's where I'm confused. I'm not passing my window size into my shader, but things only work if I put them through the shader. If I pass my tile texture through the shader like this...

Code: [Select]
uniform sampler2D tiletexture;
uniform sampler2D objtexture;
uniform sampler2D fogtexture;
uniform sampler2D lighttexture;

void main()
{
        // Load textures into pixels
        vec4 tilepixel = texture2D(tiletexture, gl_TexCoord[0].xy);
        vec4 objpixel = texture2D(objtexture, gl_TexCoord[0].xy);
        vec4 fogpixel = texture2D(fogtexture, gl_TexCoord[0].xy);
        vec4 lightpixel = texture2D(lighttexture, gl_TexCoord[0].xy);
       
        // Draw objects if a lighttexture pixel is fully-transparent
        // Otherwise, hide objects behind fog
        bool changealpha = bool(ceil(lightpixel.a));
        objpixel = vec4((lightpixel.rgb) * float(changealpha) + objpixel.rgb * float(!changealpha), lightpixel.a * float(changealpha) + objpixel.a * float(!changealpha));
        objpixel = mix(objpixel, fogpixel, fogpixel.a);
        tilepixel = mix(tilepixel, objpixel, objpixel.a);
       
        gl_FragColor = tilepixel;
}

Then resize my window, this happens:



That is what I want. But if I remove the tiletexture from my shader...

Code: [Select]
uniform sampler2D objtexture;
uniform sampler2D fogtexture;
uniform sampler2D lighttexture;

void main()
{
        // Load textures into pixels
        vec4 objpixel = texture2D(objtexture, gl_TexCoord[0].xy);
        vec4 fogpixel = texture2D(fogtexture, gl_TexCoord[0].xy);
        vec4 lightpixel = texture2D(lighttexture, gl_TexCoord[0].xy);
       
        // Draw objects if a lighttexture pixel is fully-transparent
        // Otherwise, hide objects behind fog
        bool changealpha = bool(ceil(lightpixel.a));
        objpixel = vec4((lightpixel.rgb) * float(changealpha) + objpixel.rgb * float(!changealpha), lightpixel.a * float(changealpha) + objpixel.a * float(!changealpha));
        objpixel = mix(objpixel, fogpixel, fogpixel.a);
       
        gl_FragColor = objpixel;
}

And then I draw the tiles with a VertexArray and resize the window, this happens:



I don't understand why I have to pass my tiletexture through the shader for it to resize correctly. Right now, I'm passing everything through the shader whether or not the shader is editing it. Once I add in a texture for GUI and a parallax background, I don't want to have to pass those too. :(

12
Graphics / [SOLVED] Incorrect scaling of custom VertexArray
« on: June 30, 2016, 01:05:20 am »
Hello! I'm having some problems with the scaling of a custom VertexArray.

In my engine I have a texture for tiles, objects and lights. Tiles and objects are drawn via a custom VertexArray like the one shown here. Lights are drawn using sf::Shape. I draw the objects, tiles and lights to their textures, then pass everything through a shader. The result is a fog of war system that masks out-of-view objects:



If I resize the window, all of my textures resize accordingly and I can see a further distance:



Here's my problem: I don't want to draw my tile VertexArray to a texture then pass it through my shader. It seems wasteful because I'm not modifying it in my shader. Instead, I want to draw my tile VertexArray directly to the screen then draw the shader's texture over that. If I do this, it works just fine (like in the first image). The problem is when I resize the window. This happens:



Everything passed through the shader resizes correctly, but the tile VertexArray does not. The larger I resize the window, the further offscreen the tile VertexArray stretches. The smaller I make the window, the less the VertexArray takes up the screen.

If I debug my VertexArray, everything seems to be fine. Each tile quad is still 32x32 in size and the larger the screen is, the more vertices are added. I think SFML might be forcibly scaling my tile VertexArray, which I do not want! >:(

For the record, I have also tried rendering my tile VertexArray to a texture, making a sprite from that and rendering it to the window. I get the same bad results.
