
Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.


Messages - ast

Pages: 1 [2]
16
Feature requests / Re: y-axis direction
« on: October 21, 2015, 11:00:12 pm »
Thanks for answers and good discussion!

My statement that the y-axis direction causes trouble for vector algebra is slightly incorrect -- the algebra does not care whether the y-axis points 'up' or 'down'. It is the interpretation of the calculation results where it matters. In my case I do game physics; nothing terribly hard, but with enough forces and torques acting, the equations tend to become large. My choice was to make the gravitational force 'negative' and the contact reaction forces 'positive', where negative and positive refer to the orientation of the force vector with respect to the y-axis. So I wanted the physics to be conventional and chose to cope with the transformations when drawing to the screen. I still think this was the right decision.

I also had the idea that the transformations would take place only when rendering to the screen, but that turned out not to be entirely true. It would hold if drawing were the only interaction with the screen coordinates. But I sometimes query a screen object's rotation angle, e.g. to find out where its surface normal points, and while doing so it is important to remember the opposite direction of the positive angle. Another example: in order to interact with Thor particle system particles I need their velocities, which are of course returned as vectors in screen coordinates. It is obvious that a velocity vector needs a different transformation than a position vector -- but only after first making a mistake there :).

Also, the scaling of the length unit is another reason to have separate coordinates for calculation and rendering. Instead of defining acceleration in pixels/s^2, I opted for the more common m/s^2, where one pixel is defined to correspond to a certain length measured in 'meters'. This keeps all the physics in SI units, and it is much easier to find good ballpark figures for the various quantities because the real world can be used as a reference.

So the total transformation is scaling, flipping the y-axis and translating the origin to the bottom left corner. If the origin and the y-axis direction had already been where I wanted them, only the scaling would have been needed.

The roots of the downward y-axis choice are quite understandable given CG history and convention. And while this is manageable and only slightly annoying at times, it would not hurt to have a standard way of handling these conversions, both to make things easier for newcomers and to allow at least somewhat less cluttered code.

The ultimate solution would have been if the ancient Greeks had invented the Commodore 64 before maths, so they could have had the downward y-axis set correctly in their maths too ;)

17
Feature requests / y-axis direction
« on: October 20, 2015, 10:57:47 pm »
Hi all,

This is not a direct feature request but more of an inquiry. Does anyone else find it at least a little annoying that the y-axis of SFML points downwards and not upwards, as in common maths? I am aware that this has a historical background, going back to someone deciding that the cathode ray beam should start drawing lines from the upper left corner of the tube.

Anyway, the current choice of coordinates seems to pose all sorts of trouble for vector algebra, together with the fact that the positive angle direction is clockwise, in contrast to the more common counter-clockwise convention. To get things working I implemented transforms back and forth between these downward-y 'screen' coordinates and my own coordinate system, in which I chose to put the origin in the bottom left corner with the y-axis pointing upwards. One caveat, though: a position vector must be transformed with a different transformation than its derivatives, such as velocity, acceleration or force, because the location of the origin affects only the position vector.

While this is manageable, I seem to end up with quite a lot of transformations when interacting with SFML functions, which could have been avoided if the coordinates of SFML had been chosen otherwise. The first question is: how do you others manage your coordinates? And the second: would it be feasible for SFML to also support an upward-pointing y-axis?

This is about the only thing that sometimes bugs me in SFML; otherwise it pretty much does what I would expect it to.

Thanks,
ast

18
General / Re: How to extend Thor particle system class? Monkey patching?
« on: October 10, 2015, 09:29:12 am »
Hi,

Thanks for the answer. I have other aspects to work on, so I am in no hurry. It is nice to know that the issue I bumped into is valid and already being addressed. The tip about exploiting some of the particle's existing data to store what I wanted may work too; I might even give it a shot. In my experience, both SFML and Thor have pretty much always worked as I would have expected, so thanks for the good libraries :).

Regards,
ast

19
General / How to extend Thor particle system class? Monkey patching?
« on: October 03, 2015, 10:38:05 pm »
Hi all,

I did some experimentation with Thor particle systems and ran into the following. My collision detection scheme needs to save state information in each object that can collide. Therefore, in order to make Thor particles collide, I would like to extend thor::Particle to contain new attributes for the collision state information, but of course without changing the Thor library files in any way. Is there a feasible way to do this?

Not being a seasoned C++ expert, I only came up with deriving a ParticleExtended from Particle and adding the things I want to the derived class. But since I am instantiating a ParticleSystem and not a Particle, this probably would not work that simply. The other option I can think of is to maintain a map from a particle pointer (Particle*) to my collision state data, so that in my CollisionAffector I could retrieve the correct data for each particle through the map. But this seems a bit awkward too. I would be glad to hear if anyone can suggest other alternatives.

Thanks,
ast

20
Graphics / Color blending problem with shaders
« on: April 02, 2015, 09:25:49 pm »
Hello all,

I have done some experimenting with SFML and shaders. The goal is to make a sprite to be surrounded with a colored glowing light. Following the example in 

http://thedaftdev.com/entry-13-SFML-and-shaders.-.html


I managed to get the shader working. I first draw all sprites and a green background to a RenderTexture, take the drawn texture with getTexture() and make it into a sprite. Then I draw this sprite to the RenderWindow using the shader and the blend mode BlendAdd. This all works well, but the glowing light is supposed to be red; when it is blended with the green background it appears yellow. With a black background the light is red. This is all fine -- the colors are numbers and they add up like that -- but the question is: how can I make the light be the color I want regardless of the background? I am sure this is a well-known problem with good answers already, but they seem to be hard to find.

Thanks for the great library, SFML is awesome!

Regards,
ast
