Title: sf::Vector2f limit
Post by: sergiones on April 13, 2013, 08:18:12 pm
Hi!
I'm having trouble figuring out why my program keeps giving me a segmentation fault when I declare an array of type
sf::Vector2f[number_of_vectors] once number_of_vectors exceeds 1,100,000.
I was planning a massive particle simulation with sf::VertexArray points1(sf::Points, number_of_vectors), and it worked fine until I tried to create velocity vectors with sf::Vector2f vel[number_of_vectors].
Can anyone guess why the VertexArray easily manages to exceed 300,000,000 but the Vector2f array can't even reach 1,100,000?
Thanks, I'm quite a newbie in SFML :P
I'm using SFML 2.0 by the way...
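Roughly what I'm doing, simplified (the real code differs a bit, names are just from this post):

#include <SFML/Graphics.hpp>
#include <cstddef>

const std::size_t number_of_vectors = 1100000;

int main()
{
    sf::VertexArray points1(sf::Points, number_of_vectors); // this part works fine
    sf::Vector2f vel[number_of_vectors];                     // segfaults once the count gets this big
}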
Title: Re: sf::Vector2f limit
Post by: Grimshaw on April 13, 2013, 08:32:52 pm
There is virtually no limit on array size.
You're probably getting a std::bad_alloc exception because you're allocating a huge array and don't have enough contiguous memory free. Try again with fewer programs open, check whether you have enough RAM, or simply allocate smaller arrays. You can also divide the particles among multiple smaller arrays instead of one big one, for example as in the sketch below. Try it and give us some feedback! :D
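Something like this, just a sketch (the names are placeholders, not anything from your code):

#include <SFML/System/Vector2.hpp>
#include <algorithm>
#include <cstddef>
#include <vector>

// One big allocation split into several smaller ones.
std::vector<std::vector<sf::Vector2f>> velocityChunks;

void allocateVelocities(std::size_t total, std::size_t chunkSize)
{
    for (std::size_t done = 0; done < total; done += chunkSize)
        velocityChunks.push_back(std::vector<sf::Vector2f>(std::min(chunkSize, total - done)));
}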
Title: Re: sf::Vector2f limit
Post by: FRex on April 13, 2013, 08:35:37 pm
This has nothing to do with SFML.
I'm assuming you're not newing that array. sf::VertexArray uses std::vector, which takes dynamic memory from the heap, and the heap is effectively unlimited*. Stack space (used for fixed-size local arrays, local variables and function calls) is very limited, around a couple of MB depending on the system, compiler, settings etc., and no minimum is mandated by the standard. That might seem like a lot, but 1024^2 ints or floats is already 4 MB. Bjarne Stroustrup recommends not assuming anything about how much stack you're given and using handles to heap memory when you need large amounts of space.
*RAM is physically limited, but actually running out of it is quite hardcore, and the system might start swapping to fake even more. Also, running out of heap space is recoverable (you can catch the bad_allocs that get thrown and start freeing memory), while running out of stack space is not (I think), because your program instantly dies with a segfault or stack corruption (or something similar sounding :P), since no more automatic variables or function calls can be made at all.
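To illustrate (just a sketch, not your actual code):

#include <SFML/System/Vector2.hpp>
#include <iostream>
#include <new>
#include <vector>

int main()
{
    // sf::Vector2f local[2000000];             // ~16 MB on the stack -> almost certain crash
    try
    {
        std::vector<sf::Vector2f> vel(2000000); // same data on the heap -> fine
    }
    catch (const std::bad_alloc&)
    {
        std::cerr << "out of heap memory, free something and retry\n"; // recoverable, unlike a stack overflow
    }
}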
Title: Re: sf::Vector2f limit
Post by: sergiones on April 13, 2013, 09:28:58 pm
Thank you for the replies!
Well, I don't think RAM is the culprit this time. I have 8 GB, and in the system monitor the program "only" used 70 MB for 2,000,000 vector points.
Also, I tried changing the vectors to a struct. It worked! ... but only while loading. Now, when the data is processed, I get the same segmentation fault.
It seems like I'm going to have to split the vectors into different chunks ... any ideas?
Title: Re: sf::Vector2f limit
Post by: FRex on April 13, 2013, 09:38:27 pm
Are you using stack space or not?
Title: Re: sf::Vector2f limit
Post by: sergiones on April 13, 2013, 09:51:10 pm
Well, I wanted to use Vector2f from SFML, which is dynamic, but my intention is to keep the same number of vectors and only modify their values, so I didn't care much.
Title: Re: sf::Vector2f limit
Post by: FRex on April 13, 2013, 09:56:13 pm
You're (probably) running out of stack space, so use std::vector<sf::Vector2f> and see if it helps.
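I.e. instead of the raw array, something along these lines (sketch, the count is just an example):

#include <SFML/System/Vector2.hpp>
#include <cstddef>
#include <vector>

int main()
{
    const std::size_t number_of_vectors = 2000000;
    std::vector<sf::Vector2f> vel(number_of_vectors); // allocated on the heap, not the stack
    vel[0] = sf::Vector2f(1.f, 2.f);                  // element access stays the same as with the plain array
}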
Title: Re: sf::Vector2f limit
Post by: sergiones on April 13, 2013, 10:13:32 pm