
Show Posts



Messages - Guni

1
Network / Re: Problems sending multiple messages
« on: July 04, 2012, 09:21:56 pm »
Thanks for the insight, binary1248. From what you're saying, it seems like what's happening in my case is that the OS buffer doesn't "turn on" until after I send one packet, and no packets can be sent during this "warmup phase". Then again, this behavior is probably undefined and dependent on hardware.

Practically, I guess this means I cannot make the assumption that if you send a large number of packets, at least one will make it to the other side.
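
Practically, I suppose the way around that is some kind of acknowledgement scheme on top of UDP. A rough sketch of what I have in mind (my own assumption, not something from binary1248's reply), using sf::SocketSelector to wait for an ack with a timeout and resending otherwise:

#include <SFML/Network.hpp>

int main()
{
    sf::UdpSocket socket;
    socket.bind(8041);

    sf::SocketSelector selector;
    selector.add(socket);

    sf::Packet data;
    data << sf::Int32(42);

    bool acknowledged = false;
    while (!acknowledged)
    {
        // (re)send the packet, then wait up to 200 ms for any reply
        socket.send(data, sf::IpAddress("127.0.0.1"), 8040);

        if (selector.wait(sf::milliseconds(200)))
        {
            sf::Packet ack;
            sf::IpAddress sender;
            unsigned short port;
            if (socket.receive(ack, sender, port) == sf::Socket::Done)
                acknowledged = true;   // got an ack, stop resending
        }
    }
}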

2
Network / Re: Problems sending multiple messages
« on: July 04, 2012, 10:13:25 am »
Quote
As far as I know, UDP sockets do not care whether all the packets actually get delivered. You could try TCP instead.

Yeah I know, I'm just wondering about this odd behavior of UDP packets.

3
Network / Re: Problems sending multiple messages
« on: July 04, 2012, 09:00:23 am »
It seems that if I send 1 packet, sleep for 1 ms, and then send 100 more packets without any delay in between, all 101 packets get delivered. However, if I do not add that 1 ms delay after the first packet is sent, only that first packet gets through.
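
Roughly, the sender for that first experiment looks like this (the same sender as in my original post, just with one packet sent up front and a 1 ms sleep after it):

#include <SFML/Network.hpp>
#include <SFML/System.hpp>

int main()
{
    sf::UdpSocket socket;
    socket.bind(8041);
    sf::IpAddress loopback("127.0.0.1");

    // send one packet first, then sleep 1 ms...
    sf::Packet first;
    first << sf::Int32(0);
    socket.send(first, loopback, 8040);
    sf::sleep(sf::milliseconds(1));

    // ...then send the remaining 100 packets with no delay at all
    for (int i = 1; i <= 100; ++i)
    {
        sf::Packet packet;
        packet << sf::Int32(i);
        socket.send(packet, loopback, 8040);
    }
}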

So I tried something else: sending 1000 packets without any delay. Interestingly enough, the first packet gets through, the next 130-140 get dropped, and then the rest arrive normally. (I am pretty sure I am not overflowing the send or receive buffers with only 1000 packets.)

It seems as though after the first packet gets sent, there is a "warmup period", during which no packet can be sent. Perhaps something is happening at the hardware/OS level that is causing this behavior.

The only experiment I've been able to run over the actual Internet (rather than just the loopback address) is the code in the original post, and I get the same results. If I send 100 packets with no delay in between, only the first packet arrives. If I send 100 packets with a 1 ms delay between each packet, every packet arrives (though maybe the 2nd packet, "Message: 1", gets dropped frequently; I don't remember).

4
Network / Problems sending multiple messages
« on: July 04, 2012, 07:02:33 am »
Hi, I have some really simple code that sends 100 consecutive numbers.

Sender:
#include <SFML/Network.hpp>
#include <iostream>

int main()
{
        std::cout << "Sending\n";
        sf::UdpSocket socket;
        socket.bind(8041);        // send from port 8041
        socket.setBlocking(true);

        sf::IpAddress loopback("127.0.0.1");
        for (int i = 0; i < 100; i++)
        {
                sf::Packet packet;
                packet << sf::Int32(i);                // each packet carries one number
                socket.send(packet, loopback, 8040);   // send to the receiver on port 8040
        }
        std::cout << "Done sending\n";
}

Receiver:
#include <SFML/Network.hpp>
#include <iostream>

int main()
{
        std::cout << "Ready\n";
        sf::UdpSocket socket;
        socket.setBlocking(true);
        socket.bind(8040);        // listen on port 8040
        while (true)
        {
                sf::Packet packet;
                sf::IpAddress address;
                unsigned short port;
                socket.receive(packet, address, port);   // blocks until a packet arrives
                std::cout << "Packet received from " << address.toString() << " port: " << port << "\n";
                sf::Int32 i;
                packet >> i;
                std::cout << "Message: " << i << "\n";
        }
}

When I turn on the receiver, then run my sender, I get this output on the receiver's console:
Ready
Packet received from 127.0.0.1 port: 8041
Message: 0
instead of 100 messages from 0 to 99, which is what I expect.

When I add this line of code to add a slight delay in the sender
sf::sleep(sf::Time(sf::milliseconds(1)));
right after this line:
socket.send(packet, loopback, 8040);

I get every message, except that the 2nd message ("Message: 1") mysteriously gets lost most, but not all, of the time. Why do I need this delay here? And any clues as to why only the second message keeps getting lost so consistently?
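
For clarity, the whole send loop with the delay in place looks like this:

for (int i = 0; i < 100; i++)
{
        sf::Packet packet;
        packet << sf::Int32(i);
        socket.send(packet, loopback, 8040);
        sf::sleep(sf::Time(sf::milliseconds(1)));   // the delay that makes (nearly) every packet arrive
}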

5
Network / Re: UDP fragmentation and reassembly
« on: July 02, 2012, 09:31:09 am »
Well, I believe UDP datagrams are fragmented at the IP level, and that if even one of those fragments is lost, the whole datagram is lost. So I'm guessing that if you send a large UDP packet using sf::UdpSocket::send(), it will be fragmented at the IP level, but SFML will not know about that fragmentation and so cannot do anything about it. sf::UdpSocket is just a fairly thin layer over raw UDP sockets. Is this correct or not?
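
To illustrate what I mean by a thin layer (my own sketch, assuming SFML 2.0's sf::Packet::getDataSize() and the sf::UdpSocket::MaxDatagramSize constant):

#include <SFML/Network.hpp>
#include <iostream>

int main()
{
    sf::Packet packet;
    // ... fill the packet with data ...

    // SFML can tell you when a packet is too big to fit in a single UDP datagram at all,
    // but it cannot see (or recover from) IP-level fragmentation of anything smaller.
    if (packet.getDataSize() > sf::UdpSocket::MaxDatagramSize)
        std::cout << "Too large for a single UDP datagram\n";
}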

6
Network / UDP fragmentation and reassembly
« on: July 01, 2012, 11:09:32 pm »
Hi, I'm using SFML 2.0 release candidate, but I am learning how to use the Network package with the SFML 1.6 tutorial for now.

In this tutorial: http://www.sfml-dev.org/tutorials/1.6/network-packets.php
Quote
The third problem is more network related. Data transfers through the TCP and UDP protocols must follow some rules defined by the lower levels of implementation. In particular, a chunk of data can be split and received in several parts; the receiver must then find a way to recompose the chunk and return the data as if it was received at once.
This seems to imply that SFML will handle fragmentation and reassembly if you send packets that are too large. If you send a large packet over UDP and it is split into a number of smaller IP fragments, and even one of those fragments fails to arrive, will the entire message be lost?

7
I am pretty sure this is the case (why else would SFML offer threading if locking were not reliable), but I just wanted some confirmation, since the documentation makes no guarantee that these operations are atomic. I'm targeting Windows, OS X, and Linux.
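
For context, what I'm asking about is just the standard mutex guard. A minimal sketch, assuming SFML 2.0's sf::Mutex, sf::Lock, and sf::Thread:

#include <SFML/System.hpp>
#include <iostream>

sf::Mutex mutex;   // protects 'counter'
int counter = 0;

void increment()
{
    for (int i = 0; i < 100000; ++i)
    {
        sf::Lock lock(mutex);   // locks here, unlocks when 'lock' leaves scope
        ++counter;
    }
}

int main()
{
    sf::Thread thread(&increment);
    thread.launch();
    increment();                     // main thread increments concurrently
    thread.wait();                   // wait for the worker thread to finish
    std::cout << counter << "\n";    // 200000 if the locking is reliable
}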

8
Window / Enabling the depth buffer
« on: March 11, 2012, 03:59:28 am »
My problem was solved by flipping the 0.0f and 1.0f in glDepthRange, i.e. glDepthRange(1.0f, 0.0f); I guess there is something wrong with my GL code. Sorry to bother you.

9
Window / Enabling the depth buffer
« on: March 10, 2012, 11:01:54 pm »
I call
Code: [Select]
glClearColor(0.0f, 0.1f, 0.0f, 0.0f);
glClearDepth(1.0f);
glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
before every frame, and it still does not work. I suspect the problem is something on my end, but could there be any other issue on the SFML side?

Edit: Every frame, not every draw call.

10
Window / Enabling the depth buffer
« on: March 10, 2012, 10:44:31 pm »
I definitely did do the whole
Code: [Select]
glEnable(GL_DEPTH_TEST);
glDepthMask(GL_TRUE);
glDepthFunc(GL_LEQUAL);
glDepthRange(0.0f, 1.0f);
thing. I'm pretty sure I have put these OpenGL calls in the right place. I can still draw things to the window; they just aren't depth tested properly (things behind other objects appear in front of them). When I put this identical OpenGL code in a freeglut framework, depth testing works.

What I suspect is causing this is that while SFML creates a context and window, it does not attach a depth buffer to the default framebuffer the context is created with. Am I wrong?

Edit: This leads me to believe that I must allocate and attach a depth renderbuffer myself, using functions like glGenRenderbuffers, then glBindRenderbuffer, then glRenderbufferStorage, then glFramebufferRenderbuffer, roughly as in the sketch below.
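
(A rough, untested sketch of that setup, based on the usual FBO recipe rather than anything SFML-specific; as far as I know, renderbuffers can only be attached to framebuffer objects you create yourself, not to the window's default framebuffer.)

// Sketch: create a framebuffer object and give it a depth attachment.
const GLsizei width = 500, height = 500;   // assumed window size
GLuint fbo = 0, depthRbo = 0;

glGenFramebuffers(1, &fbo);
glBindFramebuffer(GL_FRAMEBUFFER, fbo);

glGenRenderbuffers(1, &depthRbo);
glBindRenderbuffer(GL_RENDERBUFFER, depthRbo);
glRenderbufferStorage(GL_RENDERBUFFER, GL_DEPTH_COMPONENT24, width, height);
glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_DEPTH_ATTACHMENT, GL_RENDERBUFFER, depthRbo);

// ... a color texture or renderbuffer still has to be attached as well, then:
if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
    fprintf(stderr, "Framebuffer incomplete\n");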

11
Window / Enabling the depth buffer
« on: March 10, 2012, 09:58:24 pm »
Well, when I used freeglut, allocating the depth buffer was handled by freeglut, and since freeglut is cross-platform, that gave me a cross-platform way to get a depth buffer. At least that's how I think it works. Since SFML is supposed to replace freeglut, I figured there would be some function within SFML to allocate the depth buffer. I assumed it was just a matter of setting the DepthBits field in sf::ContextSettings and using that to create a window, but I guess not. Thanks anyway!

Edit: Actually allocating a depth buffer seems to be an OpenGL function, not a freeglut function.

12
Window / Enabling the depth buffer
« on: March 10, 2012, 08:15:41 am »
I'm using SFML 2.0, on Windows 7 Ultimate.

I want to create an OpenGL context that enables the depth buffer. Here is how I'm currently going about it:
Code: [Select]
#include <GL/glew.h>
#include <SFML/Window.hpp>
#include <cstdio>
#include <iostream>

int main()
{
    // depth = 24, stencil = 8, antialiasing = 2, OpenGL 3.3
    sf::ContextSettings context(24, 8, 2, 3, 3);
    sf::Window window(sf::VideoMode(500, 500, 32), "SFML Window", 7U, context); // 7U == sf::Style::Default

    { // Loads OpenGL functions
        glewExperimental = GL_TRUE;

        GLenum err = glewInit();
        if (GLEW_OK != err)
        {
            /* Problem: glewInit failed, something is seriously wrong. */
            fprintf(stderr, "Error: %s\n", glewGetErrorString(err));
        }
        fprintf(stdout, "Status: Using GLEW %s\n", glewGetString(GLEW_VERSION));
    }

    sf::ContextSettings windowSettings = window.GetSettings();
    std::cout
        << "windowSettings.DepthBits: " << windowSettings.DepthBits << "\n"
        << "windowSettings.StencilBits: " << windowSettings.StencilBits << "\n"
        << "windowSettings.AntialiasingLevel: " << windowSettings.AntialiasingLevel << "\n"
        << "windowSettings.MajorVersion: " << windowSettings.MajorVersion << "\n"
        << "windowSettings.MinorVersion: " << windowSettings.MinorVersion << "\n";
    window.SetActive();
}


The output is what I expect:
Code: [Select]
Status: Using GLEW 1.7.0
windowSettings.DepthBits: 24
windowSettings.StencilBits: 8
windowSettings.AntialiasingLevel: 2
windowSettings.MajorVersion: 3
windowSettings.MinorVersion: 3


I thought that creating a window with an sf::ContextSettings requesting some number of DepthBits automatically enables the depth buffer, but writing to and reading from the depth buffer does not seem to work. When I wrap the exact same OpenGL code in a freeglut framework rather than an SFML framework, depth testing works.

13
Window / Sending OpenGL commands after window is closed?
« on: March 03, 2012, 09:06:53 pm »
Yes, I am making too much of a relatively unimportant problem. However, if I put the event loop at the end of my main loop and, say, stall the main loop until 1/60 of a second has passed, wouldn't that cause a perpetual "one frame delay" in the visual responsiveness of the game? Again, probably not a particularly important consideration.
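
Roughly, the loop order I mean (just a sketch, not my actual code):

while (window.IsOpened())
{
    // draw the current frame first...
    // OpenGL commands...
    window.Display();

    // ...and only then handle input, so whatever the player did during this frame
    // is not reflected until the next frame is drawn.
    sf::Event anEvent;
    while (window.PollEvent(anEvent))
    {
        if (anEvent.Type == sf::Event::Closed)
            window.Close();
    }
}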

14
Window / Sending OpenGL commands after window is closed?
« on: March 03, 2012, 02:04:54 am »
Hi, my question is this: Is it safe to send OpenGL commands when a window has been set as the active window, then closed?

Hopefully that is a yes or no question. If not, here is a more detailed description of my problem.

I have been using OpenGL with SFML for some time now. I am running Windows 7 Ultimate, with SFML 2.0, with static linking. This problem occurs both when I create a context with OpenGL 3.3 and 4.1.

In my main loop, the program polls for input, issues its OpenGL commands, then calls Display() on the active window. If sf::Event::Closed is detected while polling for events, Close() is called on the active window immediately.

What this means is that for at least one iteration of the main loop, OpenGL commands are being sent despite the active window being closed.

It looks something like this:

Code: [Select]

sf::Window window(sf::VideoMode(800, 600), "SFML window");
// initialize OpenGL (set up shaders, etc.)

while (window.IsOpened())
{
    sf::Event anEvent;
    while (window.PollEvent(anEvent))
    {
        if (anEvent.Type == sf::Event::Closed)
            window.Close();
    }

    // OpenGL commands...
    window.Display();
}


After the window is closed, the main loop exits, and the program quits.

This works fine, but then I made some additions to my OpenGL code to use Vertex Array Objects and depth buffering, without changing any SFML-related code, and now when I close the window, instead of exiting smoothly, the program crashes. Debugging shows that sending a draw command and then modifying a uniform variable causes the crash. This is kind of unusual, since OpenGL commands themselves do not generally cause a program to crash.

To work around this, I have had to modify the main loop a little bit.

Code: [Select]

while (window.IsOpened())
{
    sf::Event anEvent;
    bool isOpen = true;
    while (window.PollEvent(anEvent))
    {
        if (anEvent.Type == sf::Event::Closed)
            isOpen = false;     // defer the actual Close() until after drawing
    }

    // OpenGL commands...
    window.Display();
    if (!isOpen)
        window.Close();
}


Now, OpenGL commands are never being sent to a closed window.

15
General discussions / Need GLEW for SFML?
« on: February 20, 2012, 10:19:46 pm »
Thanks
