Hello,
I'm relatively new to network programming. So far I have created a server program and a client program, and everything has been going well: multiple clients can connect to the server and send packets back and forth.
Everything was fine until last night. During testing I had a client connect to my server and let the client program sit idle for about five minutes. When I then sent a ping command, I got a 300+ms ping, and this was on a local connection. Another ping command right after resulted in the typical 0ms ping. If I let the client idle for ten minutes and try again, the packet never reaches the server at all.
At this point there isn't any continuous back and forth, and no "keep-alive" packets or anything like that. I only send a packet when one is needed; nothing checks whether a client is still around.
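If the answer turns out to be that I need keep-alives, I'm guessing it would be one of these two approaches (just a sketch; the HEARTBEAT opcode is something I made up, not part of my protocol):

```java
import java.io.IOException;
import java.net.Socket;
import java.nio.ByteBuffer;
import java.nio.channels.SocketChannel;

public class KeepAliveSketch {
    // Hypothetical opcode for an application-level heartbeat packet.
    private static final byte HEARTBEAT = 0x00;

    public static void enableTcpKeepAlive(SocketChannel channel) throws IOException {
        // Option 1: let the OS probe the idle connection.
        // The probe interval is OS-controlled (often 2 hours by default),
        // so this may not kick in fast enough for my 5-10 minute idle case.
        Socket socket = channel.socket();
        socket.setKeepAlive(true);
    }

    public static void sendHeartbeat(SocketChannel channel) throws IOException {
        // Option 2: send a tiny packet myself every minute or so
        // from a timer, so the connection never looks idle.
        ByteBuffer buf = ByteBuffer.wrap(new byte[] { HEARTBEAT });
        while (buf.hasRemaining()) {
            channel.write(buf);
        }
    }
}
```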
Now my question is: why do you think this is happening? If the client talks continuously I never lose the connection; it only drops when it idles. I'm wondering whether this is how it should be, with the connection dropped if nothing is heard for a while. I am at work and can't post my actual code right now, but it is essentially what you see in the tutorials.
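Roughly, from memory, the client side looks like the standard non-blocking Selector loop, something like this (a sketch only; the host, port, and buffer size are placeholders, not my real values):

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.nio.ByteBuffer;
import java.nio.channels.SelectionKey;
import java.nio.channels.Selector;
import java.nio.channels.SocketChannel;
import java.util.Iterator;

public class ClientSketch {
    public static void main(String[] args) throws IOException {
        Selector selector = Selector.open();

        // Non-blocking connect, registered with the selector.
        SocketChannel channel = SocketChannel.open();
        channel.configureBlocking(false);
        channel.connect(new InetSocketAddress("localhost", 4000)); // placeholder port
        channel.register(selector, SelectionKey.OP_CONNECT);

        ByteBuffer buffer = ByteBuffer.allocate(1024);

        while (true) {
            selector.select();
            Iterator<SelectionKey> it = selector.selectedKeys().iterator();
            while (it.hasNext()) {
                SelectionKey key = it.next();
                it.remove();

                if (key.isConnectable()) {
                    // Finish the non-blocking connect, then listen for reads.
                    SocketChannel ch = (SocketChannel) key.channel();
                    if (ch.finishConnect()) {
                        key.interestOps(SelectionKey.OP_READ);
                    }
                } else if (key.isReadable()) {
                    SocketChannel ch = (SocketChannel) key.channel();
                    buffer.clear();
                    int read = ch.read(buffer);
                    if (read == -1) {
                        // Server closed the connection.
                        key.cancel();
                        ch.close();
                    }
                    // ... handle the packet ...
                }
            }
        }
    }
}
```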
I am using Java 1.6 and TCP through a java.nio Selector, if that matters. Any theories or suggestions would be great.