SFML community forums

Help => General => Topic started by: rlwarrior32 on December 23, 2014, 08:36:24 pm

Title: Running server causes client to use more cpu, even when not connected.
Post by: rlwarrior32 on December 23, 2014, 08:36:24 pm
Hello,

I am using SFML-2.2, visual studio 2013.

I have a client and a separate server program. When I run the client and let it sit at the login screen, it uses between 0 and 1 percent CPU.

Now if I start the server program (JUST start it, without opening any sockets or connecting to anything), the client suddenly uses over 20 percent CPU, even though I didn't touch the client at all. If I then close the server program, the client goes back to using 0-1 percent CPU.

Any ideas why this would be happening?

Thanks!
Title: Re: Running server causes client to use more cpu, even when not connected.
Post by: Ixrec on December 23, 2014, 08:39:25 pm
I think we'd need to see complete and minimal client/server code to help with this, since it's more likely a bug in your code than anything else. Also, are they running on the same machine? Same executable? Same process? If they're on separate machines, this really does seem impossible.