
Author Topic: Running server causes client to use more CPU, even when not connected.  (Read 1003 times)


rlwarrior32

  • Newbie
  • Posts: 5
Hello,

I am using SFML 2.2 with Visual Studio 2013.

I have a client and a separate server program. When I run the client and let it sit at the login screen, it uses between 0 and 1 percent CPU.

If I then start the server program (JUST start it, without opening any sockets or connecting to anything), the client program suddenly uses over 20 percent CPU, even though I haven't touched the client at all. If I close the server program, the client goes back to using 0-1 percent CPU.

Any ideas why this would be happening?
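In case it helps, the client's main loop is just the standard SFML pattern. Here is a simplified sketch (not my exact code; the setFramerateLimit call is what keeps it at 0-1 percent when idle):

Code:
#include <SFML/Graphics.hpp>

int main()
{
    sf::RenderWindow window(sf::VideoMode(800, 600), "Client");
    window.setFramerateLimit(60); // cap the loop so it doesn't spin a full core

    while (window.isOpen())
    {
        // Drain pending events each frame
        sf::Event event;
        while (window.pollEvent(event))
        {
            if (event.type == sf::Event::Closed)
                window.close();
        }

        window.clear();
        // ... draw the login screen here ...
        window.display();
    }
    return 0;
}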

Thanks!

Ixrec

  • Hero Member
  • Posts: 1241
Re: Running server causes client to use more CPU, even when not connected.
« Reply #1 on: December 23, 2014, 08:39:25 pm »
I think we'd need to see complete and minimal client/server code to help with this, since it's more likely to be a bug in your code than anything else. Also, are they running on the same machine? The same executable? The same process? If they're on separate machines, this really does seem impossible.
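For comparison, a server that truly does nothing should idle at about 0 percent CPU. A hypothetical stub like this (not your code) is the kind of minimal baseline I mean:

Code:
#include <SFML/System.hpp>

int main()
{
    // A do-nothing server: sleeping in the loop keeps CPU near zero.
    // A bare while (true) {} with no sleep or blocking call would pin
    // a whole core instead.
    while (true)
        sf::sleep(sf::milliseconds(100));
}

Cutting your real server down toward a stub like this, until the client's CPU stops jumping, should narrow down which part of the server triggers it.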

 
