I have what I believe is a fairly solid argument regarding the issue of PeekMessage() vs. GetMessage().
Let's assume we have a specific goal: When the user minimizes the game, we want the game to consume 0% CPU. By 0% I mean time spent in our application/game code. This does not include the processing time the operating system spends managing our process. Let's keep it simple.
Below I've outlined two scenarios. Scenario 1 doesn't reach our goal at all, but it reflects the design that SFML currently imposes on the user. I'm presenting it to show why SFML, in its current architecture, could not possibly fulfill this very simple but very important design goal.
Scenario 2 does solve the problem, but it relies on an architectural design that is completely different from, and incompatible with, SFML's. This is essentially the design that kfriddle has been pushing for.
Scenario 1
Suppose the following game loop implementation (forgive/ignore any over-simplifications, subtle bugs, or other anomalies; this code has not been compiled):
#include <windows.h>

void TickGame();
void DrawGame();

int main()
{
MSG msg;
while( true )
{
// PM_REMOVE pops the message off the queue; passing 0 here
// (PM_NOREMOVE) would leave it in place and re-peek it forever.
if( PeekMessage( &msg, NULL, 0, 0, PM_REMOVE ) )
{
if( msg.message == WM_QUIT )
{
break;
}
TranslateMessage( &msg );
DispatchMessage( &msg );
}
TickGame();
DrawGame();
}
return 0;
}
The above code represents an over-simplified version of your typical "main game loop". This is basically the thing that feeds your entire game and provides it continuous processing. When the user minimizes the application, there is no way to suspend it: the while( true ) loop above will never end until the application is terminated by the user.
Because this loop never ends except under those circumstances, the game will keep consuming as much CPU as it can regardless of the state of the application, such as being minimized.
You may say, "Well let's just do this:"
#include <windows.h>

void TickGame();
void DrawGame();

// Assume the window procedure sets this when it sees
// WM_SIZE with wParam == SIZE_MINIMIZED (and clears it on restore).
bool bMinimized = false;

int main()
{
MSG msg;
while( true )
{
if( PeekMessage( &msg, NULL, 0, 0, PM_REMOVE ) )
{
if( msg.message == WM_QUIT )
{
break;
}
TranslateMessage( &msg );
DispatchMessage( &msg );
}
if( bMinimized )
{
Sleep( 1 );
}
else
{
TickGame();
DrawGame();
}
}
return 0;
}
I would then proceed to say you're evil. This does not solve the problem. Sleeping merely throttles the loop, and you have no idea how long the user will keep the application minimized. For the entire duration the application is minimized, there should be absolutely zero iterations of this while loop; no code that we control in the application should be getting processed. Any processing happening at application-level code is wasteful.
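As an aside: even without a second thread, you can get a true blocking wait by switching to GetMessage() while minimized. Here is a minimal sketch of that idea, reusing the hypothetical bMinimized flag from above. It is not the design I'm arguing for below, just an illustration of why blocking beats sleeping:

// Sketch: replacement for the if( bMinimized ) branch above.
if( bMinimized )
{
// GetMessage() suspends this thread until a message arrives, so the
// loop performs zero iterations for as long as we stay minimized.
if( GetMessage( &msg, NULL, 0, 0 ) <= 0 )
break; // WM_QUIT (or an error) ends the loop
TranslateMessage( &msg );
DispatchMessage( &msg ); // e.g. the WM_SIZE that restores us
}
else
{
TickGame();
DrawGame();
}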
Scenario 2
The code for this can get fairly extensive, so I'll only cover the most fundamental and important parts. In one thread (Thread #1), you would have this running continuously:
MSG msg;
// GetMessage() blocks until a message arrives, so this thread consumes
// no CPU while the queue is empty. It returns 0 for WM_QUIT and -1 on
// error, which is why the loop tests for > 0.
while( GetMessage( &msg, 0, 0, 0 ) > 0 )
{
TranslateMessage( &msg );
DispatchMessage( &msg );
}
Obviously the window would have been constructed in the same thread that processes the above loop, and for the purposes of this example let's assume the window procedure runs there as well.
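For concreteness, here is roughly what the relevant part of that window procedure might look like. The name g_hRunEvent is hypothetical; it refers to a manual-reset event (signaled = RESUME, unsignaled = PAUSE) created before either thread starts, as sketched further below:

// Hypothetical window procedure fragment for Thread #1.
LRESULT CALLBACK WndProc( HWND hWnd, UINT uMsg, WPARAM wParam, LPARAM lParam )
{
switch( uMsg )
{
case WM_SIZE:
if( wParam == SIZE_MINIMIZED )
ResetEvent( g_hRunEvent ); // PAUSE: Thread #2 blocks on its next wait
else
SetEvent( g_hRunEvent ); // RESUME: Thread #2 wakes up
return 0;

case WM_DESTROY:
PostQuitMessage( 0 );
return 0;
}
return DefWindowProc( hWnd, uMsg, wParam, lParam );
}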
In a completely different thread (Thread #2) you would have the following loop running:
while( true )
{
// Suspends this thread (0% CPU) while the game is paused. g_hRunEvent
// is the hypothetical manual-reset event from the sketch above:
// signaled = RESUME, unsignaled = PAUSE.
WaitForSingleObject( g_hRunEvent, INFINITE );
TickGame();
DrawGame();
}
Again, I do apologize for the over-simplifications; bear with me, this is mainly pseudo-code. The loop above processes the game normally until another thread tells it to PAUSE or RESUME (hence the WaitForSingleObject() call). If this thread is told to PAUSE, no game processing will occur until a matching RESUME request is given.
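To make the PAUSE/RESUME mechanism concrete, here is one way the event and Thread #2 could be wired up. All names here are hypothetical and error handling is omitted; this is a sketch, not a complete implementation:

#include <windows.h>

void TickGame();
void DrawGame();

// Manual-reset event gating the game loop: signaled = run, unsignaled = paused.
HANDLE g_hRunEvent = NULL;
volatile LONG g_bQuit = 0; // set by Thread #1 at shutdown

DWORD WINAPI GameThread( LPVOID ) // Thread #2
{
while( !g_bQuit )
{
// Blocks (0% CPU) for as long as the event stays unsignaled.
WaitForSingleObject( g_hRunEvent, INFINITE );
TickGame();
DrawGame();
}
return 0;
}

// In Thread #1, before entering the message pump:
//   g_hRunEvent = CreateEvent( NULL, TRUE, TRUE, NULL ); // manual-reset, initially signaled
//   HANDLE hThread = CreateThread( NULL, 0, GameThread, NULL, 0, NULL );
//
// At shutdown, Thread #1 would set g_bQuit, signal the event so the loop can
// observe the flag, and then wait on hThread before exiting.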
So let's tie all of this together. Typically I would use a sequence diagram to properly document this flow, so once again do bear with me while I describe the sequence of the application with a simple numbered list:
1. The application starts and Thread #1 is executed, which results in the message pump being processed.
2. When the user minimizes the application, the window procedure receives WM_SIZE with wParam == SIZE_MINIMIZED (there is no actual WM_MINIMIZED message in Win32), and Thread #1 handles it by atomically telling Thread #2 to PAUSE.
3. During this time, the application sits minimized consuming 0% CPU: the game loop in Thread #2 is suspended, and Thread #1 is blocked inside GetMessage() rather than continuously spamming calls to PeekMessage().
4. When the user restores/maximizes the window, the corresponding message is handled and results in a RESUME request being sent to Thread #2, which causes the game loop to continue processing.
Conclusion
As kfriddle has been saying all along, the design SFML utilizes (Scenario 1) is antiquated. I won't rehash his arguments since they were well spoken; I'm simply giving a detailed side-by-side example to make his points a bit more concrete. There is no sense in justifying memory leaks; many have already told you this is just plain EVIL in every sense of the word. There is also no sense in justifying polling, since, as I've just explained in detail, it prevents the application from conserving CPU when it has nothing to do.