
Author Topic: Laptop gets hot when running a simple tileset  (Read 3081 times)


magneonx

  • Full Member
  • ***
  • Posts: 141
Laptop gets hot when running a simple tileset
« on: July 31, 2013, 11:36:10 am »
I just want to ask if this is normal. I have a quad-core Intel i7 and a GT 640M with 2 GB of GDDR3 VRAM, and I am running my program on my laptop. Each core sits at 59-73°C, and I am only running a very simple 2D tileset. Is this a fault on my part, or maybe my laptop? I can play Torchlight II at 60-70°C and Crysis at 70-80°C, and this is just a simple 2D tile sprite.

I am just afraid it will get even hotter once I put more on my game screen. Is this normal, or does my program just suck?
Thanks!

Lo-X

  • Hero Member
  • *****
  • Posts: 618
Re: Laptop gets hot when running a simple tileset
« Reply #1 on: July 31, 2013, 11:58:33 am »
I don't know how your program is written; there is perhaps room for improvement (there always is with tilemaps :p), but I don't think it will get hotter with some more content.

Do you have a fixed timestep (an FPS limit)?
Is vertical sync enabled?

Heat is always a problem in a laptop. When you open a RenderWindow, OpenGL is activated and so is your GPU, and that is often a part that gets hot pretty fast in a laptop. Plus the CPU is working too.

eXpl0it3r

  • SFML Team
  • Hero Member
  • *****
  • Posts: 11030
Re: Laptop gets hot when running a simple tileset
« Reply #2 on: July 31, 2013, 12:13:09 pm »
Well, what's your CPU and GPU load then?

If you don't limit the frame time, your application will run as fast as possible and max out one core of your CPU.
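
For example (a minimal sketch assuming an SFML 2.x RenderWindow; the window size and title are arbitrary), capping the frame rate stops the loop from spinning at full speed:

#include <SFML/Graphics.hpp>

int main()
{
    sf::RenderWindow window(sf::VideoMode(800, 600), "Tilemap test");

    // Ask SFML to sleep between frames so the loop targets ~60 FPS
    // instead of redrawing as fast as the CPU/GPU allow.
    window.setFramerateLimit(60);

    while (window.isOpen())
    {
        sf::Event event;
        while (window.pollEvent(event))
        {
            if (event.type == sf::Event::Closed)
                window.close();
        }

        window.clear();
        // window.draw(tilemap); // draw your tiles here
        window.display();
    }
}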
Official FAQ: https://www.sfml-dev.org/faq.php
Official Discord Server: https://discord.gg/nr4X7Fh
——————————————————————
Dev Blog: https://duerrenberger.dev/blog/

magneonx

  • Full Member
  • ***
  • Posts: 141
Re: Laptop gets hot when running a simple tileset
« Reply #3 on: July 31, 2013, 12:40:31 pm »
Hello! Thanks! I don't know what V-sync does, but I sure as hell turn it off when I play Crysis and it gives me better FPS. I don't know what it does in a 2D setting. I'll try that.

This is something I keep asking about: what is a fixed timestep? I think I did something like this a long time ago; is that when you build an FPS calculation system so the game speed stays consistent across different CPU speeds? I am actually making a GridEditor; I need it for my RTS game.

I don't use XML, so I just use a simple text file that contains the grid data, and I also build the corresponding CollisionGrid from the loaded tileset.
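
For what it's worth, a rough sketch of one way to load such a grid from a plain text file (the format here, one row of whitespace-separated tile IDs per line, and the names are only assumptions for illustration, not magneonx's actual code):

#include <fstream>
#include <sstream>
#include <string>
#include <vector>

// Read a grid of tile IDs from a text file: one row per line,
// values separated by whitespace.
std::vector<std::vector<int>> loadGrid(const std::string& path)
{
    std::vector<std::vector<int>> grid;
    std::ifstream file(path);
    std::string line;

    while (std::getline(file, line))
    {
        std::istringstream row(line);
        std::vector<int> tiles;
        int id;
        while (row >> id)
            tiles.push_back(id);
        if (!tiles.empty())
            grid.push_back(tiles);
    }
    return grid;
}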

I haven't done any kind of fixed timestep in this particular prototype.

Also, I don't know how to see the GPU load, nor how to read it; I am just using widgets and HWMonitor to see stuff. GPU-Z crashes on me, along with CPU-Z.

Lo-X

  • Hero Member
  • *****
  • Posts: 618
Re: Laptop gets hot when running a simple tileset
« Reply #4 on: July 31, 2013, 01:38:02 pm »
Okay, so here is your problem =)

FPS: the number of frames per second your game displays (you know that, but keep it in mind).
V-Sync: to keep it simple, it's an option that tells the GPU to match the framerate to the screen's refresh rate. Most screens are 60 Hz, which means 60 FPS (60 images can be displayed each second).

So v-sync lets your computer rest a bit: instead of rendering 400 (arbitrary number) frames per second for a screen that can only display 60 of them (you lose up to 340 frames there), it only renders about 60 (more or less, on average).

Whether to activate it depends on the game and the computer:
- If you have a weak computer and a brand-new game with amazing realistic graphics, it will be pretty slow anyway. If you activate v-sync, the GPU (and then the CPU) will still try to reach 60 FPS because your screen can display it, so in that case it's maybe not a good idea.
- If you have an average or good computer and you're making a 2D game, or a game that doesn't need the latest graphics card, then your computer will have no problem reaching 60 FPS, and you save some CPU and GPU work (less work = less heat).

Look at the sf::RenderWindow documentation to see how to activate v-sync.
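Something like this (a minimal sketch; the window parameters are arbitrary):

sf::RenderWindow window(sf::VideoMode(800, 600), "My game");

// Sync rendering with the monitor's refresh rate (usually 60 Hz).
// Don't combine this with setFramerateLimit(); pick one or the other.
window.setVerticalSyncEnabled(true);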


A fixed timestep is a vast subject (but still pretty simple to implement). I can only tell you to read some articles or books; there are plenty of them around the web.
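
As a starting point, here is a rough sketch of one common fixed-timestep loop using sf::Clock (processEvents(), update() and render() are placeholders for your own functions):

const sf::Time timePerUpdate = sf::seconds(1.f / 60.f); // logic runs at 60 Hz
sf::Clock clock;
sf::Time accumulator = sf::Time::Zero;

while (window.isOpen())
{
    accumulator += clock.restart();

    // Run the game logic in fixed steps, however long the last frame took.
    while (accumulator >= timePerUpdate)
    {
        processEvents();
        update(timePerUpdate);
        accumulator -= timePerUpdate;
    }

    render(); // draw as often as we can
}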

Also, SFML has sf::RenderWindow::setFramerateLimit(), which sort of gives you a timestep, but I'm not sure it's the most efficient way to do it (I may be mistaken; I just don't use it).

There are also articles, and people around here, who can explain the pros and cons of v-sync and the best way to get a fixed timestep better than I can. But that's the idea.

magneonx

  • Full Member
  • ***
  • Posts: 141
Re: Laptop gets hot when running a simple tileset
« Reply #5 on: August 04, 2013, 10:59:27 am »
Thanks for the great info, guys! I am going to do that once I am done with my AI pathfinding. I have actually read some articles about fixed timesteps, and since I think my game won't require a fast computer, I am going to stick with deWitters' "Constant Game Speed with Maximum FPS". I've realized it is all about tradeoffs, so I'm picking that one. It is indeed a very, very large subject.
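
(For reference, a rough adaptation of that "Constant Game Speed with Maximum FPS" loop to SFML; updateGame() and drawGame() are placeholder names, and the 25 ticks-per-second value is just the article's example, not a requirement:)

const sf::Time tickRate = sf::milliseconds(1000 / 25); // 25 logic updates per second
const int maxFrameskip = 5;

sf::Clock clock;
sf::Time nextTick = sf::Time::Zero;

while (window.isOpen())
{
    // Catch up on game logic, but never more than maxFrameskip ticks per frame.
    int loops = 0;
    while (clock.getElapsedTime() > nextTick && loops < maxFrameskip)
    {
        updateGame();
        nextTick += tickRate;
        ++loops;
    }

    // How far we are between two ticks, in [0, 1], so rendering can interpolate.
    float interpolation = (clock.getElapsedTime() + tickRate - nextTick).asSeconds()
                          / tickRate.asSeconds();
    drawGame(interpolation);
}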

And now I am going to turn on V-sync for this game.

Thanks guys!

Lethn

  • Full Member
  • ***
  • Posts: 133
Re: Laptop gets hot when running a simple tileset
« Reply #6 on: August 04, 2013, 01:02:33 pm »
The other guys have given some very detailed explanations, but you should post your laptop's specifications if you're having trouble running your game. When you mentioned a laptop it set off my warning signals :D Most laptops should run 2D games fine, but you never know with the damn things; unless you know all the hardware it has, you won't know whether it's a coding problem or simply a hardware problem.

I'm sure if you're a tech person you'll already know about this, but the manufacturers who sell laptops generally tend to know nothing about building a nice PC, especially the gaming laptops. They'll put a fancy graphics card in and advertise that, but when you look closer you'll discover they've skimped on the CPU or the RAM.
« Last Edit: August 04, 2013, 01:04:55 pm by Lethn »

magneonx

  • Full Member
  • ***
  • Posts: 141
Re: Laptop gets hot when running a simple tileset
« Reply #7 on: August 04, 2013, 03:20:24 pm »
If you read my original post, I already specified my laptop's specs: it's a Core i7 with a GT 640M (2 GB of GDDR3 VRAM), and I have 4 GB of RAM.

Maybe it's just my game; somehow it just gets hot. I have yet to try a fixed timestep, but for now I have to solve my current problem at hand...

Lethn

  • Full Member
  • ***
  • Posts: 133
Re: Laptop gets hot when running a simple tileset
« Reply #8 on: August 04, 2013, 05:52:39 pm »
Oh crap! Lol, I didn't see it; I'm used to seeing them laid out in a list :D Sorry about that. Looking at those specs, it could well be your game: a laptop like that shouldn't have much trouble with 2D games unless you have a really outdated graphics card, and even then...