Thursday, 16 February 2012

On framerate

In recent discussions I've noticed that developers have a lot of ideas about the best frame rate for a game. Unfortunately many of those ideas are wrong, often for very straightforward reasons. As I've been thinking about this again for the game I've just started, I thought it was worth a post.


The main point is that there are only two frame rates you should consider: 30fps and 60fps. And 90% or 95% of the time you should choose 30fps. Why?

All displays have a refresh rate: the frequency at which they redraw the screen. This used to vary greatly, as CRTs were cranked up to refresh rates of 100Hz or more to eliminate flicker, and games followed by supporting very high frame rates if the computer, display and graphics card would allow it.


But LCDs don't have a problem with flicker, and need not run at such a high rate. Instead they almost all run at 60Hz, fast enough that everything on-screen appears smooth and responsive.


There's never any reason to use a frame rate higher than the screen's refresh rate. If, for example, you have a frame rate of 80fps, so are sending 80 updates to the screen every second, and the refresh rate is 60Hz, then at best 20 of those updates will be thrown away. At worst you'll see tearing: a new frame arrives partway through a refresh, two frames get stitched together on screen, and a noticeable flaw or "tear" appears at the join.
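
To make that waste concrete, here's a toy simulation (my own illustration, written in TypeScript since the logic is the same in any language): a display that shows the most recent finished frame at each refresh, fed by a faster render loop.

    // Toy model: at each refresh the display shows the latest completed
    // frame; any frame overwritten before a refresh is never seen.
    function droppedFrames(renderFps: number, refreshHz: number): number {
      const shown = new Set<number>();
      for (let r = 0; r < refreshHz; r++) {
        // Index of the latest frame finished at or before this refresh.
        shown.add(Math.floor((r * renderFps) / refreshHz));
      }
      return renderFps - shown.size;
    }

    console.log(droppedFrames(80, 60)); // 20: a quarter of the rendering wasted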


Less obviously, it's also a bad idea to use frame rates close to but below 60fps. If, for example, a frame rate of 50fps is used, that's ten fewer frames than the number of updates the screen needs, so some frames will be displayed once and some twice. The result is uneven animation, and short-lived effects such as explosions can appear brighter simply because they're on screen for longer.
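
The same toy model shows the unevenness. Counting how many refreshes each of the 50 frames occupies during one second on a 60Hz display (again just a sketch under the assumptions above):

    function displayCounts(renderFps: number, refreshHz: number): number[] {
      const counts = new Array<number>(renderFps).fill(0);
      for (let r = 0; r < refreshHz; r++) {
        counts[Math.floor((r * renderFps) / refreshHz)]++;
      }
      return counts;
    }

    console.log(displayCounts(50, 60).join(""));
    // "2111121111..." repeating: ten frames linger for two refreshes,
    // the other forty get just one, and that mix is the judder you see.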


So if the frame rate is 60fps, a game will display smoothly. If the frame rate is 30fps, each frame is displayed exactly twice on a 60Hz display, with equal intervals between frames and each on screen for the same time, so again the animation appears smooth.
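
In a browser, a 30fps loop might look something like this minimal sketch. It assumes requestAnimationFrame fires once per 60Hz refresh, and update() and render() are hypothetical stand-ins for a real game's own functions.

    const STEP_MS = 1000 / 30; // one fixed 33.3ms logic step

    function update(dtMs: number): void { /* advance the game state */ }
    function render(): void { /* draw the current state */ }

    let last = performance.now();
    let accumulator = 0;

    function tick(now: number): void {
      accumulator += now - last;
      last = now;
      while (accumulator >= STEP_MS) {
        update(STEP_MS); // logic always advances in fixed steps
        accumulator -= STEP_MS;
      }
      // render() runs every refresh, but the state only changes every
      // other one, so each game frame sits on screen for two refreshes.
      render();
      requestAnimationFrame(tick);
    }
    requestAnimationFrame(tick);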


As for which is best, 30fps is fine in almost all situations. The main reason is that it's so common no game will be punished for using it. TV programs and films, and the vast majority of console and portable games, run at 30fps or less. Cinema films run at 24fps, and no-one thinks they need to run at 2.5 times that rate.


30fps gives you more than twice as long to do your updates: not only is there longer between updates, there's also less overhead in processing them. Even if you think you could manage 60fps, running at half that rate makes your game more scalable, so it runs well on old machines or on sites where JavaScript or Flash adverts grab large chunks of processing power.
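
Here's the arithmetic on that time budget, plus a hypothetical guard (the names are mine) that could help spot overruns when testing on slower machines:

    const BUDGET_30 = 1000 / 30; // ≈ 33.3ms to update and render each frame
    const BUDGET_60 = 1000 / 60; // ≈ 16.7ms, half the headroom
    console.log(`per-frame budget: ${BUDGET_30.toFixed(1)}ms vs ${BUDGET_60.toFixed(1)}ms`);

    // Warn when an update overruns its budget, a cheap way to notice
    // old machines or ad-heavy pages during testing.
    function timedUpdate(update: () => void, budgetMs: number = BUDGET_30): void {
      const start = performance.now();
      update();
      const elapsed = performance.now() - start;
      if (elapsed > budgetMs) {
        console.warn(`update took ${elapsed.toFixed(1)}ms of a ${budgetMs.toFixed(1)}ms budget`);
      }
    }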


The next frame rate that divides evenly into 60Hz is 20fps, with each frame displayed for three refreshes. I would avoid it though: being lower than other games and media it's noticeable, and will make a game seem slow and poorly optimised. The same goes for slower rates such as 15fps, 12fps and 10fps, unless a low rate is used deliberately, and probably temporarily, for effect.


But: if you think your game is lightweight enough, you can eliminate all the bottlenecks that make it slow, and there's a gameplay reason for it, then 60fps is worth considering. Flash doesn't have to be slow and inefficient, and especially with 2D games there's rarely a need to max out the CPU at 30fps, which leaves headroom to spare.
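
If you do go for 60fps, one possible fallback (entirely my own sketch, with illustrative names and thresholds) is to drop to 30fps on machines that can't keep up:

    let targetFps = 60;
    let missedInARow = 0;

    // Call this at the end of each frame with how long it took.
    function onFrameEnd(frameTimeMs: number): void {
      if (frameTimeMs > 1000 / targetFps) {
        missedInARow++;
      } else {
        missedInARow = 0;
      }
      // Half a second of missed frames suggests this machine can't keep up.
      if (targetFps === 60 && missedInARow >= 30) {
        targetFps = 30;
        console.log("Dropping to 30fps on this machine");
      }
    }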
