PS4 and Xbox One vs 1080p, 60 FPS: Numbers Don’t Always Tell the Whole Story

As the release dates of the PS4 and the Xbox One approach, the specifics of many of the most anticipated games on both consoles are starting to solidify, while others are still being rumored. Often those specifics don’t spark joy and anticipation; instead they instigate rage and hostility whenever a title fails to hit the “magical” numbers: 1080p and 60 frames per second.

When I read the reactions, on both sides of the console war’s battlefield, every time a development studio announces that its game is going to miss one of these much coveted goals, it’s hard not to feel that the PS4 and the Xbox One aren’t fighting each other, but are instead fighting the numbers that embody people’s expectations.

To be fair, the numerical expectations aren’t born out of thin air. 1080p has been a relatively standard resolution for PC gaming for a long while (a stable 60 FPS or more, on the other hand, is reserved for fairly powerful PCs, especially in visually intensive games), but more importantly, the 1080p/60 FPS combo was a not so silent, and most definitely not fulfilled, promise made by many developers at the beginning of the previous generation.

That’s why it’s not so surprising to see many gamers disappointed by the fact that such a promise largely won’t be fulfilled even at the beginning of this upcoming generation.


Those developers and publishers that made that promise years ago made a mistake, as they shifted our attention to details that are easily quantifiable and look great when plastered all over a presentation, but don’t really tell the whole story about a game’s visual quality. As a matter of fact, they could be considered almost marginal.

Don’t get me wrong: 60 frames per second is very nice to have, especially for the quickest and twitchiest of games, and that’s why most developers of those titles tend to prioritize frame rate over resolution. That said, it’s not really needed in slower paced games, in which most gamers won’t really notice much of a difference between 30 and 60.
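What those two numbers actually mean for a developer is a time budget: how many milliseconds the hardware gets to render each frame. A quick back-of-the-envelope calculation (in Python, purely for illustration) shows why doubling the frame rate is so demanding:

```python
# Frame-time budget: at a given frame rate, the console has this many
# milliseconds to finish all rendering and game logic for one frame.
for fps in (30, 60):
    budget_ms = 1000 / fps
    print(f"{fps} FPS -> {budget_ms:.1f} ms per frame")
```

Going from 30 to 60 FPS halves the budget from roughly 33.3 ms to 16.7 ms, which is exactly why everything else on screen has to fit in half the time.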

1080p resolution is even more marginal by itself in achieving a pleasant visual impact, especially considering that next generation consoles will upscale the native resolution of games to 1080p, further reducing the gap between different pixel counts. If you want to see an example, you can check out a simulation I ran with Battlefield 4. Of course the difference exists, but as you can see from that article, I had to go to quite some lengths to help people even notice it. And we’re talking about stills. Just think how much harder it is to see those details when everything is moving on a screen six feet from your face.
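The gap between those resolutions is easy to put in raw numbers. This little sketch (illustrative only) counts the pixels rendered per frame at two common sub-1080p resolutions and compares them to full 1080p:

```python
# Total pixels rendered per frame at common resolutions, and how many
# times more rendering work native 1080p represents compared to each.
full_hd = 1920 * 1080  # 2,073,600 pixels
for name, (w, h) in {"720p": (1280, 720), "900p": (1600, 900)}.items():
    pixels = w * h
    print(f"{name}: {pixels:,} pixels -> 1080p renders {full_hd / pixels:.2f}x as many")
```

900p is only 1.44x fewer pixels than 1080p, and once the output is upscaled to fill the same 1080p screen, that gap shrinks even further to the eye.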

Another numerical element that is somewhat less “popular” (but still quite heavily discussed) than frame rate and resolution is polygon count. Many seem to think that a higher polygon count automatically means better visuals. Any 3D artist or modeler could tell you that it doesn’t, as a very high triangle density normally just means that the model isn’t optimized.


When creating a 3D model in a professional game development environment, the first prototypes are normally pretty rough, featuring a lot of spurious areas, and even polygons that sit inside the model itself, hidden from view. They also lack high resolution normal mapping, which normally makes a lot of the raw polygons simply redundant. That’s why models go through stages of refining and “retopologizing,” during which all the unnecessary polygons are shaved off without impacting visual quality, freeing hardware resources that can be spent on other features and effects.

The concept of resources is an important one. Make no mistake: no matter how powerful your console is, the computational and rendering resources it can provide will always be finite. This means that developers will always have to juggle features and find the best trade-off between resolution, frame rate, polygon count and other, less quantifiable elements like effects, antialiasing, lighting and shading, the number of characters on screen, draw distance, texture resolution, shaders and so forth.

All those elements contribute to the visual fidelity and to the spectacle offered by a game, even if they aren’t as easily represented by raw numbers. As a matter of fact, many of them influence the final beauty of a title more than resolution.

Lighting, for instance, is probably the single most important element in determining the final quality of a game’s visuals. A game could be rendered in 4K resolution, have millions of polygons and run locked at 120 FPS, but a sub-par lighting engine would make it look like absolute crap, with on-screen models that seem flat and artificial.


On the other hand, a developer that sacrifices a few thousand pixels to implement a more powerful and spectacular lighting engine will most probably produce a game that ultimately looks a lot better. Of course you can’t go too low in resolution before the visuals really start to degrade, but it’s a trade-off, and one of the most important tasks in development is striking the right balance.

Another example that’s very easy to envision is draw distance. In most last generation games that have you traveling through exterior environments (and sometimes interiors too), you’ll notice that faraway objects are hidden from view by some sort of mist, called distance fog. Its purpose is exactly to limit the draw distance so that the hardware has to render fewer models and do less work, maintaining a decent frame rate and level of stability.
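The trick behind that mist can be sketched with the classic exponential fog formula many engines use: the farther away something is, the more its color is blended into the fog color until it vanishes entirely. A minimal illustration in Python (the density value and function names are my own, not any particular engine’s):

```python
import math

def fog_factor(distance, density=0.02):
    """Exponential distance fog: how much of an object's own color
    survives at a given distance (1.0 = fully visible, ~0.0 = fully
    swallowed by the fog). The density value is purely illustrative."""
    return math.exp(-density * distance)

def apply_fog(color, fog_color, distance, density=0.02):
    """Blend an object's RGB color toward the fog color with distance."""
    f = fog_factor(distance, density)
    return tuple(f * c + (1 - f) * fc for c, fc in zip(color, fog_color))
```

Once an object is far enough that its fog factor is effectively zero, the engine can skip rendering it altogether, which is exactly the savings this technique is after.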

Looking at the footage of many next generation games, that distance fog has been either removed or pushed much farther away. This should not be underestimated, as all those newly visible objects need to be rendered by the hardware, resulting in a much higher overall polygon count and many more textures that need to be stored in memory.

The less theoretical effect is that the vistas we’ll be able to enjoy will be much bigger and more beautiful, letting us see much farther away, like we do in the real world. That’s most definitely not a small graphical improvement by any stretch of the imagination.


The same could be said about many other elements and effects that can’t easily be expressed in numbers like “1080p, 60 FPS,” but are often more relevant than raw resolution and frame rate to the final result you’ll see in front of your eyes. And yes, developers will most probably sacrifice resolution and/or frame rate in order to implement elements and effects that, in the end, simply make the game look better, numbers be damned.

I quite often hear people say that a game is “not next gen!” when it doesn’t hit 1080p and 60 FPS, but “next gen” isn’t defined just by numbers. While the hardware resources provided by the upcoming PS4 and Xbox One can’t compete with a top-end gaming PC that’ll easily cost you three or more times as much (and why should they, really? You get what you pay for), they do provide a very steep increase in computational and rendering power compared to the PS3 and the Xbox 360.

At times all that raw power will go towards resolution and frame rate, but many times it will be invested in bigger worlds in which you can see much farther, better lighting that makes everything look more natural and “alive,” smarter enemies designed to challenge us much more effectively, and so forth.

1080p and 60 FPS look very good on paper, but we don’t play paper. We play often extremely complex entities called games, which are made of a lot more than simple numbers. Maybe it’s time to look at the whole picture instead, and be excited again. I guarantee that it’ll make the beginning of the next generation more enjoyable for us all.