PS4 and Xbox One vs 1080p, 60 FPS: Numbers Don’t Always Tell the Whole Story

As the release dates of the PS4 and the Xbox One approach, the specifics of many of the most anticipated games on both consoles are starting to solidify, while others remain the subject of rumor. Often the news doesn’t spark joy and anticipation; instead it instigates rage and hostility when a title fails to hit the “magical” numbers: 1080p and 60 frames per second.

Reading the reactions every time a development studio announces that its game will miss one of these apparently much-coveted goals, on both sides of the console war’s battlefield, it’s hard not to feel that the PS4 and the Xbox One aren’t really fighting each other; they’re fighting the numbers that embody people’s expectations.

To be fair, these numerical expectations weren’t born out of thin air. 1080p has been a fairly standard resolution for PC gaming for a long while (a stable 60 FPS or more, on the other hand, is reserved for fairly powerful PCs, especially in visually intensive games), but more importantly, the 1080p/60 FPS combo was a not-so-silent, and most definitely unfulfilled, promise made by many developers at the beginning of the previous generation.

That’s why it’s not so surprising to see many gamers disappointed that the promise largely won’t be fulfilled even at the beginning of this upcoming generation.


The developers and publishers that made that promise years ago made a mistake: they shifted our attention to details that are easily quantifiable and look great plastered all over a presentation, but that don’t really tell the whole story about a game’s visual quality. In fact, they could be considered almost marginal.

Don’t get me wrong: 60 frames per second is very nice to have, especially for the quickest and twitchiest of games, which is why developers of those titles tend to prioritize frame rate over resolution. That said, it’s not really needed in slower-paced games, where most gamers won’t notice much of a difference between 30 and 60.

A native 1080p resolution is even more marginal by itself in achieving a pleasant visual impact, especially considering that next-generation consoles upscale games to 1080p, further narrowing the gap between different pixel counts. If you want an example, check out a simulation I ran with Battlefield 4. Of course the difference exists, but as you can see from that article, I had to go to quite some length to help people even notice it. And we’re talking about stills; just think how much harder it is to see those details when everything is moving on a screen six feet from your face.
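To put some rough numbers on this (a quick back-of-the-envelope sketch, using the usual 16:9 dimensions for each label, not anything from the Battlefield 4 simulation itself):

```python
# Rough sketch: raw pixel counts behind the common resolution labels.
# Dimensions follow the standard 16:9 conventions.
RESOLUTIONS = {
    "720p": (1280, 720),
    "900p": (1600, 900),
    "1080p": (1920, 1080),
}

def pixel_count(name):
    width, height = RESOLUTIONS[name]
    return width * height

native = pixel_count("1080p")  # 2,073,600 pixels
for name in RESOLUTIONS:
    px = pixel_count(name)
    print(f"{name}: {px:,} pixels ({px / native:.0%} of native 1080p)")
```

After upscaling, all three fill the same 1080p frame; the missing pixels show up as softness rather than a smaller picture, which is a big part of why the gap is so hard to spot in motion.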

Another numerical element that is somewhat less “popular” (but still quite heavily discussed) than frame rate and resolution is polygon count. Many seem to think that a higher polygon count automatically means better visuals. Any 3D artist or modeler could tell you that it doesn’t, as very high triangle density normally just means that the model isn’t optimized.


When creating a 3D model in a professional game development environment, the first prototypes are normally pretty rough, featuring a lot of spurious geometry and even polygons buried inside the model itself, invisible from any angle. They also lack high-resolution normal mapping, which normally makes a lot of the raw polygons simply redundant. That’s why models go through stages of refinement and “retopologizing,” during which all the unnecessary polygons are shaved off without impacting visual quality, freeing hardware resources that can be used for other features and effects.
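A toy calculation (with entirely made-up triangle counts and a simplified per-vertex size) shows what that shaving actually buys in hardware terms:

```python
# Sketch with illustrative numbers: what retopology saves in vertex data.
# Assumes 32 bytes per vertex (position, normal, UV, tangent) and roughly
# one unique vertex per triangle after indexing -- both rough assumptions.
BYTES_PER_VERTEX = 32

def vertex_buffer_kib(triangles, verts_per_tri=1.0):
    return triangles * verts_per_tri * BYTES_PER_VERTEX / 1024

raw = vertex_buffer_kib(150_000)    # hypothetical unoptimized sculpt
retopo = vertex_buffer_kib(25_000)  # retopologized mesh + normal map for detail
print(f"raw: {raw:.0f} KiB, retopo: {retopo:.0f} KiB, saved: {raw - retopo:.0f} KiB")
```

Multiply a saving like that across every character and prop in a scene and it becomes budget the engine can spend on effects instead.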

The concept of resources is an important one. Make no mistake: no matter how powerful your console is, the computational and rendering resources it can provide will always be finite. This means developers will always have to juggle features and strike the best trade-off between resolution, frame rate, polygon count and other, less quantifiable elements like effects, antialiasing, lighting and shading, the number of characters on screen, draw distance, texture resolution, shaders and so forth.
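One way to picture that juggling act (a toy model with invented millisecond costs, not how any real engine schedules its work) is as a frame-time budget: at 60 FPS every pass has to fit into 16.7 ms, at 30 FPS into 33.3 ms.

```python
# Toy frame-time budget. The per-pass costs are illustrative, made-up numbers.
PASS_COST_MS = {
    "geometry": 6.0,
    "lighting": 9.0,
    "post-processing/AA": 4.0,
    "simulation/AI": 5.0,
}

def fits(target_fps):
    """Return (fits?, budget in ms, total work in ms) for a target frame rate."""
    budget = 1000.0 / target_fps
    total = sum(PASS_COST_MS.values())
    return total <= budget, budget, total

for fps in (30, 60):
    ok, budget, total = fits(fps)
    verdict = "fits" if ok else "over budget"
    print(f"{fps} FPS: {total:.1f} ms of work vs {budget:.1f} ms budget -> {verdict}")
```

Lowering the resolution shrinks most of those per-pass costs at once, which is exactly the lever developers pull when they want a richer scene at the same frame rate.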

All those elements contribute to the visual fidelity and to the spectacle offered by a game, even if they aren’t as easily represented by raw numbers. As a matter of fact, many of them influence the final beauty of a title more than resolution.

Lighting, for instance, is probably the single most important element in determining the final quality of a game’s visuals. A game could be rendered at 4K, have millions of polygons and run locked at 120 FPS, but a sub-par lighting engine would still make it look like absolute crap, with models that seem flat and artificial right in front of your eyes.


On the other hand, a developer that sacrifices a few thousand pixels to implement a more powerful and spectacular lighting engine will most probably produce a game that ultimately looks a lot better. Of course you can’t go too low in resolution before the visuals really start to degrade, but it’s a trade-off, and one of the most important tasks in development is striking the right balance.

Another easy example to envision is draw distance. In most last-generation games that have you traveling through exterior environments (and sometimes interiors too), you’ll notice that faraway objects are hidden from view by some sort of mist, called distance fog. Its purpose is precisely to limit the draw distance so that the hardware has fewer models to render and less work to do, keeping the frame rate decent and stable.
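The classic exponential fog curve (the same family of formulas old fixed-function graphics pipelines offered; the density and threshold values here are just illustrative picks) shows how quickly distant geometry fades out and can be skipped entirely:

```python
import math

# Exponential distance fog: factor 1.0 = fully visible, 0.0 = fully fogged.
# The density and cull threshold are illustrative, not from any real game.
def fog_factor(distance, density=0.005):
    return math.exp(-density * distance)

# Past some visibility threshold an object is effectively invisible,
# so the renderer can cull it and skip the work entirely.
CULL_THRESHOLD = 0.02

for d in (50, 200, 800, 1500):
    f = fog_factor(d)
    culled = " (culled)" if f < CULL_THRESHOLD else ""
    print(f"{d:5d} m: visibility {f:.3f}{culled}")
```

Pushing the fog farther out (a lower density) means more objects cross that threshold into visibility, and every one of them costs polygons, textures and memory.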

Looking at the footage of many next-generation games, that distance fog has been either removed or pushed much farther away. This should not be underestimated: those extra objects need to be rendered by the hardware, and they result in a much higher overall polygon count and many more textures that need to be stored in memory.

The more tangible effect is that the vistas we’ll be able to enjoy will be much bigger and more beautiful, letting us see much farther away, like we do in the real world. That’s most definitely not a small graphical improvement by any stretch of the imagination.


The same could be said about many other elements and effects that can’t be easily expressed in numbers like “1080p, 60 fps,” but are often more relevant than raw resolution and frame rate to the final result you’ll see on screen. And yes, developers will most likely sacrifice resolution and/or frame rate in order to implement elements and effects that, in the end, simply make the game look better, numbers be damned.

I quite often hear people say that a game is “not next gen!” when it doesn’t hit 1080p and 60 FPS, but “next gen” isn’t defined just by numbers. While the hardware resources provided by the upcoming PS4 and Xbox One can’t compete with a top-end gaming PC that will easily cost you three or more times as much (and why should they, really? You get what you pay for), they do provide a very steep increase in computational and rendering power compared to the PS3 and the Xbox 360.

At times all that raw power will go toward resolution and frame rate, but many times it will be invested in bigger worlds in which you can see much farther, better lighting that makes everything look more natural and “alive,” smarter enemies designed to challenge us much more effectively, and so forth.

1080p and 60 FPS look very good on paper, but we don’t play paper. We play often extremely complex creations called games, which are made of a lot more than simple numbers. Maybe it’s time to look at the whole picture instead, and to be excited again. I guarantee it’ll make the beginning of the next generation more enjoyable for us all.

Join the Discussion

  • Axe99

    Well said, everyone’s got a bit resolution/frame-rate happy of late. I think it’s less because of a fixation on the numbers, and more of a fixation on which next-gen machine has the graphics edge on the other (which I thought was a foregone conclusion by now – could we not move on?) Only a few more weeks and hopefully we’ll all be too busy playing next-gen games to obsess over this stuff ;).

  • Sheldon Prescott

    I don’t get why so many people are butthurt over EVERY GAME not being 1080p/60fps. Like this article says. 60fps is nice in fast paced, twitch reflex type games, but it’s not absolutely needed in ALL games. People were actually complaining that Dead Rising 3 isn’t 60fps -_-. Really? You need 60fps in a game like that? Isn’t the game doing enough as is?

    • foureyes oni

      ? i thought the last talk of fps for dead rising 3 was that it was struggling to get a solid 30fps. I do agree that the game doesn’t need 60fps.

      • Xbox Owner

        DR3 can handle 1000 zombies at a screen at a time, each one different looking!. Killzone Can handle 24 looking the exact same!

        • Kyoto Region

          You really don’t understand the concept of complex AI do you?

    • andrewskaterrr

      I have to have 60+fps no matter what game. It’s too blurry and laggy otherwise. 30fps feels like im in slow motion

  • Bozo Sapien

    Good luck trying to explain this to NeoGAF.

    • Giuseppe Nelva

      I’ll pass 😀

      • Guest

        And so you should, they (and neither are the people at N4G where this is posted) are not worth it.

  • da Boss

    I would trade 720p/60fps for 1080/30fps anyday. Most game are 30fps anyways, so i’m used to it, but i am absolutely tired of the sub-par HD resolutions. If we’re to have any prayer of 2160p gaming, we’ve got to raise the bar already. Is this next gen or what?

    • Giuseppe Nelva

      Next gen isn’t defined by numbers.

      • wombateer

        Next-gen will be defined by creativity.

        • TallestGargoyle

          Shooter, racing, shooter, remake, shooter, shooter, racing, sports, LIVE TV ON YOUR CONSOLE!!!

      • da Boss

        There needs to be standards, 720p is unacceptable.

    • Minecraft Greek

      720/60 is 55,296,000 pixels per second. 1080/30 is 62,208,000 pixels per second. Unfortunately 1080/30 is still 12.5% more demanding than 720/60. The game would have to be balanced to accept a 12.5% differential, meaning if they gave you the option between the two, they would have to make the 720/60 version use less system resources so the 1080/30 version could run smoothly.

      Interestingly enough 900/30 is 43,200,000 which is a great compromise of higher than 720 resolution and lower fps, 720/60 needs about 28% more resources than 900/30. That might explain the want to go 900/30 on a lot of games. It might just be that sweet spot of HD resolution and lower frame-rate to squeeze a lot more detail into a scene.

      900/60 however is 86,400,000 pixels per second. If games are actually going for 900/60 they are using 38% more resources than 1080/30. If they can actually hit 900/60, than it might be a good compromise too, better than 720 HD, but smooth framerates.

      It really all depends on what the devs are able to push firstly, and then what kinds of trade-offs they want to make to get there. I believe if 720/60 is likely, than 900/30 is even more likely as well. Any game running at 1080/60 is really going to be sacrificing something else visually.

      Forza is a good example. It’s a very simple stationary environment with limited objects moving in that space. I would bet that they targeted 1080/60 at all costs, but honestly, it’s a racing game and the scene is flying by you, it’s better to see it sharply than increase the poly counts on an object that you will see for 1/2 a second.

      Ryse is a good example. Running at 900P (30 or 60?) it is sacrificing some resolution to save power for more environmental effects, objects and lighting etc.

      In the end it is all up to the developer to set a bar, and decide what it will take to reach it.
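[Editor’s note: the pixel-throughput arithmetic in the comment above checks out. A quick sketch, assuming standard 16:9 dimensions and taking 900p as 1600×900, reproduces the figures:]

```python
# Pixels-per-second throughput for the resolution/frame-rate combos
# discussed above (16:9 dimensions assumed; 900p taken as 1600x900).
def pixels_per_second(width, height, fps):
    return width * height * fps

combos = {
    "720p/60": pixels_per_second(1280, 720, 60),    # 55,296,000
    "900p/30": pixels_per_second(1600, 900, 30),    # 43,200,000
    "1080p/30": pixels_per_second(1920, 1080, 30),  # 62,208,000
    "900p/60": pixels_per_second(1600, 900, 60),    # 86,400,000
}

for name, pps in combos.items():
    print(f"{name}: {pps:,} pixels/second")

# 1080p/30 pushes 12.5% more pixels than 720p/60:
print(combos["1080p/30"] / combos["720p/60"])  # 1.125
```

[Pixel throughput is only a crude proxy for rendering cost, since shading work doesn’t scale purely with pixel count, but it does make the trade-off space concrete.]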

  • Josh Hanes

    Sigh. 1080 > 720p it doesn’t matter how you try to coat it. 60fps or 30fps is an actual design choice, as proven by the range of 30=60 fps titles we have had for the last 2 decades. 720p over 1080p is just gimped.

    • Giuseppe Nelva

      Sorry but no. 720p with more and better effects, longer draw distance, better antialiasing, shadows and lighting looks better than 1080p without those perks any day. A slightly (due to the upscale) sharper picture does not outweigh all the rest.

      It’s all a matter of how resources are utilized.

      I’m afraid that your idea of “gimped” is oversimplified, like the word itself.

    • Minecraft Greek

      If you have to sacrifice detail in other places than your argument is wrong. If you have crappy lighing, shadows, reflections, particle effects, character models, texture resolution, shader effects, displacement maps, draw distances, polygon counts etc….than 1080P will be a sharper-looking crappy lighing, shadows, reflections, particle effects, character models, texture resolution, shader effects, displacement maps, draw distances, polygon counts etc.

      Merely drawing natively at a higher resolution doesn’t instantly make a game better.

      Would I rather have 1080P if it was at no cost to other parts of the graphics, yes, who wouldn’t. But since it’s up to the devs really how they utilize system resources, it seems devs are more satisfied with sacrificing a little resolution in favor of a better overall graphical experience.

    • Guest

      You have no clue what programming a game is like, just like most of the idiots at NeoGAF and N4G.

  • Slay

    Xbox will slay again this year.

    • Slay’sCommentHistory

      Great value? No.

      • DarkMaturus

        The value will get even sweeter soon. I feel it!

  • You are flat out wrong

    If most of your games aren’t 1080p straight out of the gate, it seriously compromises your ability to call your console “future-proofed,” and significantly compromises my confidence in putting time and money into a system that could become obsolete in a few years. This is why things like this are important.

    • willhe

      i dont think nobody ever asked them what their vision of the future was. everyone has a different vision of the future for gaming. future proofing would have been waiting to put in hdmi 2.0 for 4k

      • You are flat out wrong

        Well the future sure as hell isn’t wagglecam.

        • willhe

          sony thought it had enough legs to make a new one. stupid to think that all you think people will do with is waggle. i actually like the voice and recognition stuff and kinect fitness is far from waggling.

      • TallestGargoyle

        HDMI 2.0 wouldn’t be the only factor in 4K. Considering the graphics cards in both consoles are barely capable of 1080p 60 FPS, they would suffer IMMENSELY from even trying to handle anything at 4K. Video playback is simple enough, you don’t need super spectacular hardware to do that. But gaming? Even PCs without dual-graphics setups struggle with 4K gaming.

  • HalfBlackCanadian

    Great article.
    Every time I see resolution wars in comment sections or articles I think of my Genesis Aladdin and SNES Aladdin. Genesis absolutely had the weaker hardware but arguably the better looking game. SNES killed it in the sound department though. Game play is a taste thing (my wife swears by the SNES version… we agree to disagree)

    • willhe

      when you hear talks like this its deceiving. nintendo had a choke hold on the industry. only broken by sony because it shifted to cd format. same with ps2. the fanbase was well established. most people stick with one console until they stop getting support

  • ps3

    Ps4 will have lots of 1080p 60cps games, xbox one will have more 720p 30 fps games and only a few 1080p 60fps games. I seen ps4 game videos at true 1080p native hd at 60fps, looks insane. I cant wait to play ps5 in 2020 with 4k uhd games.

    • andrewskaterrr

      So misled. Oh btw, if you wanna see “insane” graphics, come to the pc side, we have lots of them.

      • RoadShow

        Like what????? Crisis is about the only game that has ever shown PC advantage. Other than that it’s pretty minor, at least for the first couple years of a new console launch.

        • andrewskaterrr

          Are you serious? Just about every multiplat looks better on pc. Do you not remember BF3? That was the “can it run?” game until Crysis 3 came out 2 years later. BF4, Metro games, Arkham games, Hitman, heck even Skyrim (even though it’s not hard to run) have all been benchmark games that blow the console out of the water. You can go back even further. Call of Duty games, Valve games, Star Wars Battlefronts, seriously there’s only a couple of games that were terrible ports like GTA4 but they still looked better than the console.
          Even the new consoles can’t touch those “last gen” games on pc (consoles will never stand up to pc hardware obviously). Maybe you just haven’t seen a high end pc, I run 3570k @4.4ghz and SLI 670s with triple screen and my main monitor is 144hz. Consoles’ 30 and 60fps doesn’t cut it for me. You can do a side by side or even not a side by side and see a big difference in graphics between maxed pc vs this gen consoles. And now that the console ceiling has been lifted pc games will get even better.

          • RoadShow

            Wow you got problems kid.


            No, it was released in 2011, 6 years after 360 and 5 years after ps3 launch.

            Even Battlefield 4 is a bad example since that game was developed before final hardware specs were confirmed. Also they haven’t been able to spend much time with the new hardware and get used to it and draw all it’s power.

            HOWEVER BF4 still came up at 900p, 60fps on single player and 64 player online. Equiv. to a mid range gaming PC.

            Remember kid, BF3 on PS3/360 ran on only 500mb of ram.

          • andrewskaterrr

            LOL you obviously don’t know much about hardware if you think they “haven’t learned the hardware” yet. That was last gen. The PS4 is literally an AMD APU, it’s a mid range pc. They’ve maxed out the PS4 already, there’s no more power to gain.
            PS4 drops to the 40s in BF4, it’s not solid 60fps, it’s playable but don’t think it’s strong enough to pull solid 60.
            Ya I know it ran on 512mb shared between system and video.

          • RoadShow

            Whatever dude. I get far more out of my console than I do my gaming PC these days. Any more my gaming PC simply compliments my consoles as my home media server and the rare PC game I care about.

          • andrewskaterrr

            See I’m the opposite. I never touch my 360 and PS3 anymore. I get all my games for pc. My friend let me borrow The Last Of Us and I can barely play it because of the input lag and low fps. I also feel trapped when I’m on it. I have to use my main screen for console then have my 2 other screens showing my pc windows and chats so I’m still connected to the world. Also the only console games I care about are Wii U games, and that’s why I bought one, and it’s the only one I’m buying. Great party games like Super Smash Bros, Mario 3D World, Mario Kart, Donkey Kong TF. Although pc has Broforce with is my #1 party game.

          • RoadShow

            I’m going to get a WiiU as well just for a handful of games. Once the price drops and or storage size increases. Kirby, Mario, Zelda & Metroid.

            Love my PS4, very fast. I do find it odd you didn’t like the last of us but to each his own I suppose.

          • andrewskaterrr

            I have the 4Gb model. I bought a 16GB usb stick that’s super tiny and popped it in. Didn’t even need it. Games saves and demos are the only things that save to it. I just use the internal and back up to the usb every once in a while. Oh and you can use 2 usb storage devices at the same time and the max size is 2TB but I don’t know if that’s just per drive or total. Just get the $250 Wii U.

          • RoadShow

            Yeah I’m still going to wait until Black Friday. Just spent $520 on my PS4.

            And then it will be about time to upgrade my graphics card.

          • andrewskaterrr

            How did you spend $520 on a PS4? accessories?

          • Raven Nafariea

            When I bought mine I spent an extra hundred on a new HD and swapped it out for a 1TB Hybrid SSD Drive. Then bought an extra controller so I spent a little over 620.00. It sure can add up with accessories and upgrading that is for sure. lol

          • andrewskaterrr

            Ya, when I bought my Wii U, I forgot how much consoles cost. I was spending $50-60 on games and buying extra controllers. I haven’t done that in forever.

          • Raven Nafariea

            Yeah, it can be expensive that is for sure. If you are a PC gamer than even more expensive. 620.00 is a drop in the bucket compared to PC gaming. lol

          • andrewskaterrr

            Eh depends. I get all my games on sale, even release day I get $15-20% off. Buuuut I have chosen to spend over $2000 on my pc, monitors, and games, but I have over 230 games. I bought a second GTX 670 and a water cooler instead of buying a PS4 or Wii U, then I went back and bought the Wii U anyways lol

          • Raven Nafariea

            Wii U over a PS4? … Okays =P I just built a new PC a few months back… 3,000 bucks! I went all out when I didn’t even need to. lol That was right after I bought my PS4 also. I thought I would be gaming on the PC way more but so far have been gaming on the PS4 almost non-stop. I dunno what I was thinking. lol

          • andrewskaterrr

            If you spent 3k then you better have a 4770k, sli 780tis, 32gbs of ram, gold or platinum psu, and custom water loop.
            And yes, the only games I care for on PS are Killzone (heard 4 was a flop) Uncharted (not out yet), and Resistance (game is dead). So 2 games isn’t worth $400+. All the games I want come out on pc and Wii U

          • Raven Nafariea

            i7 4820k, 64 GB’s of DDR3 2133 Ram – EVGA 760 FTW 4-GB GPU – 500GB SSD Main Hard Drive. I also have 5 – 2TB HD’s installed and bought a new Cooler Master gaming case to accommodate the larger sized Motherboard – EVGA Dark x79, new fans, new Blu-ray drive/burner, 1000 WT Corsair power supply. Build it one day, outdated the next… lol And, okay makes sense than. =)

          • andrewskaterrr

            I was thinking about the 4820k myself today but I’m gonna just gonna wait for broadwell. and 64gbs is so overkill lol

          • Raven Nafariea

            I highly recommend the 4820k because you can overclock the hell out of it very easily but if I were you i would wait also, so good call. 64 is a big but thought why the hell not and maxed it out. To be honest I will never use it all but it is nice to know I have it. lol

          • andrewskaterrr

            I was saw the benches and the 4820k is ivy-e and i already have ivy so I don’t want to stay in the same socket. when compared withe the 4770k it goes either way with performance. i do like 2011 cause the lids are soldered so better heat transfer. but i think im gonna go broadwell or x99

          • RoadShow

            PS4, Killzone & Battlefield 4!

            Heck I still need to come up with money for Battlefield Premium. Feel like a bum but have just had other priorities since I spoiled the hell out of myself.

          • andrewskaterrr

            I don’t like BF4. I kinda regret buying Premium. BF3 is better.

          • RoadShow

            Did you play 3 on PC? At first I didn’t
            like BF4 on PS4. I used to play Battlefield 2 on PC (not bad company) and hated
            64 players. It’s just that I’m in it for the vehicles and 64 players you are
            literally walking over each other and spend the whole map trying to get
            vehicles. I personally prefer 32 players. I haven’t had a big squad yet on BF4
            so that would help with 64 players but even then I would never get to have a

            BF4 is great now
            though. I certainly miss some things about 3 and I do still hate that the
            weapons have to cycle again especially in a 64 player map. Freaking over there
            killing tons of people, weapons cycle and you have 6 dudes on you from all
            direction. I’m all about my kill to death and it went from 8/1 in BF3 to 3/1 in
            BF4 (overall). I love the evolution though and the boats and being able to
            capture video is a lot more fun than I thought it would be.

          • andrewskaterrr

            I played over 250hrs on xbox, then built my pc and i’ve got 229hrs on pc. I like 64 player but 32 is more stealthy cause they’re less people. In Bf4 i would get spawn killed constantly and my k/d was through the floor. I’d but an entire clip into someone then they’d turn around and kill my in 2 bullets, and other times they would die. Didn’t make any sense and I always join servers that are under 60 ping

          • RoadShow

            On PS4 I haven’t had too many issues. I waited until February to even bother with it though as I knew it would be a buggy mess at launch. Even then I did experience some servers that stuttered horribly. That’s probably the biggest issue I had but it’s been long gone now. I’ve questioned hit detection a few times.

          • andrewskaterrr

            I know BF4 has bad input lag, around 74ms compared to crysis 3’s 56ms and counter strikes 23ms. I think this and server lag may have something to do with it

    • Kreten

      That’s funy because theres not a single PS4 game that’s 1080p60 lol KillZone multiplayer is 1080p 45fps

      • RoadShow

        LOL don’t you feel stupid now.

  • PrinceHeir

    for me i just want 60FPS so that i don’t see any noticeable slowdowns and such.

    don’t really care for 1080p, most of 1080p benefits are for movies and anime BD’s and such.

    • andrewskaterrr

      What? 1080p means less jaggies better smoothness, more quality. Makes it look more real.

      • Kreten

        Depends if your tv is 720p and how big your tv is and how dense the pixels are etc etc. I’ll bet you that even 1080p will look better on my iMac retina than your regular led/lcd monitor. Its not just numbers. As we see from BF4 720p vs 900p and better scaler engine vs worse one produce equal results with one looking better in some areas and the other better in others

        • andrewskaterrr

          Eh idk. Using the same res as your content is the best, unless you’re using upscaling or AA.

          • Kreten

            I know I’m just saying about consoles as they display all the games to 1080p so it’s either native or upscaled. But if I told you to pick 900p60 fps or 1080p 30fps your first question should be what kind of game. Because fast paced shooters could benefit more from 30 extra fps than having less stretched out resolution. But even 1080p vs 1080p will vary from TV to TV. I hope that no one has Westinghouse or Dynex tv and trying to compare resolution vs person with Panasonic/Samsung/LG LOL

            I will always pick 720p on my Panasonic/LG than have 1080p on Dynex/Westinghouse 😀

          • andrewskaterrr

            I would take 60fps no matter what. 30 is a slideshow.
            You’re taking into account the display’s actual light output: color, contrast, brightness, lumens. This is about media quality. Resolution is resolution. a 1080p pany vs 1080p dynex will have the same pixel count, Now the Pany will look better when it comes to colors because it’s a better panel but that has nothing to do with the source material’s resolution. Oh btw samsung panels are nasty, overly saturated and contrasted. Apex beats them.

  • spid3r6

    An thus is the reason why Ryse looks better than Killzone…By the way the pretty picture at E3 is not longer pretty on the actual PS4.

  • Antonio C. R. Murray

    “We cannot promise that the Wii U will never be excluded from multi-platform software for eternity, but we can at least assure you that the Wii U will not have such a big difference as the Wii had in comparison to how, on other platforms, developers could expect very different graphic capabilities of generating HD-applicable high-resolution graphics” – Satoru Iwata / 2012

    I said it before when that statement was made. If these “next gen” (bullshit term since it was created by Sony at E3 1995 to differentiate PlayStation from already existing 5th gen platforms 3DO and Amiga 32) systems from Sony and/or Microsoft don’t at least do 4K, then there ain’t really nothing to talk about. Every GPU from Nvdia and AMD post E3 2012 is capable of 4K. This is the result of a Kutaragi-less Sony. The iconic term “quantum leap in technology” even isn’t used anymore.

    • andrewskaterrr

      They’re capable of 4K yes, but have you seen the frame rates? They’re garbage. There’s no need for 4K on these consoles because 1. they aren’t strong enough 2. Most of the dumb market can’t tell 720 from 1080. Over 1080 is pointless for the consumer market. Even years down the road 1080p with AA will be just perfect for people. I have a gtx 670 and i’m scared of moving up from my 1080p to 1440p because my card will take a dump up there.

  • Steve Denninger

    Fanboys think 1080p is king. But i’d rather have 60fps for a game that needs it instead of 30fps. Most of them are just repeating what they hear on those trash forums and don’t understand you can have a better looking game at 720p than 1080p. It’s all about how they balance the game. 1080p doesnt mean you’re getting the best visuals.

  • Minecraft Greek

    Until graphical capabilities can render a 480P scene that rivals real video footage, any larger resolution will always be a graphical compromise. I’m sure developers are finding a sweet spot to get their game to look how they desire. Somewhere between 720P and 900P may be it…..or we might see sub 720P games, and they may look even more spectacular.

    If they can really push the realism in a scene and create a better picture at a lesser resolution, I’m all for it. No matter what, you could always push a games graphics to a point where you have to compromise on something. Even if next gen consoles were all 10TF machines, someone could still say they wanted more this or that and have to sacrifice resolution, draw distance, polygon counts, shaders, texture resolution, lighting, framerate…etc.

    • SteveS

      absolutely correct, dude…I am sorry that the gaf people aren´t that smart. They just don´t want to understand. Hopefully there are guys like you, intelligent enough to get these things

  • RoadShow

    People need to stop freaking out about frames per second and resolution. These are the 1st games of a new console generation who’s games were created before final hardware specs were known.

    (1) Don’t judge a game until it releases (2) Just realize that developers 2nd, 3rd and continuing games will do nothing but get better and better all generation which include better frames per second and better resolution. 1080p & 60fps will be easier and easier to achieve.

    • TallestGargoyle

      Final specs would have been finalised a looooong time before launch. Just because the consumers don’t know the final specs, doesn’t mean the developers don’t as well.

      • RoadShow

        Actually no. The dev kits started at 2gb, then were 4gb and in the end at the very last minute Mark gathered all his data and sold Sony on going 8gb GDDR5. Look it up, it’s a good read.

  • sachz

    In 2016 many games will be happily running 1080p 60fps on the ps4/xbone and ur article will be redundant and just plain wrong…. its early days of next gen, the early days of ps3 were crap compared to 2013 ps3 games quality. look forward and think outside of the box.

    • andrewskaterrr

      that will only happen if devs get smart and turn down other effects like shaders, shadows, textures, ambient occlusion, etc.

  • TalesOfBS

    Maybe this is why nintendo is sticking to 720p even in non resource intensive games?

  • jeff

    720p upscaled looks like garbage, no matter how much AA get’s thrown on top. I don’t care how many dust particles or light rays or whatever is going on, I just don’t wanna see jaggies on top of scaling artifacts on top of more jaggies for the next 10 years just because Microsoft decided kinekt was more important than competent modern hardware.

    • Kreten

      The point that you don’t get is that even if you give Devs a 10TF machine you will still have 720p games just because 9/10 prefer AAA over framerate.

  • Kreten

    I gotta say that this is one of very few articles that actually made sense and is true in past few months. Considering that 99% of all gaming articles originate from some GAF made up story. I often point people to Super Mario and ask them if they think that it’s better looking game than KZ or Ryse because it features full 1080p60 lol

    Good Job!

    • TallestGargoyle

      No but it plays nicer as a result. Responsive controls and visible dangers are a great boon.

      And to be honest, I’d rather the shiny, smooth, stylised graphics of the Mushroom Kingdom over grungy ‘photorealistic’ trash any day.

  • Travis Touchdown

    Can’t we all just agree that the PC is perfect for modders and people interested in development, and that the Wii U and the 3DS are the true gaming Master Race?

    • TallestGargoyle

      I would if the Wii U had any major games. It’s taken them well over a year to even start releasing anything substantial, and most of that is just teaser stuff in the Nintendo Directs.

  • Corey

    I agree. However…..something that can do all those things…..and output at 1080p60 frames is clearly better. In this case the PS4 seems to most be quite a bit better. That being said…I’m still getting an Xbox One. Can’t stand those Asian controllers.