
Sucker Punch Seeking More Ways To Use PS4's RAM; CPU a Bottleneck but There's Room for Improvement

by on April 14, 2014 7:06 PM

At the Game Developers Conference, Sucker Punch Lead Engine Programmer Adam Bentley presented a session titled "inFAMOUS: Second Son Engine Postmortem," and the notes in the newly published version of the slides include some very interesting details about how the PS4's hardware works and how it was used for the game.

  • The game has very large draw call and object counts, but that wasn't a problem.
  • Having enough memory is “incredibly helpful” to make working on the game faster and simpler.
  • The team came “close to the limit” of memory usage, but there are hundreds of MB of leeway left. They’re still coming up with useful ways to use the memory available on the console.

[Slide image: iNFAMOUS_02]

  • The jobs are much more complex than they were in the previous inFAMOUS games.
  • The PS4's CPU worked pretty well, but it's still one of the main bottlenecks. It's also harder to optimize.
  • There's room for improvement in the job system using the CPU's multiple threads, and in other areas.
  • Material properties are stored in up to 8 G-buffers (5-6 plus depth/stencil), with 41 bytes written per pixel. That translates to 85 MB for full-screen buffers (a quick arithmetic check follows this list). Bentley's comment is that it's "good that the PS4 has a huge amount of fast RAM."
  • Seattle is often overcast, so indirect diffuse lighting was very important, and ended up a bigger deal visually than initially thought. Lightmaps were considered, but discarded because they would have taken 12 MB per block just for the UVs, without even considering textures.
  • Seattle is also often wet, so indirect specular lighting was just as important. Specular probes are 256×256 pixels and there are up to 256 of them at any given time. That’s roughly 175 MB of RAM used.
  • Compute-shader-driven particles can have over 15 MB of data loaded in memory.
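
The memory figures above can be sanity-checked with some quick arithmetic. The sketch below (plain C, purely illustrative) assumes a 1920×1080 render target, which the notes don't state explicitly, and simply multiplies out the quoted per-pixel and per-probe numbers.

    #include <stdio.h>

    int main(void) {
        /* Assumption: full-screen buffers at 1920x1080 (resolution not stated in the notes). */
        const long width  = 1920;
        const long height = 1080;

        /* Quoted figure: 41 bytes written per pixel across the G-buffers. */
        const long bytes_per_pixel = 41;
        long gbuffer_bytes = width * height * bytes_per_pixel;
        printf("G-buffers: %ld bytes (~%.1f MB)\n", gbuffer_bytes, gbuffer_bytes / 1e6);
        /* -> 85,017,600 bytes, i.e. the ~85 MB quoted (with MB read as 10^6 bytes). */

        /* Quoted figures: up to 256 specular probes of 256x256 pixels, ~175 MB total.
           The per-texel cost isn't given, so derive what the quoted total implies. */
        const long probes = 256, probe_dim = 256;
        double implied_bytes_per_texel = 175e6 / (double)(probes * probe_dim * probe_dim);
        printf("Implied probe cost: ~%.1f bytes per texel\n", implied_bytes_per_texel);
        /* ~10.4 bytes per texel, suggesting each probe is more than a single flat
           RGBA texture (e.g. multiple faces and/or mip levels). */

        return 0;
    }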

Considering the amazing results achieved with inFAMOUS: Second Son, we're left wondering what the studio will be able to do once they improve CPU and memory usage further. We're near the beginning of the generation, and things are definitely looking good.

 

Join the Discussion

  • Jack Joyce

    Thank god for DirectX 12 and HOPEFULLY OpenGL getting multi-core threading, because we need to eliminate the CPU bottleneck across all platforms.

    • Gannicus

      OpenGL will already be multithreaded, and all the stuff DX12 will be doing, OpenGL already does. OpenGL is open source, which means it can move along a lot faster than DX, cus that's purely down to MS making the effort. Which is why OpenGL is substantially faster than DX11

      • Jack Joyce

        Open GL doesn’t do Multi Core threading. YET
        Otherwise there wouldn’t be any CPU bottlenecks for devs. It’s still an issue and will be until open GL gets the update.
        And DX11 is 6 years old.

        • Gannicus

          well that doesn’t make any sense, surly an open standard would move a lot faster than DX? unless there waiting to copy and paste the code into it

          • Vious

            copy and paste. if after DX12 is released and we see openGL with multicore perf increase then surely we can correlate that data

          • Gannicus

            I don’t think it matters that much tbh, I think infamous was getting let down not by the gpu side but cpu processing of AI, etc xtreme derp (major fanboy) says that openworld is more stressful on the cpu so even if opengl isn’t as multithreaded the game engine will be, so it can spread AI and stuff loads across multiple cores so cpu stilla bottleneck. I don’t think watch dogs on either console will have as a high a population density in that case as lots of AI going on but we’ll see

          • Vious

            the only game I’ve seen with a ton load of AI is dead rising 3 and it’s open world (real open world = as in you can basically go in any building and run around in it) with literally everything interactive.
            that’s a launch game on the xbox one.
            -
            that being said, if they can do it in dead rising then i’m not sure why they wouldn’t be able to do it in watch dogs….
            anyway, it’s an interesting game, I don’t think i’d play it for the story as much as i’d play it to hack in other people’s game and mess around….

          • Gannicus

            maybe there are a lot of other factors stressing the system, and AI may be more complex in Watch Dogs, as mindless zombies won't take much AI at all. getting into other ppl's games seems awesome

          • Landsharkk

            “Mindless Zombies” are being run by AI. When you code an NPC, they aren’t “mindless zombies” by default, you have to add that AI into the game.

            What do NPC’s do in Watch Dogs? Walk around aimless and interact with other NPC’s/the main character, right?

            Well, that’s the same thing the Zombie/NPC’s do in Dead Rising 3.

          • Gannicus

            well if the AI isn't that complex and it's bogging down the CPU then they've chosen the wrong CPU to put in. I think the 4+4 core design of Sony's CPU is coming back to bite them in the @ss. they say it's really fast… within the 4-core blocks, yeah, but if you send data to the other 4-core block it's very slow

        • Guy Brohski

          DX11 isn’t 6 years old, lol. The first DX11 GPUs were released in LATE 2009. It’s not even 5 years old yet.

        • Stranger On The Road

          There is more to the game than DX and OpenGL draw calls; the CPU would be a bottleneck for many other reasons. If DX were the only thing overloading the CPU then we wouldn't need to upgrade our CPUs for non-gaming tasks.

      • Stranger On The Road

        Small correction: OpenGL is not open source, but an industry standard; this is why Mesa is called Mesa and not OpenGL; they have to pay for.. something [can't remember it now] in order to become an official OpenGL implementation.

        • Gannicus

          ah ok fair enough but opengl should still move faster than DX though implementation wise

          • mars mayflower

            faster != better

      • Ps4 sucks

        Direct x is so much better than open GL u need to stop acting like a fan boy

        • Gannicus

          DX is not good actually, you need to stop being a muppet. It's good for ease of use, but to get the most out of a GPU in a high end game it is not good. why do ya think lots of developers asked AMD to make Mantle?

        • Hugo Stiglitz

          LOL is that why games have been proven to run better on Open GL than DX?

          Google it

          • Failz

            Is that why DX is the most used API in the world?

    • Xtreme Derp

      OpenGL and PS4′s API already have pretty good CPU multi-threading support.

      http://blogs.nvidia.com/blog/2014/03/20/opengl-gdc2014

  • Gannicus

    I thought there wasn’t any bottlenecks in the PS system? maybe the GDDR5 is causing latency issues with the CPU and the 256MB flash ram is only enough to handle OS calls

    • Giuseppe Nelva

      Every system has bottlenecks, big or small. And you have it the other way around.

      • Gannicus

        the cpu isn’t fast enough to cope with the requests from the GPU or someat?

        • Bankai

          There will likely never be a piece of tech that has zero bottlenecks.

          • Gannicus

            yeah, that is true, but when everyone claims it has no bottlenecks and then it turns out that's a lie, then other stuff should be questioned too, in my opinion. loosely like the police: you give a statement as fact, part of that statement turns out to be a lie, then you have to question the rest really

          • Jeff Pee

            I think some of those statements could be attributed to very happy devs who spent years working on the PS3/360… which when you compare the PS4/X-1 to those systems are pretty bottleneck free.

            But who knows, I’m not a dev. Also, are any of these early games even using multiple threads?

          • Gannicus

            yeah, that is true, it does seem to free up a lot, until they want to push it further than it can actually go

          • cozomel

            Dude, seriously, you're doing it again. You're acting stupid. No one ever said the PS4 doesn't have any bottlenecks, just fewer of the usual ones. And the "devs" you're talking about meant they didn't run into many, if any, bottlenecks in their games, which were almost all smaller-scale games; go ask DICE if it's got any bottlenecks and they'll tell you. But you're being stupid not understanding. The CPU is a bottleneck for both consoles. Plus MS has a whole lot more bottlenecks to worry about (like the weaker GPU and worse memory). Now get a grip.

          • Gannicus

            well then articles shouldn't go spreading misinformation then, cus everyone will think there are no bottlenecks whatsoever when there clearly are, so yeah, that's all I'm saying

          • NeoTechni

            No one claims it has zero. Just a lot less than x1

          • Gannicus

            they did actually. when you say it has no bottlenecks, that means 0… if you say it has very few bottlenecks, it opens it up to being a few etc

        • Arnold Stallone

          Nobody can deny these 1.6GHz apus aren’t the fastest chips ever made. Of course, speed isn’t everything and each manufacturer has some special custom specifications. A simple 3770k at 3.9GHz would have been great.

          I blame people and gamers for these low specs next gen consoles, even if games are and will get better with time.

          With all the people whining about prices, they were forced to make cheap consoles, to please everybody. Also, competition just killed my graphics.
          If there wasn’t Microsoft, Sony would have chosen a much much faster console/architecture/ components, and at the end, 500,600,700$, people would still get one.
          Hey, people buy new iPhones, iPads, Samsung S4s, Xperias, etc, and are OK to pay $600 every 9-12 months. Competition doesn't care about saving, and each one uses the best, fastest components they have, and people still buy them.

          If Sony was alone, maybe with just Nintendo, they would create a super computer, not as complicated as a Cell, but at least they would have used 3GHz+ processors and high-end graphics chips. They would sell them for $600 or more, and maybe more PC gamers would join the party, because graphics would be 4-5 times better than a Killzone Shadow Fall, for example.

          But because of the ‘ we must be cheaper than X’, and all the whining boys that would like to buy a ps4 or Xbox one for 49$ only, when they are OK to spend 2000$ on a PC, Sony and Xbox had to make huge compromises. And there we have, these ultra powerful 1.6 GHz toys.

          I would happily save for 12 months and buy a $700 PS4 with 10 times the actual processing power, rather than have it for $399 and see games and devs that must make big critical decisions on what to do with visuals, lighting, etc.

          And these guys whining and talking about a global crisis, are the ones who will pay 700$ for the iPhone 5s, because their iPhone is ‘too slow and old’.

          I wish Sony had made 2 versions of the ps4, one at 399$, and another, at 799$, with a titan-like chip, 16gb of gddr5 and a 4ghz CPU. Games would run like they do now, on the cheap version, and those who could afford to buy the 799$ model, they would be able to play the game with permanent 60fps on all games, 1080p at ultra settings, with some crazy anti aliasing msaa8x, and perfect crisp textures.

          Damn, I am sure +70% of buyers would get the expensive version.
          Just like they buy the best and fastest smartphone or tablet, they would get the best ps4 model.

          I’m a pro Sony, and I don’t like what Microsoft has done to the gaming industry( almost nothing excepting halo, cod exclusive content , gears, and stupid anti consumer policies, anti-indie shiity policies like releasing on all platforms at the same time, asking for parity, paying off reviewers, websites, everybody and his dog). But if Sony starts making fps games only, online only shiit, like elder scrolls online, etc, and less and less games are playable locally, or are full of micro transactions craap, etc, I think I will move to PC, in 2015, whenever I can have a 4k LCD, and can connect a computer to it, and play games with a gamepad.

          Please, Sony, start making tons of great single player games. I don't care about f2p shiit. I want a great single player story and game, I don't want my games to be 90% online gaming only.

          I just wish there was the level of optimization on the PC side like there is on consoles. And where are the PC games that use 10-15 GB of RAM, full DirectX 11, perfect multi-threaded processing, etc, at ultra?
          Damn, so much power available on computers, and games barely use 3 or 4 GB of RAM. Isn't it a shame?
          It looks like Nvidia, Intel, etc, are holding power back, locking raw power away with their drivers, so the only solution for better graphics is upgrading your graphics card. When, via some good drivers, they could give gamers the same gains in power as replacing the hardware (Nvidia's response to counter Mantle: why didn't they do it before? Liars, thieves, really).

          1.6ghz… c’mon…

          • Vious

            so in all that you just posted, you're blaming Microsoft for Sony putting a 1.6GHz processor in its system?
            Microsoft has theirs running at 1.75GHz (why didn't Sony?)
            the Xbox runs pretty cool too, I'm sure they could up it to 2GHz and still be in a thermal window the cooling system can work fine with.
            -
            I seriously can't believe you're blaming competition for Sony's choice of hardware.

          • Gannicus

            look at my post above or below, it says it all, some ppl haven't got a clue

          • Michael Norris

            Sony’s Cpu is actually slightly better,it’s ether clocked higher or Xbone has more overhead.

          • Vious

            ok, before you try to correct someone in a fanboy fashion, at least try to know what you're talking about.
            here are some fun facts you can go check up yourself:
            fact: ps4 cpu clocked 1.6Ghz
            fact: xO cpu clocked 1.75Ghz
            fact: ps4 cpu does 16 ops/cycle
            fact: xO cpu does 48 ops/cycle
            -
            fact: DX12 will introduce more efficient cpu utilization
            rumor: openGL will get the same.

          • NeoTechni

            You made up some of that

          • Vious

            which part did I make up?
            tell me?

          • NeoTechni

            fact: ps4 cpu does 16 ops/cycle
            fact: xO cpu does 48 ops/cycle

          • NeoTechni

            You made 2 claims. Back both please.

            http://m.neogaf.com/showthread.php?t=737629&page=1

            according to benchmarks published on GamingBolt the PS4 CPU is faster than the Xbox One

            Both systems have 8 core Jaguar based CPUs from AMD, but prior to release Microsoft was promoting the fact that their CPU was clocked higher at 1.75Ghz. The benchmarks here imply the PS4 CPU is actually running at 2Ghz in order to produce 14 MB/s versus 12 MB/s for the Xbox One. The other possibility is that these figures are for the whole CPU and not a single core as labelled. In that case it would imply PS4 is using 7 cores at 1.75Ghz versus 6 cores at the same frequency.

            Since this tech is purely algorithmic on CPU and not bound by bandwidth we can't look to the PS4's GPU or GDDR5 to explain the difference. The only logical conclusion is the PS4 has a faster CPU, despite Microsoft's protestations to the contrary. Sony has never officially disclosed the PS4's CPU clockspeed.

          • Vious

            the benchmark was done on PS4 dev kits, which were unlocked up to 2+GHz
            here Sony stated the PS4 is running at 1.6GHz https://plus.google.com/+sonyuk/posts/eiA6sDQvWwQ
            -
            searching for the video, Mark Cerny said there was little to no change to the Jaguar cores because AMD did a great job, which is to say the CPU is a regular 8-core Jaguar. Jaguar cores do 2 operations per cycle.

          • NeoTechni

            “the benchmark done on sp4 dev kit which were unlocked up to 2+ghz”

            Yes, and dev kits are always slower than the retail unit, not faster

          • Vious

            traditionally dev kits are actually faster, because they have to account for the extra activity going on when getting "telemetric" data from games running with extra amounts of debug code.

          • NeoTechni

            No, traditionally they are slower cause they’re still developing the retail console.

            http://www.eurogamer.net/articles/digitalfoundry-2014-secret-developers-wii-u-the-inside-story

          • Vious

            and where in that article about the wii u are you trying to point me to?

          • NeoTechni

            The dev talks about how the kits got more and more powerful the closer it got to launch, eventually catching up with the retail specs.

            But the whole article is worth a read.

          • Xtreme Derp

            You’re almost completely wrong. As usual. Xbox does NOT have more ops/cycle, and it most certainly doesn’t run 3 times as fast.

            Digital Foundry: Is it essentially the Jaguar IP as is? Or did you customise it?

            Nick Baker: There had not been a two-cluster Jaguar configuration before Xbox One so there were things that had to be done in order to make that work. We wanted higher coherency between the GPU and the CPU so that was something that needed to be done, that touched a lot of the fabric around the CPU and then looking at how the Jaguar core implemented virtualisation, doing some tweaks there – but nothing fundamental to the ISA or adding instructions or anything like that.

            Not all of those "6" ops are FP.

            Durango CPU cores have dual x64 instruction decoders, so they can decode two instructions per cycle. On average, an x86 instruction is converted to 1.7 micro-operations, and many common x64 instructions are converted to 1 micro-operation. In the right conditions, the processor can simultaneously issue six micro-operations: a load, a store, two ALU, and two vector floating point. The core has corresponding pipelines: two identical 64-bit ALU pipelines, two 128-bit vector float pipelines (one with float multiply, one with float add), one load pipeline, and one store pipeline. A core can retire 2 micro-operations a cycle.

            Out of those six micro operations, 2 are floating point calculations (1 mult + 1 add)

            OpenGL and PS4′s API already have pretty good CPU multi-threading support. DirectX 12 is catching up not getting ahead.

            http://blogs.nvidia.com/blog/2014/03/20/opengl-gdc2014

            PS4′s CPU performs better in benchmarks and according to devs, despite (apparently) being clocked slightly lower. It’s probably 1.6GHz but nobody but a PS4 “marketing firm” ever stated it, which isn’t exactly the best source.

            http://gamingbolt.com/substance-engine-increases-ps4-xbox-one-texture-generation-speed-to-14-mbs-12-mbs-respectively
            http://www.neogaf.com/forum/showpost.php?p=94264594&postcount=50
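
            A rough worked number from the pipeline description quoted above (an illustration, not an official spec sheet: the 1.6GHz figure is the commonly reported one, and peak issue rate is never sustained). Two 128-bit float pipes per core means up to 4 multiplies plus 4 adds issued per cycle, so both 8-core clusters land in the same low-hundreds-of-GFLOPS ballpark, separated only by a sub-10% clock difference, nothing like a 3x ops/cycle gap.

            #include <stdio.h>

            int main(void) {
                /* From the quoted description: per core, two 128-bit vector float pipes
                   (one multiply, one add) -> up to 8 single-precision FLOPs issued per cycle.
                   Clocks assumed: 1.6 GHz (commonly reported PS4) and 1.75 GHz (stated Xbox One). */
                const double flops_per_cycle_per_core = 8.0; /* 4-wide mul + 4-wide add */
                const int cores = 8;

                double ps4_peak = flops_per_cycle_per_core * cores * 1.6;  /* GFLOPS */
                double xbo_peak = flops_per_cycle_per_core * cores * 1.75; /* GFLOPS */
                printf("Rough CPU FP peak: PS4 ~%.0f GFLOPS, Xbox One ~%.0f GFLOPS\n",
                       ps4_peak, xbo_peak);
                /* ~102 vs ~112 GFLOPS: roughly a 9 percent clock difference, not a 3x ops/cycle gap. */
                return 0;
            }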

          • NeoTechni

            The PS4's is underclocked from 2.something according to their FCC filing. They can undo that at any time, just like they did for the PSP.

          • Vious

            they can at any time, you say? my friend, that's wishful thinking. you should take it at what it is, 1.6GHz. if they do upclock it later then so be it.
            but having an underclocked CPU isn't helping devs and it's certainly not making games better.

          • NeoTechni

            “they can at anytime you say?”

            That’s what Sony said to the FCC, as they have to report operating frequencies

            “my friend that’s wishful thinking.”

            It’s exactly what they did with PSP as well.

            “but having an underclocked CPU isn’t helping devs and it’s certainly not making games better.”

            Then explain why they did it for PSP.

          • Vious

            as I said, we’ll see how that terns out in the future.

          • Arnold Stallone

            With the ps3, Sony ignored the competition, and they released a super powerful system(back in 2006 few PCs could top it), because they didn’t care about saving 1$ here and 2$ there.even at 600$(I paid like 1000$ with my currency), it still sold really well, and still does today.

            With the ps4, they did care about Microsoft, because they knew that a ps4 100 or 200$ more expensive than the Xbox 1, would be a suicide.so both Sony and Microsoft tried to be the cheapest one.
            And how to do it? By using cheap graphic chips and a good price/performance apu.but slow.
            One of my computers is a small fan less micro itx motherboard with a quad 1.6 GHz CPU, that I built and use with a 23″ touch monitor at full HD, as a jukebox. Windows 7 runs super slowly on it, even with 4 GB of ram.

            1.6, or 1.8ghz, that is really slow.

            Along with competition, there were all the people whining, because they didn't want to spend more than $50 on a PS4. Ffs, when you know a console will be released in 10 months, you start saving $10, $20, $40 each month, so when it's released, $400, $500, $600 isn't too much. But people don't do it, even though they can.

            So yeah, because of Microsoft and the whining spoiled kids, Sony was forced to build the cheapest system as possible, instead of using much better techs.

            I really regret that Sony didn’t build a ps4 around the cell. I am sure they could have built a system with 8 cell processors, 8-12 GB of ram, 750gb HD, and a graphic chip 4-5 times faster than the one used on the ps4. Maybe they didn’t follow that route to be able to justify no backwards compatibility, because with a ps4 full of cells, emulating the ps3 would have been easy.
            Now that most Devs were used to the cell , it would have been really an easy transition from the ps3 to the ps4.

            So when I see that devs are already fighting with the CPU at this stage, I really wonder if we will have an Uncharted 1-to-3 visual upgrade on the PS4, or if devs will only be able to extract 20-30% more juice than a Killzone Shadow Fall, for example.

          • Gannicus

            are you crazy? 16GB GDDR5 (substantially more than DDR3, and 16GB DDR3 would cost like $150-200 at retail), a 4GHz desktop CPU and a Titan-like GPU would be pushing about $1100 at least, because Intel are the fastest, they know it, that's why they charge so much… Nvidia are the fastest, they know that, that's why they charge so much (also I think the first Xbox couldn't have its price dropped as much cus the gfx chip was supplied by Nvidia and they refused to move on price).

            The PS4 prolly didn’t care what MS was doing in a way cus of the recession and the amount of money sony keeps hemaoeragging (or however ya spell it) and you cant compare to a $700 iPhone cus most ppl have it on contract spreading the cost. Sony/MS already giveyou the console at cost price they get there money back from every game and peripheral you buy it has a console tax attached why do you think PC watchdosg = £30 and PS4+X1 watch dogs = £48… its a massive difference. Don’t get fooled by the price of the consoles if they sold it you for a retail price and charge less for games they wouldn’t make as much money back so, neither are innocent and the more you believe the lie. Just be abit more aware of whats going on mate…

            PC only costs so much cus it's all up front; 290s supplied and built by AMD so they were cheapish, then 3rd parties could add their own coolers and stuff, adding like £40-80 depending on whether it was factory overclocked and stuff. and then they need to sell it to the retailers etc, so a PC component is always an upfront cost and you save on the games and software etc, plus it does a lot more too.
            rambled on a bit, I'll cut it off

          • Matt Dickinson

            People buy expensive phones to socialize, for work, and it’s a status symbol. They’re often subsidized by the phone company, too, or stolen.

            I don’t think consoles are much like that. The idea in the past was always to have a cheap toy to put under the TV, for children, or for families. The first popular one in Japan, the Famicom, is short for “Family Computer.” They were almost always using older components that could be bought cheaply, much cheaper than a brand new IBM PC or Macintosh or Amiga or Atari ST or any other kind of computer popular then.

            I thought Sony was smart to pull back from “expensive, elegant” and release something more affordable. It seemed to work well for them, at least at first. Maybe they don’t have enough unique games to impress, however.

          • Michael Norris

            Yup, have to go with the market; myself, I would have spent $500 for a more powerful console. Then again MS and Sony wanted consoles that don't melt after 5 mins (RROD, YLOD). These CPUs are crap, but I really think GPGPU will take off and help some of these issues.

        • Michael Norris

          PS4 has fewer bottlenecks; anyone who says it has none is a moron. Sony fixed a bandwidth issue, and just because these CPUs suck does not mean developers won't improve on them. OpenGL is getting a 15x boost and DX12 helps out the CPU, so I expect in the next 2 years these CPUs will do much better.

          • Vious

            here’s killzone’s dev saying the ps4 doesn’t have any bottlenecks….http://www.videogamer.com/ps4/killzone_shadow_fall/news/ps4_has_no_performance_bottlenecks_claims_killzone_developer.html
            so i’m guessing he’s a moron then, correct?
            -
            and where have you heard this rumor that openGL is getting a 15x boost?

          • Xtreme Derp

            Every system has bottlenecks, total performance depends on the severity and location of those bottlenecks.

            PS4 has fewer bottlenecks in fewer places. Xbox has memory bandwidth and size bottlenecks, a 16-ROP bottleneck, less GPGPU support, and a weaker overall GPU.

          • Vious

            every system has bottlenecks…so you’re saying the killzone dev was lying correct?

          • Gannicus

            no, it's not getting a 15x boost. ppl state it's 15x faster than DX11. it's not getting boosted at all

      • tubers

        That’s what i believe too. It really depends on the software how its using all those “organs”.

      • Vious

        you do have a point, but when devs and articles strut around things like this:
        http://www.videogamer.com/ps4/killzone_shadow_fall/news/ps4_has_no_performance_bottlenecks_claims_killzone_developer.html
        saying the PS4 has no bottlenecks, and that gets blown out of proportion, then I think you can understand where Gannicus is coming from.

        • Gannicus

          thanks this is what im saying.

    • Somebodyissilent

      Still outdoing the Xbone.

      • Gannicus

        and? everyone knows the X1 is not too good, and if MS really has this pegged for a 10 year cycle, may as well start using the X1 as a tea coaster in 5 years. but it's about most people saying the PS4 has no bottlenecks whatsoever, and now it turns out there are bottlenecks, so what else is up with it?

        • Somebodyissilent

          Oh you xbots and your excuses.

    • Xtreme Derp

      Open world games can be heavy on CPU, or games like Battlefield 4.

      • Gannicus

        yeah, true, but with the extra power more and more games will go open world, as it gives the player more options on how they want to do stuff. so if a major part of that, the CPU, isn't able to keep up, then that's a pretty serious oversight, isn't it? on both consoles, not just PS4

        • Xtreme Derp

          The CPUs could be a weak point for CPU heavy games, but at least PS4 has GPGPU compute and unified memory to try to work around it somewhat.

          • Gannicus

            you can't use GPGPU for everything, cus if you could, why don't they just ditch the CPU altogether… stuff like AI I don't think can be passed off to compute units, and there will be a lot of other stuff that can't be shunted to the GPU for processing either.

      • Guest

        Nope. PS4 is INVINCIBLE…the devs were just lazy. 1st party my ass. Did you see the massive downgrade from E3 last year til the release now? Its inexcusable and they should be embarrassed. Talk about Forzaing, they should call it inFamousing!

        • Xtreme Derp

          Infamous wasn’t downgraded.

  • Nicholas Perry

    Not surprised a weak mobile CPU is a bottleneck. But they managed to do pretty well with it. So good for them

    • Gannicus

      with the low level nature and minimal driver overhead the CPU is used less, but with open world, where you have more AI to control and other stuff, it's more CPU bound, so the more going on the more it impacts the CPU. which is prolly why inFAMOUS hasn't got a lot going on in terms of population, if the CPU is already struggling. not 100% sure but that's how I understand it

    • Xtreme Derp

      It’s two 4 core Jaguar CPUs put together, it’s weak compared to a Core i5 but still decent.

  • Delsin Row

    thanks Adam for making this awesome engine. right now, I can just give you a long clap :D

  • RealityCheck2013

    I didn’t understand any of that :D All i know is like every PS Gen games will look better and better over time :P

  • Dennis Crosby

    This is just the beginning of the console generation; things will get better for both consoles no matter what fanboys will say or try to prove. At the end of the day none of these games will be the same toward the end of the generation. So far developers are doing a good job, now just enjoy the games, that's all that really matters

    • Xtreme Derp

      Oh yeah, NOW only games matter.

  • Vious

    what I also don’t understand is how can you say: what to do with 8GB
    http://cdn.dualshockers.com/wp-content/uploads/2014/04/iNFAMOUS_02-670×376.jpg
    but then go on to talk about 4.5GB available?

    • Gannicus

      that’s the slide that sucker punch showed, its so misleading to some ppl ive seen some PS4 owners on here saying oh they’ve only used 4.5GB imagine what they can do with 8GB. I was like what? what do ya think the OS runs on etc they don’t understand. If 8GB was available games would use 8GB and theyd prolly be really really good, you could only push textres so high cus of the gpu but could have a lot of preloaded textures… only problem is what RPM HD does PS4 use, cus if someone makes a mistake and puts in a 5400rpm when standard is 7200rpm could cause a lot problems with preloading textures etc as seek times may be lowered etc

      • Guest

        I don’t think the PS4 could handle the heat gen from a 7200rpm HDD. I’m not saying that as dig, but I just don’t think they designed the heat dissipation well enough.

    • Xtreme Derp

      They reserved more than they needed. It's 4.5GB plus 512MB of flexible memory. So 5GB game, 3GB OS. Both MS and Sony will release more RAM to devs in the future.

  • Trim Dose

    I also agree on the CPU, I wouldn't mind paying a few 20s more for my PS4 if the clock frequency could be a little higher, at least 2GHz, but still, I saw this coming; this will hold back the GDDR5's high transfer rate :/

    • Scott Gresham

      Yeah, I was a bit disappointed when they first announced some of the specs. Mid-range GPU, yeah, but the CPU is pretty weak on both consoles. Price you pay for going cheap and low wattage.

      • Trim Dose

        I wonder what kind of machine Sony could make if the price tag was another 600 bucks. Definitely Avatar realtime gameplay out of the box :P, still, already The Order is looking even more stunning than inFAMOUS SS :D. God only knows what we can expect from Uncharted or God of War PS4 games *DROOLS*

        • Gannicus

          the problem I can see now: inFAMOUS had to sacrifice cus of the weak CPU, so what will The Order have to sacrifice? maybe not a lot, because it seems to be more gfx intensive with a low people count, so AI won't hit the CPU so hard. or maybe it's been delayed because of inFAMOUS, to balance out weaknesses in the CPU and try to get around it???

    • NeoTechni

      It actually is faster than 2GHz according to the FCC filing. It's underclocked like the PSP was

      • JumpIf NotZero

        FCC filing was the Ghz for the ram there buddy. It’s WIDELY confirmed that the PS4 uses a 1.6Ghz CPU and the X1 is 1.75.

        #dealwithit

        • NeoTechni

          “FCC filing was the Ghz for the ram there buddy. I”

          FCC filing is for ALL clockspeeds. That’s the point.
          To emphasize your error, the RAM operates at 5.5GHz

          That’s not 2.something

          • JumpIf NotZero

            ROFL…. Apparently you don't understand the DD in GDDR5! I literally laughed out loud at your ignorance… You were even right there too with 2.75 and 5.5… Hmmm… What do those numbers have in common? Hmmmm…

          • Xtreme Derp

            You are a worthless misterxcultist, constantly wrong, and abjectly ignorant about technology. You go on insane tirades about "move engine power" then have the nerve to laugh at other people. You're a paranoid schizo lunatic just like misterxturd.

          • Winchester

            isn’t it weird how the resident retard only takes 3 comments before he starts insulting people you need to get a life dipshit

          • 3rdworldgamer

            what’s more weirder is that it only took you 1 comment before you insulted someone. what’s that make you?

        • NeoTechni

          “It’s WIDELY confirmed that the PS4 uses a 1.6Ghz CPU”

          Apparently that was never confirmed. I don’t like liars

          http://www.neogaf.com/forum/showthread.php?t=737629

          Substance Engine is an algorithmic texture generation middleware and according to benchmarks published on GamingBolt the PS4 CPU is faster than the Xbox One:http://gamingbolt.com/substance-engi…s-respectively

          Both systems have 8 core Jaguar based CPUs from AMD, but prior to release Microsoft was promoting the fact that their CPU was clocked higher at 1.75Ghz. The benchmarks here imply the PS4 CPU is actually running at 2Ghz in order to produce 14 MB/s versus 12 MB/s for the Xbox One. The other possibility is that these figures are for the whole CPU and not a single core as labelled. In that case it would imply PS4 is using 7 cores at 1.75Ghz versus 6 cores at the same frequency.

          Since this tech is purely algorithmic on CPU and not bound by bandwidth we can't look to the PS4's GPU or GDDR5 to explain the difference. The only logical conclusion is the PS4 has a faster CPU, despite Microsoft's protestations to the contrary. Sony has never officially disclosed the PS4's CPU clockspeed.
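
          The reasoning in that quote is just linear scaling; here is the arithmetic spelled out, under the quote's own (unverified) assumption that texture generation throughput scales linearly with clock speed and core count.

          #include <stdio.h>

          int main(void) {
              /* Substance Engine benchmark figures quoted above. */
              const double ps4_mbs = 14.0;
              const double xbo_mbs = 12.0;
              const double xbo_clock = 1.75; /* GHz, Microsoft's stated clock */

              /* Option A: equal core counts -> what PS4 clock would explain 14 vs 12? */
              printf("Implied PS4 clock at equal cores: %.2f GHz\n",
                     xbo_clock * (ps4_mbs / xbo_mbs));
              /* ~2.04 GHz, which is where the "actually running at 2GHz" guess comes from. */

              /* Option B: equal clocks -> core-count ratio needed. */
              printf("Core ratio needed at equal clock: %.3f (7/6 = %.3f)\n",
                     ps4_mbs / xbo_mbs, 7.0 / 6.0);
              /* 14/12 = 7/6 exactly, hence the "7 cores vs 6 cores at 1.75GHz" alternative. */
              return 0;
          }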

          • JumpIf NotZero

            I love the great lengths you ponies go through to justify any claim – so long as the end result is “better than xbox”.

            Spin this: http://ps4daily.com/2014/03/playstation-4-cpu-clock-speed-confirmed-at-1-6-ghz/

            Maybe Sony themselves are wrong when they say it’s 1.6Ghz? LOL… You funny.

          • NeoTechni

            There’s nothing to spin. It doesn’t conflict with my claim that it’s being underclocked.

            “I love the great lengths you ponies go through to justify any claim”

            What lengths? It’s a simple google search. You’re going through lengths to ignore the data.

        • Xtreme Derp

          PS4′s CPU performs better in benchmarks and according to devs, despite (apparently) being clocked slightly lower. It’s probably 1.6GHz but nobody but a PS4 “marketing firm” ever stated it, which isn’t exactly the best source.

          http://gamingbolt.com/substance-engine-increases-ps4-xbox-one-texture-generation-speed-to-14-mbs-12-mbs-respectively
          http://www.neogaf.com/forum/showpost.php?p=94264594&postcount=50

        • You are flat out wrong

          The PS4 has a more efficient GPU.

          Quick, Jumplf! Copy paste another “analysis” from Reddit!

        • Mark Burley

          God, what a bunch of techno snobs. Get yourself some spot cream, a girlfriend and some social skills. Jeez, I'd call you nerds, but I like science, nerd is good; you're more like something that rhymes with nerd.

          • JumpIf NotZero

            Good job on being two months late to the party. And as for “nerd”, yea, EE in automotive, hot girl, hot car, hot salary, so…. Yea…. Nice “turd” joke I guess?

      • Gannicus

        yeah, the CPU can go faster than 2GHz, exactly the same as mobile chips are faster than what they're set at, but they're set at that frequency to keep in line with the thermal envelope of the system. You start increasing the CPU speed without having sufficient cooling and you'll start getting system instability etc cus it'll run too hot.

  • dranzer1

    CPU bottleneck. This is exactly what DX12 was about. MS probably saw this as an issue on Xbox One too, as the current API has stacked cores. It would fix a bunch of bottlenecks.

    • Boerewors

      On consoles devs are already used to optimizing the way Mantle or DX does; Sony's or MS's consoles won't really benefit from these optimization tools, because they already have similar, probably more efficient techniques at their disposal. Console CPUs and GPUs are usually not that powerful compared to their PC counterparts, but since they are customized and meant to perform in a certain way, they're really efficient. The best example is perhaps the Wii U, which seems to be extremely underpowered if you simply read the specs, but due to optimization and efficient development they can really get a lot of juice out of the console: a PC with specs like that could barely handle Minecraft.
      Optimization will continue throughout the entire gen, and games will definitely look and run better in 4 years than they look now, but don't expect jumps in performance like we saw with the PS3… Cause the real bottleneck isn't the optimization of the CPU, it's the raw power of the CPU; this generation chose a more financially "realistic" approach with cheaper-to-manufacture consoles, claiming they did this for development purposes. Indie devs aside, I bet the bigger studios would prefer a more complicated but more powerful console to work with than the current consoles, where they're already running out of power in the first months: you can't get more power by learning how to develop for the console, if the power simply isn't there. There are a lot of pros for the current approach, but it makes me long for the days when consoles at launch started out as high end consumer electronics instead of mediocre customized PCs.

      • Xtreme Derp

        Devs generally want ease of development, not complex and slightly more powerful.

      • Landsharkk

        Funny, because many devs have come out and said DX12 will improve CPU performance on the Xbox One by up to 50%.

        • Boerewors

          I’m a gamer for quite some time now, and I was promised a lot more than I was given every single generation. 50% is total bs, with this hardware… Never.
          Remember when Sadam imported all those ps2 to link together for his nuclear war…. That’s more believable than these statements made bout DX.

    • Stranger On The Road

      The developer said nothing about OpenGL being the bottleneck; for what it's worth, there is more to the game than just DX11, DX12 and OpenGL calls. These calls are only there to draw the scene; getting all the elements into the scene and placing them takes more CPU time than making the draw calls.

    • Jecht_Sin

      A bottleneck is also the weakest link. Given the weak GPU and the crappy eSRAM I doubt that one is the CPU in the Xbone. The CPU is probably idle most of the time waiting for the GPU, which waits for the data to be moved into the fast memory buffer.

    • Xtreme Derp

      OpenGL and PS4′s API already have pretty good CPU multi-threading support. DirectX 12 is catching up not getting ahead.

      Btw PS4's CPU performs better in benchmarks and according to devs, despite (apparently) being clocked slightly lower.

      http://blogs.nvidia.com/blog/2014/03/20/opengl-gdc2014

      • Gannicus

        wow the copy and paste is out in full force on this article hahaha

        • Xtreme Derp

          lol

  • Craig Sloan

    If they can get another game out on PS4 and it can look and run better than inFAMOUS SS I will applaud them. inFAMOUS SS is a good showcase for the PS4. Really enjoyed the game just completed it today. I’m sure the console will get better with time.

  • dranzer1

    So the 3.5 GB for the PS4 OS wasn't a rumor? Why does such a simple OS use so many resources when the Xbox One has a more complex OS but only takes up 3 GB? (No denying the Xbox One has the more complex OS.) Microsoft's side, however, is software, which makes them better with it, while Sony likes hardware and that makes them better with that. This might have played a key role this gen.

    • Gannicus

      everything needs reserves. you think adding NAS support and network streaming takes 0 memory? it's precautionary and that's it

      • Vious

        network streaming? are you referring to game streaming?

        • Gannicus

          no, it takes RAM to process stuff

      • dranzer1

        A OS doesn’t use up those many reserves unless they decided to put Windows 9 as their OS for a future update :/

        • Gannicus

          or a lot of future updates/apps you can use/streaming services, which all need RAM

          • dranzer1

            Well yes, but currently, at the state of this, the Xbox One running Snap and Kinect at the same time is taking up less resources, while the PS4, which basically does almost 3/4ths of what the Xbox One OS does, takes up more. MS usually has the software related stuff unparalleled. I can't see what Sony is going to do with such an amount of OS space reserved. I assume they will free it in the future.

          • Vious

            maybe they’re going to try and do snapped apps in the future…

          • dranzer1

            Maybe… It's just curious looking at a hardware company doing an OS.

          • Gannicus

            well maybe sony aren’t as good as OS efficieny cus well MS hmm 3-4 decades of OS functionality, hmm how can they get ther OS footprint lower than sony!!!

        • Gannicus

          Windows 9 will require fewer resources than whatever OS Sony is using. Win 7 made the resource footprint smaller to fit on weaker PCs etc, Windows 8 went even further and can be put on even weaker hardware without a problem… next!!

    • Craig Sloan

      Correct, Microsoft does software and Sony does hardware. You only have to look at both consoles to see that.

      • dranzer1

        Yeah, MS is pumping out Xbox One with their Tiled resources and DX 12 while that GDDR5 unified memory is pumping the Ps4.

        • Vious

          Dx12 isn’t implemented yet in the xO and tiled resources aren’t being utilized fully yet.

          • dranzer1

            It is going to be soon. Devs have their hands on it already. We get games in early 2015. 8 months away.

          • cozomel

            PS4 can do tiled resources too. So why is it a big deal again? Oh yeah cuz MS hyped it and you suckers bought it. Have fun wishing and hoping and waiting on your inferior system.

          • Michael Norris

            Xbone is still hardware bound… regardless of tiled resources, which no developer uses. DX12 might give it a boost but OpenGL is getting one as well… so that helps the PS4 too.

          • dranzer1

            It is hardware bound but without software, hardware means nothing. Ps4 has more of a custom OpenGL so it kind of depends on Sony for updates to it.

          • Iain Chambers

            OpenGL is getting a lot of momentum with Valve pushing Linux, and Nvidia and AMD all helping

          • Xtreme Derp

            They’ll both optimize their software as much as possible, the difference comes down to hardware.

        • Gannicus

          I think there's a lot of work with the X1. they obviously saw the eDRAM as free AA or something alongside unified memory, then thought what if we expanded on that, and at the moment it's not working out, and I think that is down to the big execs wanting more of an entertainment device, but now they're losing in the gaming section. tbh the X1 would be really good if it worked how it's supposed to, cus I'd prefer to speak to it and tell it what to do than have like 4, 5, 6 remotes to do different stuff. if it did that I might go and buy an X1, until then it doesn't interest me

          • dranzer1

            X1 is like PS3. No one knows how far it can go until you keep going and going.

          • cozomel

            Ahahahahaha you’re another dumb xbots. Damn, theres a endless supply of y’all. You are crazy if you believe that sh*t. They both use the same CPU and GPU, only the PS4′s GPU is more powerful and its memory system is also alot better. All this crap about eSRAM and DX12 is just more desperate fanboy wishful thinking. And y’all are making yourselves look stupid for believing it. Talk about a hive mentally.

          • Xtreme Derp

            It’s ESRAM not EDRAM, different types of DRAM.

            ESRAM clearly wasn’t worth shrinking the GPU for.

          • Gannicus

            edram was in the 360 right?

        • Michael Norris

          OpenGL is getting a boost, so that should factor in for PS4. The real issue is the CPUs; if Sony and MS can free them up, both consoles will perform much better. PS4 has the extra GPU power and faster memory to really make things hum along.

          • Gannicus

            they're supposed to have freed them up already

          • dranzer1

            OpenGL depends on Sony. As far as software goes, Ms has no competition there but Sony can pull off something decent to free resources.

        • Xtreme Derp

          Both PS4 and Xbox can do hardware level PRT.

      • Xtreme Derp

        They reserved more than they needed. It's 4.5GB plus 512MB of flexible memory. So 5GB game, 3GB OS. Both MS and Sony will release more RAM to devs in the future.

    • superkarma

      You’re the delusional fanboy who believes DX12 will solve all of the X1′s problems, so…yeah.

      • dranzer1

        Yes because I’m delusional for saying nothing wrong about either console while you can say every one is delusional you mental outcast.

        • superkarma

          Who are you trying to kid? You troll every comment section. Anyways, like I said, you’re the delusional fanboy who actually believes DX12 will solve all the X1′s problems.

    • Xtreme Derp

      They reserved more than they needed. It’s 4.5GB plus 512MB of flexible memory. So 5GB Game 3GB OS. Both MS and Sony will release more RAM to devs in the future.

  • Almighty-Koz

    methinks Sony needs to figure out how to cut what the PS4 OS uses RAM-wise somehow. I just don't get how the PS4 uses so much RAM for background tasks when a PC uses nearly half that; hell, the laptop I'm on now has 8GB RAM, is running Windows 8 and is only using 1.6GB RAM with multiple Chrome tabs open and a few apps running in the background like Spotify

    • Xtreme Derp

      They probably reserved more than they needed for future purposes.

  • 2223

    I just want Sony to reduce the amount of memory the OS uses. If they could increase the memory amount to even 6GB, that would be great for more demanding games.

    • Xtreme Derp

      They probably will in the future.

  • CervantesPR

    hmm, well, can't wait till they update the API so that all that power can be utilized. as an AMD CPU gamer this is a pain, all the cores not being utilized.

  • Failz

    DX12 should solve the problem… oh wait never mind…

    • Jecht_Sin

      An ICE team developer already wrote a routine in ASM which boosts performance by 10 to 100x on the tiling/de-tiling algorithm. Which is a CPU job.

      http://gamingbolt.com/ps4-ice-team-programmer-surface-tilingdetiling-on-the-cpu-is-10-100x-faster-now

      No need for a generic API when you can write directly in assembler. I wonder if at MS they speak assembler, though! :p
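
      For context on what "tiling/de-tiling" means here: GPUs typically store surfaces in small rectangular tiles rather than plain row-major order, and converting between the two layouts is a pure memory-shuffling job that can land on the CPU. The sketch below is only a toy illustration of that idea; the 8x8 tile size and row-major-within-tile layout are assumptions for illustration, not the PS4's actual tiling mode or the ICE team's routine. A scalar inner loop like this is exactly the kind of code that hand-tuned SIMD/assembly can speed up dramatically, which is presumably where the 10-100x claim comes from.

      #include <stdint.h>
      #include <stddef.h>

      /* Toy "tiling" pass: copy a row-major (linear) 8-bit surface into TxT tiles
         stored back to back. De-tiling is the same loops with src/dst swapped.
         Illustrative assumptions: T = 8, width and height are multiples of T. */
      enum { T = 8 };

      void tile_surface(const uint8_t *linear, uint8_t *tiled,
                        size_t width, size_t height)
      {
          size_t tiles_per_row = width / T;
          for (size_t ty = 0; ty < height / T; ++ty) {
              for (size_t tx = 0; tx < tiles_per_row; ++tx) {
                  /* Each TxT tile ends up contiguous in the tiled buffer. */
                  uint8_t *dst_tile = tiled + (ty * tiles_per_row + tx) * T * T;
                  for (size_t y = 0; y < T; ++y)
                      for (size_t x = 0; x < T; ++x)
                          dst_tile[y * T + x] =
                              linear[(ty * T + y) * width + (tx * T + x)];
              }
          }
      }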

    • Iain Chambers

      Well they both have the same CPU, so that's going to be a universal problem, and DX12 is hardly going to help the Xbox much regardless of all the spin MS puts out, but it will be great for the PC

    • Xtreme Derp

      OpenGL and PS4′s API already have pretty good CPU multi-threading support. DirectX 12 is catching up not getting ahead.

      http://blogs.nvidia.com/blog/2014/03/20/opengl-gdc2014

      PS4′s CPU performs better in benchmarks and according to devs, despite (apparently) being clocked slightly lower. It’s probably 1.6GHz but nobody but a PS4 “marketing firm” ever stated it, which isn’t exactly the best source.

      http://gamingbolt.com/substance-engine-increases-ps4-xbox-one-texture-generation-speed-to-14-mbs-12-mbs-respectively
      http://www.neogaf.com/forum/showpost.php?p=94264594&postcount=50

      • Failz

        You're 100% wrong, hence why this article exists… OpenGL doesn't do what DX12 is doing (yet). And you call other people delusional, lol, read the headline of this article. Bottleneck. If OpenGL did what DX12 is doing then that wouldn't be the case. Thank you and good night.

        • Jecht_Sin

          OpenGL doesn’t mean much without its version. That link refers to OpenGL 2014. ISS obviously didn’t use it. While you are talking about DX12.. 2015? :p

          • Failz

            And that’s to say if the PS4 will run the latest OpenGL correctly since it wasn’t designed around it. As for X1 seems like DX12 is made for it.

          • Jecht_Sin

            Oh yeah. So it was DX11.2, the mythical mantra that was supposed to kick the butt of the PS4 for months, right? Too bad we have all seen how it turned out. 792p with frame rate drops to the 20 fps and screen tearing, huh? lol

            DX12 is for the PC. In competition with Mantle and OpenGL (read: SteamBox/Linux). There is little to work around that box. It’s doomed already. One can’t get blood from a stone.

          • Failz

            Lol, keep trying hard, DX12 was designed with the X1 in mind as well as PC. Wait till the full release of DX12, then come back and flame on. In the meantime it's expected for a system not using its full potential to suffer minor issues. However, it's funny that the PS4 is already pedal-to-the-metal ready yet the CPU has bottlenecks.

          • Jecht_Sin

            yeah, yeah, yeah. I told you: been there, done that. Swap DX12 for DX11.2 and I've read what you wrote a thousand times already. Once DX12 doesn't do what you expect, it will be time for DX12.1, then DX12.2, then DX13 and so on for the next years, until the generation ends.

            Because there is no API that can overcome 2 simple facts:
            - The XBone GPU is too weak
            - The eSRAM is too small (and not that fast either. It seems the one on Wii U has a bandwidth of 1TB/s).

          • Failz

            Lol do you think DX11.2 is DX12? I expected better from you.
            Software > Hardware and MS have the Software.

          • MTM2

            You’re acting awful confident in an update you’ve seen no evidence of running on X1 hardware….

          • Jecht_Sin

            Are you really that dumb or do you just play it? I wrote that you guys have been going on for months saying that DX11.2 would have killed the PS4. Since it's more than obvious that it didn't, now you put all your hopes in DX12, repeating what you already said for DX11.2, in a vicious cycle. Because you guys are living in hope, the Fox Mulders of videogames who want to believe!

            Oh, and regarding the PS4: SP said that the CPU has been the bottleneck for them. But they also said that it can still be optimized:

            “There’s room for improvement in the job system using the CPU’s multiple threads and in other areas.”

            As the tweet I posted down below shows, they are already doing that at Sony.

            Your 32MB of eSRAM instead is total crap, because if you had actually read the article you would have read this:

            “Material properties are stored in up to 8 gbuffers (5-6 plus depth/stencil), with 41 bytes written per pixel. That translates to 85 MB for full screen buffers.”

            About 2.5 times the size of the Xbone's eSRAM. Now please show us how DX12 or any other API can squeeze 85MB into 32MB. Compression is not an option.

          • Jecht_Sin

            And regarding "MS has the SW".. That's why the Xbone SDK this time is a total mess, as opposed to the highly praised PS4 SDK, isn't it?

          • angelo dau

            The Wii U has eDRAM, not eSRAM, and the system can't execute tiling/detiling. The GPU on the XB1 is 30% weaker than the PS4's GPU, so they are very similar in power

          • You are flat out wrong

            Another sad failure from Failz. You live up to your username.

        • Xtreme Derp

          Your post proves that you’re too stupid to understand what a bottleneck even IS.

      • Failz

        Oh, you're the most delusional one of them all. OpenGL and the PS4 API do not balance out the CPU cores like DX12 is bringing, which is the first of its type. Keep trying, because Sony are great at making people believe.

        SONY – MAKE.BELIEVE

  • frontiermarine88

    console gamers talking about specs is so cringe worthy

    • Xtreme Derp

      Good thing I own a high end PC.

    • Jecht_Sin

      It’s actually exciting. Everyone is able to get more performances just spending more money throwing in faster HW. But where is the challenge? Apart from escaping from the creditors, sure. :p

      With consoles instead the developers have to get the most out of two $400 defined systems (without kinect that’s also the Xbone price). That shows who are the real wizards.

  • MTM2

    Man, trolls having a field day with this article – anything with PS4 and a negative word seems to be enough to trigger.

    All systems have a bottleneck, there’s always the ‘weaker’ component. I just find it hilarious how Xbots are acting clever mocking the PS4 considering how the X1 is performing, come back when Trials is running in 1080p hahaha

    Inb4 ‘Dx12 will save everything’

  • cusman

    I have yet to see or play any game that looks as good as inFAMOUS: Second Son for an open world game where you can rapidly travel in any direction. Oh and on top of that they managed to make the best thing about the game how it controls and how fun it is to play.

    I am hoping their next game takes these strengths and really focuses on choice/consequence: playing as good, rather than reinforcing your decision, should keep you questioning it, and playing as evil should also make you question your choice based on the consequences.

    inFAMOUS 2 just in that last mission as evil, managed to really make me question my choice to be evil, even though I am playing a game. If they can replicate the success of that throughout the game (like Spec Ops The Line) and for both morality paths, that would be a masterpiece game.
