DirectX 12: Here’s How Developers Reacted – Sony ICE Team Dev: “Looks Awfully Familiar”

March 22, 2014 6:49 PM

The unveiling of Microsoft’s DirectX 12 caused quite a stir among gamers, and developers weren’t left unaffected either: quite a few of them reacted to the news on Twitter. Some were enthusiastic, some ironic, and others responded with a bit of banter.

Below you can see the reactions we could find, with a healthy mix of the aforementioned attitudes. It’ll definitely be interesting to see how the industry utilizes the new tools to achieve better performance, even if we’ll have to wait over a year to see the results.

Matt Pettineo, Ready at Dawn Graphics and Engine Programmer:

As expected, DX12 goes bindless. Let’s all celebrate by binding 128 SRV’s to a hull shader!

Bindless textures raise the multi-texturing limit past 32, and they’re very useful for ever-hungry shaders.

Jason Mitchell, Valve Software Developer:

“A critical factor in the adoption of any new API is the size of the available market” – Windows version support info?

It’s worth mentioning that this was quite a common reaction among developers, with many asking whether Windows 7 will be supported or not.

Richard Matthias, Ninja Theory Programmer:

DX12 sounds great. Hope they backport it to Win7 though otherwise you have another DX10 situation.

Also hope that DX12 has a proper extension mechanism so it doesn’t get left behind OpenGL again when GPU hardware advances.

Francois Piednoel, Intel Principal Engineer and Performance Architect:

For those who did not get it, #DX12 will increase the need for single threaded performance dramatically to feed the GPU faster. #leadership

Bobby Anguelov, Ubisoft Montreal Senior AI/Animation Programmer:

I really like all the DX12 news, no HW reqs, XB1 support. It solves the biggest issues with Mantle: specific HW and no console support.

Steve Bowler, Phosphor Games Lead Designer (to someone asking if DirectX 12 will have to be reinstalled with every game):

Are you kidding? DX12 is so advanced it will install itself twice. Every time.

Konaju Games:

Please America. When you’re talking to a worldwide audience, stop using “Holiday” as a release date. That means nothing outside USA. #DX12

Robert Hallock, AMD Radeon & Gaming PR:

DX12 it’s Microsoft’s own creation, but we’re hugely enthusiastic supporters of any low-overhead API. : )

Cort Stratton, Naughty Dog, ICE Team Programmer:

Looks awfully familiar : ) Glad to see more gfx APIs moving closer to the hardware. New patterns are emerging!

Stratton’s comment is especially interesting: DirectX 12 promises to let coders work “closer to the metal,” which is what AMD’s Mantle and Sony’s ICE team have been doing for a while now. It’s nice to see this turning into a trend that will span the whole industry.

Ultimately, these kinds of APIs will give us more performance for our money, which is definitely something to celebrate.

Join the Discussion

  • Bankai

    So the Xbox One is just now catching up to what AMD have been doing since around last gen?

    • shinitaru

      No, they will catch up to that point around the end of 2015

      • Gabrielsp85

Do you really think that an MS product will have to wait until its full release? Do you even know there’s an early access? What a bunch of dumb clowns

        • shinitaru

No, I’m saying that even if they have the code in their hands right now, developers won’t have anything to show for another year at least, and then the games that really take advantage, maybe another 6 months to a year after that.
Don’t worry, there’s plenty of room in the tiny car for another clown, hop on in

          • async2013

            Yet in that time Sony’s version of OpenGL will be benefiting from thousands of changes to the API that users of DX can only dream of for another 4 or 5 years and if they are lucky enough to have the right OS LOL

    • Izzy Bozz

      • Bankai

Can’t be any more stupid than you.

        • mars mayflower

          Bankai, I believe they are just confused by your statement. XBOX One catching up to AMD? Time for you to reflect.

    • Jimmy DoneGood

      Are you semi retarded?

      • Bankai

        All my therapist said is that I don’t exhibit humanoid emotions, my intelligence has never been called into question as I’m not an illiterate lump of useless skin like yourself.

    • BigDickKamen

      i have both xbox n ps4, and ps4 is boring man, other than good resolution what else does ps4 have? the library of games aren’t even as enjoyable as xbox ones, but if u wanna keep talking about resolutions sure keep playing those games where u can feast your eyes on good quality but play a shitty games such as knack. gameplay > resolution and if u still think not? then fannnnnnnboooooyyy alerrrrt

      • datdude

        Sure you do buddy. Sure you do. LOL!!!!

      • Consolegamer

Game library comes down to opinion. It isn’t something that can be “better” or “worse” than a competitor’s; that is why there are different genres of games. Not everyone enjoys FPS games, maybe they like open-world games. I love my PS4 and its library of games; nothing on the Xbox is currently making me want to buy one. I played Titanfall on my PC, I got bored of it. Forza is just another car racing game, and Ryse was just a QTE game (from all the gameplay I’ve seen it’s just press this button then this button to finish him off, everything is pre-determined). However, on my PS4, inFamous is awesome and I haven’t stopped playing it since release. Indies like Blacklight, Warframe, War Thunder and Don’t Starve are awesome and I get lots of enjoyment out of them. However, your opinion may be different. Resolution is something that can be judged as better or worse, and sorry, but the PS4 is better when it comes to resolution, though the PC is the “master race” and has the highest resolutions.

      • Prime157

Fanboy alert because you can’t wrap your mind around the fact that many people like PS4’s games.

      • async2013

My game collection is better than Xbone’s or PS4’s as I don’t play games anymore made after 2005, when games became predictable and boring (FPS, I’m looking at you!)

  • Reason Freeman

AMD’s Raja Koduri said on stage: “And it’s not a small benefit. It’s… like getting four generations of hardware ahead with this API.”

    Intel’s Vice President of Platform Engineering Eric Mentzer shared a similar sentiment, with, “This is absolutely, I think, the most significant jump in technology in a long, long time.”

Nvidia’s Tony Tamasi echoed a similar sentiment: existing cards will see “orders of magnitude” improvements from DirectX 12’s release, he said, “going from 100s of thousands to millions and maybe tens of millions of system draws a second.”

Microsoft’s Gosalia: “With DirectX 12, you’re looking at reduced overhead for state changes; efficient reuse of rendering commands; multithreaded scalability; a flexible and efficient resource binding model; app-controlled graphics memory heaps; amortized create/destroy costs; ultra-low-cost re-allocates; free-threaded resource and heap management; access to swizzled resources; compressed resources (JPEG, ASTC-LDR) in hardware. The lower-level API also allows the app to track GPU pipeline status, control resource state transitions, and control resource renaming.”
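The draw-call throughput figures quoted above are easier to appreciate as a per-draw CPU budget. A minimal sketch (illustrative arithmetic only, not measured data; at any fixed frame rate the budget is just one second divided by draws per second):

```python
# Illustrative arithmetic only (not benchmark data): convert the quoted
# draws-per-second figures into an average CPU budget per draw call.

def per_draw_budget_us(draws_per_second: float) -> float:
    """Average CPU time available per draw call, in microseconds."""
    return 1_000_000.0 / draws_per_second  # the frame-rate target cancels out

# "100s of thousands" vs "millions and maybe tens of millions" of draws/s:
for dps in (200_000, 2_000_000, 20_000_000):
    print(f"{dps:>10,} draws/s -> {per_draw_budget_us(dps):.2f} us per draw")
```

At two hundred thousand draws a second the CPU can spend roughly five microseconds of overhead per draw; at twenty million it gets about fifty nanoseconds, which is why submission overhead, rather than raw GPU power, is what these low-overhead APIs target.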

    • Bankai

The PS4’s API gives it all of these advantages already, while it’ll take a year for DX12 to show any noticeable improvements on the Xbox One. Which basically puts the Xbox One a year behind the PS4, performance-wise.

      • Xtreme Derp

        Yes PS4′s low level API is already allowing these sorts of things. The Sony ICE engineer team knows what they’re doing.

        • Reason Freeman

That’s why the Infamous devs bragged about the chip running at over 95% the whole time while you are playing. Stunning! Good thing they removed some effects to achieve the full power of the PS4

          • choujij

            RF, you sound pretty insecure. =)

          • cozomel

            He is

          • Xtreme Derp

            Cute, but you should know utilization has nothing to do with efficiency.

            Both Sony and MS have world class coders that will extract every bit of performance out of their consoles. The difference is PS4 simply has more powerful hardware to work with, so it will always stay ahead in graphics performance.

          • jeremyg85

Like OS X and Windows? Looks at game box… nope, guess not.

          • PCS4-Box U

            Running a chip at 95% today and running it at 95% in 4 years are going to be 2 completely different things. Do you REALLY think that Infamous:SS is going to be the best looking PS4 game ever when we are only 5 months in?? Of course not, Ryse won’t be the best the XBO can do either.
I can’t deny that I think Sony is in a better position using their modified version of OpenGL, but it’s like having the same debate about which is better, Android or iOS?

          • datdude

            Is Ryse even a thing? I thought they swept that monstrosity under the rug and tried to pretend it never existed.

          • cozomel

Ok, it’s become blatantly obvious that you’re just a dumb a$$ clown fanboy who knows nothing. Cuz if you knew anything you would know that DX12 for X1 means nothing. DX12’s main component is the DX11.x API already running on the X1. And don’t talk about Infamous when it looks WAY BETTER than anything on the X1, and MS themselves have said many times how Forza is pushing the X1 to its limits, and look at all the downgrades that game had to endure just to run on the X1. Same with Ryse, DR3, KI and just about every other game on the X1; they all had to be downgraded to run on the system, and this is while using the same API that’s gonna run on DX12. You clowns don’t understand that DX12 doesn’t really benefit the X1 but rather the PC.

          • Thinkaboutit

            Why are you so insecure?

          • ratchet426

            I believe you are confusing “insecure” with “fed up having to explain the facts to X1 imbeciles over and over”

          • Thinkaboutit

            Don’t do it then.

          • Ship

Wow, this fanboy has no clue. Name all the fun games you’re playing on the PS4? Knack? Killzone? Talk about a monstrosity. Or maybe your next gen machine was just made for indie games, all that power reaching potential. I am a fan of games and it seems to me Sony fanboys have a lot of time on their hands to post on message boards because they have no games to play. See you guys later, I am going to play Titanfall. I will get my PS4 when Uncharted comes out; until then I will be having fun on my Xbone!!! Duh! It’s about games not systems. Idiots.

          • Enry

            Yeah, I bet you buy the “newest”, “most avant gard”, “super updated”, “super original” and “most innovative” CoD’s & Xconsoles, or products from Micro & Co. every time they get on the shelf; while they sit on their asses laughin so hard about how nice it feels to “give it to you” (*winkwink*). lmao
            Enjoy your xboneinyourhole 540, get it? hehe

          • Xtreme Derp

            They’re called “far better running multiplatform games” and upcoming Sony first party exclusives we know will be amazing. Sony supported PS3 with exclusives long after 360 became a wasteland of Kinect games.

          • HeczTehFinezt

Enjoy TF on your $500 XbOne while I play it on my custom PC, which costs less than your XbOne and yet is still more powerful than both consoles. I don’t understand why you’d waste $500 on an XbOne when you can just buy a custom PC, or you’re one of those so-called gamers who buy a console just to play one game. Glad I have PC / all PS systems / 3DS / X360. I’m enjoying the exclusives each console has to offer.

          • jeremyg85

Hey, guess what, your machine has more shaders than mine… you know what my machine has? Games. /thread

      • Reason Freeman

Why are you feeling the need to bring up the PS4? Who said anything about the PS4? This is a DirectX discussion. Feeling insecure? We haven’t even seen all that DirectX can do. At first, people said DX12 didn’t exist. Then they said it won’t be on XB1. Then they said XB1 won’t take full advantage of it. Now they say it won’t help it… Just stop. At some point people have to realize how retarded they sound, right?

        • PCS4-Box U

He brings up a valid point, but it’s not so much the PS4, it’s OpenGL in general. It, and Mantle actually, already provide the same type of system-level access.
Don’t get me wrong, I’m glad MS improved the API this much, because it’s the most commonly used API for PC gaming, so I’ll benefit. I’m just saying that he has a valid point.

          • Reason Freeman

            Um no he doesn’t. Until DX12 is out and we hear from devs who are actually using the tools, then we have no idea how close it is to OpenGL. Stop pretending you are a fortune reader.

          • PCS4-Box U

Um, it’s not hard to find out… Just google OpenGL and read the wiki. It doesn’t really require a “fortune teller” to line up the DirectX 12 features that have already been announced and compare them to the existing OpenGL.

          • andy

            No dude let him have this. Just like ESRAM this is the most complex thing in the world and not basic whatsoever. There is nothing to prove that DX12 won’t make a 1.3 teraflop GPU instantly double or triple the power on top of running awesome tech you absolutely need in your games console like 3 operating systems and Kinect. So 2015 is now the year to “wait until you will regret underestimating Xbone”

          • YouSuck

            If he Googled it he’d realize he’s wrong, ignorance is bliss.

          • cozomel

Stop pretending you are a fortune reader. It’s the same supposed low-level API that’s already used on X1, so what’s the difference, smart guy? And while all you morons that don’t know sh*t think the API is so important, how about the driver, which is even closer to the metal than any API will ever be? Let’s talk about that.

          • vcarvega

            You don’t have to be a fortune reader to compare what they announced to what is already available through OpenGL.

          • Head Blackman

The PS4 will not be getting Mantle. AMD already killed that thought a few months back. So why are PlayStation fans still holding on to this lie?

            http://www.dsogaming.com/news/amd-explains-why-mantle-is-exclusive-to-pc-not-present-on-xbox-one/

          • shinitaru

It doesn’t need Mantle, the same way that the XB1 doesn’t need DX12. They both already have low-level APIs.

          • cozomel

These xbot clowns don’t seem to understand this. They just want so badly to believe that something is going to save their weak a$$ X1. They need it, so badly. They’ll believe anything to make themselves feel better. But they are just setting themselves up for further letdowns.

          • shinitaru

Can’t really blame them, that’s what MS is hoping they will think. They are trying to give the XB1 a boost off the wave of excitement over DX12. The average gamer has no clue

          • Jordan

            yet they calling people insecure for calling out the truth. lol how stupid

          • Consolegamer

PlayStation uses a modified version of OpenGL called PSGL. It has lower-level access to the hardware. I’m sure DX12 is better than the latest revision of OpenGL, but a while back Valve said OpenGL is faster across all platforms, even on Windows. If that is still true then that is pretty embarrassing for Microsoft.

          • jeremyg85

Valve? The same Valve whose founder was ousted from Microsoft? The same Valve who is trying to make SteamOS a thing? They said something bad about DX12? Why would they do that, I wonder?

          • Consolegamer

What, they can’t have an opinion? Valve owns the PC gaming space; as far as I am concerned their opinion is highly valid.

          • cozomel

Hey r*tard, it’s obvious you don’t know what the f*ck you are talking about. Do you even know what Mantle is? It’s a low-level API. Do you know what GNM is? It’s the PS4’s low-level API. Do you know what Mantle was modeled after? You guessed it, GNM. So why would you want to put Mantle on the PS4 when Mantle is a far more generalized API and the PS4 already has an API specialized to its HW? Also, DX12 uses X1’s lower-level API (DX11.x) as its main component, meaning it’s the same thing that’s already on X1. You fanbots don’t know sh*t and just spew stupid shit.

          • vcarvega

            It was just tech that he referenced along with OpenGL… OpenGL is already in use on the PS4. His point is that PS4 owners and developers don’t have to wait for these benefits.

          • cozomel

Mantle and GNM are already there, OGL will be above it, DX12 is late to the party, and if MS don’t put it on Win7, it’s dead on arrival. But the dumb xbots will deny this.

          • jeremyg85

            OpenGL is just as decoupled as dx is lol.. but when have you ever had facts?

        • cozomel

Hey, you’re the one who feels the need to boast about it; it’s you who feels insecure. The fact is that Mantle and GNM are already doing this, and DX12 isn’t even due out till holiday 2015, and it better be on Win7 or forget about it being supported other than in the console space. Have fun thinking this is going to help much; ultimately the HW is still weak. Now go play TF with its ridiculous amount of frame drops and 792p graphics.

          • SIDO

            Confusion 101:

The 50% reduced CPU utilization in DirectX will not make the graphics look better, that’s eSRAM’s work, but it will definitely help run the graphics and resolutions like 1080p much, much easier. The displayed graphics/resolution is actually dependent on the GPU & CPU. The CPU helps to run the graphics and its set resolution while the GPU does most of the graphical rendering work. The higher the resolution is, the more CPU or GPU (it all depends on whether the game is GPU-intensive or CPU-intensive) is required to run it at that resolution. Given that DirectX 12 will reduce CPU utilization by 50%, that will make a big difference for the Xbox One since resolutions will run much, much easier. That’s a big improvement right there.

The graphical quality (not display), on the other hand, depends a lot more on the hardware’s memory bandwidth than on the CPU. The PS4’s GDDR5 has a memory bandwidth of 176GB/s whereas the Xbox One’s DDR3 + eSRAM has 102GB/s. This is one of the reasons why the PS4 has better graphics than the Xbox One. However, Microsoft has a trick up its sleeve. The eSRAM, once it becomes optimized and understood by devs, most likely via DirectX 12, can actually be capable of boosting the memory bandwidth to 200GB/s+, more than the PS4’s. Who knows how amazing the graphics can get with that much memory bandwidth?

To make things simpler, GDDR5 is more GPU-friendly (but weak for the CPU because of high latency) whereas the XO’s DDR3 is more CPU-friendly (but weak for the GPU unless aided by something like eSRAM). If you think about it deeply, it actually explains why the PS4 had to downgrade Killzone Shadowfall multiplayer to 720p. Killzone multiplayer was too CPU-intensive for GDDR5 to support at 1080p, so the devs had no choice but to drop the resolution until it balanced with the CPU.

Games that run in single-player mode generally don’t require that much CPU, which is why the Killzone: Shadowfall campaign runs easily at 1080p. It has nothing to do with the PS4 being more powerful than the XO. That “PS4 being more powerful in 1080p” is just bullshit dogmatism that the Sony Ponies brought up. The fact that the PS4 has a hard time handling CPU-intensive games like Killzone multiplayer, Titanfall, etc. is the main reason why the PS4’s been suffering with latency when it comes to online gameplay or any other computing tasks. The XO has a better CPU for all that computing and that’s the reason why Microsoft is using the cloud. The PS4 may be powerful in graphics but the graphics/performance suck in online aspects, something that the XO dominates in.
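The peak figures traded back and forth in this sub-thread are at least easy to compare. A minimal sketch (Python; the constants are the commenters’ own quoted peak numbers, and whether any of them are sustainable in real workloads is precisely what is being disputed):

```python
# Comparing the peak-bandwidth numbers quoted in this thread; these are
# marketing/peak figures from the comments above, not measured rates.
GDDR5_PS4 = 176.0      # GB/s, PS4 unified GDDR5
DDR3_ESRAM_XO = 102.0  # GB/s, Xbox One DDR3 + eSRAM figure quoted above
ESRAM_CLAIM = 204.0    # GB/s, MS's simultaneous read/write eSRAM claim

print(f"PS4 peak vs the 102 GB/s figure: {GDDR5_PS4 / DDR3_ESRAM_XO:.2f}x")
print(f"204 GB/s claim relative to PS4:  {ESRAM_CLAIM / GDDR5_PS4 - 1:+.0%}")
```

Even taking the 204GB/s claim at face value, it applies only to the 32MB eSRAM block, not to the 8GB DDR3 pool, which is the crux of the disagreement in the replies below.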

          • static5125

Nice try, but your attempt at spinning is not fooling anyone. Latency in GDDR5 is not significantly higher than DDR3, so DDR3 doesn’t really have a latency advantage (http://www.redgamingtech.com/ps4-vs-xbox-one-gddr5-vs-ddr3-latency/). eSRAM, even when used fully, is not enough to make up for the lack of bandwidth. There is only 32 MB of eSRAM, meaning that devs have to funnel several 32 MB packages through the eSRAM, which is a pain in the butt. Overall, the X1 has 8 GB of RAM that is over 100 GB/s slower than the PS4’s 8 GB of RAM, while only the 32 MB of eSRAM runs about 25 GB/s faster, and that’s only if it can be utilized to the max. Add on the fact that the OS in the X1 hogs up too many resources while the PS4 uses FreeBSD, which is a much less resource-intensive OS.

The X1 and PS4 use the same CPU, so the X1 does not have a better CPU. Cloud is a fancy term for dedicated servers, which have been around since forever. Your claim that the PS4 has bad performance online is totally false since it does not have bad performance online in the first place. That’s a typical MisterX tagline right there.

Let me break this down for you. The PS4’s GPU is 50% more powerful than the X1’s GPU. However, there is more than just a 50% more powerful GPU. The PS4’s RAM is over twice as fast as the X1’s RAM, and the PS4 has twice the ROPs and 4 times the GPGPU granularity. In addition, the PS4 utilizes the Garlic and Onion buses, where the CPU and GPU can directly access the memory pool. What does this all mean? It means that the PS4 can render twice as many pixels and can handle CPU-related tasks a lot more easily. The PS4’s CPU and GPU can utilize the pool of much faster RAM more easily.

            In other words, the PS4 is more powerful.

          • Nibblo

            The Xbox One has a nearly 10% faster CPU. Not blowing the PS4 out of the water but not pocket change either.

          • SIDO

Tell me, where exactly did I say that the XO is more powerful?

1) “The PS4’s GPU is 50% more powerful than the X1’s GPU.”

The PS4 GPU is not flat-out 50% stronger; it’s anywhere from 33-40% stronger overall. There is only one area where the PS4 GPU has a true major advantage over the X1, and that is gigapixel fillrate. Everything else is only in the 30% range.

Sony already stated that the PS4 OS and features have 3.5GB and two cores allocated for those tasks. The X1 also has two cores and 3GB allocated for its OS and features. Both consoles are in the same boat when it comes to resources being allocated for those jobs.

Also there is GPU allocation on the PS4; heck, even when the PS4 was being tested it had a 14+4 CU reserve ratio for compute tasks, and Sony removed the 4 CU requirement to allow developers to use any ratio they wanted. Also, with the PS4 camera going to have facial and voice recognition, some of that processing will need to be allocated to the GPU because they only have two cores. The PS4 has a bit more reserve pool to tap from without being affected like the X1. So people shouldn’t just assume the PS4 GPU is 100% allocated for games and automatically do a 100%>90% ratio when comparing.

The PS4’s bandwidth does not mean much when your target is 1080p or lower and the CPU only has a 20GB/s lane. GDDR5 vs DDR3 + eSRAM will not yield any real difference in performance nor any graphical improvements.

We are yet to see what difference DX12 will make; if anything it will make it easier for devs to program for the XO.
Stop being fanboys and enjoy the competition.

          • Jecht_Sin

“The GDDR5 vs DDR3 + ESRAM will not yield any real difference in performance nor any graphical improvements.” Yeah, that’s why XBone games are not struggling to reach 1080p, isn’t it? The eSRAM is a bottleneck. It’s too small, and no API will change this fact.

            As for the rest, the posters before me already explained why you’re wrong.

          • SIDO

“The 50% reduce CPU Utilization in DirectX will not make the graphics look better, that’s eSRAM’s work, but it will definitely help run the graphics and resolutions like 1080p much, much easier.”

*sigh* I get the feeling that every time I read your reply a brain cell dies. Just don’t…

btw: where exactly am I wrong? I didn’t say that the Xbone is more powerful nor that eSRAM has anything to do with 1080p. LOL

          • IA14A

            “Nice try, but your attempt at spinning is not fooling anyone. Latency in GDDR5 is not significantly higher than DDR3, so DDR3 doesn’t really have a latency advantage (http://www.redgamingtech.com/p…. ”

The problem with your argument is, you’re excluding the other part:

“Latency in GDDR5 isn’t particularly higher than the latency in DDR3. On the GPU side… Of course…”

Latency for the GPU is completely different from latency for the CPU. Yes, GDDR5 is faster and more powerful than DDR3, but only in terms of GPU/graphics. That’s what Cerny was talking about. Ever wondered why the PS4 is practically clocked at 1.6GHz as opposed to the practical 1.75GHz clock of the Xbox One? Hmm, I wonder why.

            “esRAM, even when used fully, is not enough to make up for the lack of bandwidth.”

Wrong. eSRAM clearly has a faster GPU bandwidth than GDDR5. MS claims that via optimizations it is capable of reading/writing data simultaneously, which could lead to an abstract bandwidth that is lower, similar, or greater than GDDR5’s 176 GB/s. This is just an assumption, of course.

            “There is only 32 mb of esRAM, meaning that devs have to funnel several 32 mb packages through the esRAM which is a pain in the butt.”

Yes, it is a pain in the butt. However, 32 MB of VRAM certainly is enough to run higher resolutions, or at least 1080p, and other graphical applications. I don’t see how that could be a problem.

“Overall, the X1 has 8 gb of RAM that is over 100 GB/s slower than the PS4’s 8 gb of RAM”

You’re comparing CPU speed with GDDR5’s 176GB/s GPU memory bandwidth. CPU speed is measured in megahertz or gigahertz, not GPU memory bandwidth. So your ‘RAM being faster than DDR3 RAM’ is invalid. It would be much more reasonable to say that GDDR5 has a higher GPU bandwidth than DDR3’s 68.8 GB/s instead.

            “while only 32 mb of the esRAM runs about 25 GB/s faster,”

That would only be true if it’s capable of reading/writing data one at a time. Simultaneously, on the other hand, it can go much higher than that.

            “The X1 and PS4 use the same CPU, so the X1 does not have better CPU’s.”

Ah, but you can always increase/overclock it. Just because they both have the same CPU doesn’t necessarily mean they run the same. DDR3, on the other hand, has lower latency than GDDR5 for the CPU. It’s the same reason why computing-oriented devices prefer DDR3 over GDDR5.

“The PS4’s GPU is 50% more powerful than the X1’s GPU.”

            I agree.

“The PS4’s RAM is over twice as fast as the X1’s RAM,”

            Wrong.

            “has twice the ROPS, and 4 times the GPGPU granularity.”

            Not denying that.

“What does this all mean? It means that the PS4’s can render twice as many pixels and can handle CPU related tasks a lot more easily.”

And yet, the PS4 couldn’t support 1080p smoothly for Killzone Shadowfall multiplayer but can in single player. I suspect this was caused by a CPU bottleneck due to high latency.

            “In other words, the PS4 is more powerful.”

In graphics? Yes. In computing tasks? No, or at least not as powerful as the XO.

          • Xtreme Derp

            Your entire post is pretty much garbage.

            Factual PS4 Hardware Advantages: +6 CUs, +560 GFlops (44% greater), +16 ROPs, +6 ACEs/CQs, better GPGPU support, better performing CPU, and faster unified memory. PS4 OS may also have less overhead or reserves.

            Both Sony and MS have world class coders that will extract every bit of performance out of their consoles with their drivers/APIs/SDKs. The difference is PS4 simply has more powerful hardware to work with, so it will always stay ahead in graphics performance. PS4 will have better performing games for the entire generation, you can’t overcome a hardware gap with better drivers/SDKs.

          • IA14A

            “PS4 will have better performing games for the entire generation, you can’t overcome a hardware gap with better drivers/SDKs.”

eSRAM can actually help the XO run better graphics/resolutions than DDR3 alone. The problem is, developers are having a hard time fully utilizing it. Still, 32 MB is enough to make that a reality.

          • Xtreme Derp

            Nope. It will always be a struggle and potential memory size/width bottleneck.

          • IA14A

            “Nope. It will always be a struggle and potential memory size/width bottleneck.”

Wrong. 32 MB is more than enough for higher resolutions like 1080p and even AA for the frame buffer. I don’t know where you’re getting that utter nonsense from. Don’t believe me? Just look at Killzone Shadowfall:

            http://static.giantbomb.com/uploads/original/11/118274/2482927-9560085948-x3pTG.jpg

Only 32 MB for all that buffering, just like the 32 MB of eSRAM. To further prove my point, the Xbox 360 also had its system RAM + 10 MB of eDRAM. The eDRAM was used as VRAM for the frame buffer, which is exactly what Microsoft has in mind for the 32 MB of eSRAM. Your skepticism says 10 MB of eDRAM is a lot worse than 32 MB, and yet the 360 was fully capable of running 720p games with it.

In fact, you don’t even need 50 MB just to run 1080p with a 32 bpp frame buffer:

            “The frame buffer is used to store the image as it is rendered, before or during the time it is sent to the display. Thus, its memory footprint depends on the output resolution (an image at at 1920x1080x32 bpp is ~8.3 MB; a 4K image at 3840x2160x32 is ~33.2 MB), the number of buffers (at least two; rarely three or more).”

            Source: http://www.tomshardware.com/reviews/graphics-card-myths,3694-5.html

As for DDR3 + eSRAM, the memory bandwidth is currently 102 GB/s. However, MS claims that since the hardware is capable of reading and writing data at the same time, the actual memory bandwidth is somewhere between 133-204 GB/s. If there is any truth to this claim (and you can’t say no, because there’s no certainty whether it is or isn’t), then the PS4’s memory will range between being 24% faster than the XO’s or 13% slower.

eSRAM, by the way, is super fast and has a very high bandwidth for the GPU. I don’t know where you’re getting that ‘bottleneck on the eSRAM’ gibberish from.
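The frame-buffer sizes argued over in this exchange reduce to one line of arithmetic. A minimal sketch (decimal megabytes at 32 bits per pixel; the buffer counts are illustrative assumptions, and this covers only color buffers, not depth, G-buffer, or MSAA surfaces, which is where the 32 MB really gets tight):

```python
# Raw color-buffer footprint, as in the tomshardware numbers quoted above
# (decimal MB; 32 bpp = 4 bytes per pixel; buffer count is an assumption).

def framebuffer_mb(width: int, height: int, bytes_per_pixel: int = 4,
                   buffers: int = 1) -> float:
    """Color-buffer footprint in decimal megabytes."""
    return width * height * bytes_per_pixel * buffers / 1e6

print(framebuffer_mb(1920, 1080))             # one 1080p buffer: ~8.3 MB
print(framebuffer_mb(3840, 2160))             # one 4K buffer: ~33.2 MB
print(framebuffer_mb(1920, 1080, buffers=2))  # double-buffered 1080p: ~16.6 MB
```

So a bare double-buffered 1080p color target does fit in 32 MB; whether everything else a renderer wants in fast memory also fits is the open question in the thread.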

          • Xtreme Derp
          • IA14A

Your point is? All it says (even according to the PSU OP) is that the eSRAM is only a bottleneck when applying MSAA or any other application that eats up VRAM. It doesn’t say anything about the eSRAM being too small for 1080p gaming itself. Not to mention that Killzone Shadowfall uses 32 MB for all that buffering on the PS4. As for the limitations, there are various ways to counter them. One is that devs can use tiled resources for less memory space. Another is that they can put the back buffer in system RAM for less memory utilization in the eSRAM. They’ll eventually have to work it out.

            Oh, and I still don’t see how the eSRAM’s bandwidth is an issue for the GPU. Even RedGamingTech admits the bandwidth is 204 GB/s, which is about 13% faster than the PS4’s if you ask me.

          • Xtreme Derp

            My point is that it will always be a struggle and a potential memory size/bandwidth bottleneck.

            Tiled Resources is just more special sauce straw grabbing.

            “200 GB/s” is an imaginary bull PR number, not achievable as sustained bandwidth in real-world games. Try 130-150, less than PS4′s sustained rate.

            PS4 will have better performing games for the entire generation as it has more powerful hardware.

          • IA14A

            “My point is it will always be a struggle and potential memory size/width bottleneck.”

            And yet, the Xbox 360′s tiny 10 MB of VRAM in the eDRAM was fully capable of running games in 720p. If they could do it on the 360 with so little memory, they can do the same thing with the Xbox One. Bottleneck? Without MSAA, I don’t think so. Even the OP on PSU admits that.

            “Tiled Resources is just more special sauce straw grabbing.”

            LOL! Anyone knowledgeable about tiled resources knows that what you said is flat-out ludicrous. I guess you’re too ignorant to even realize it?

            Here: https://www.youtube.com/watch?v=QB0VKmk5bmI

            Go to 23:00 and learn how tiled resources can significantly improve shadow buffering with the same or less memory than rendering without tiled resources.
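
            For a rough sense of why tiled resources save memory, here’s a toy residency calculation (the texture size and the residency fraction are illustrative assumptions, not taken from the talk): Direct3D tiled resources page textures in 64 KiB tiles, so only the tiles actually sampled need to be resident.

```python
# Toy model of tiled resources (partially resident textures): memory is
# committed per 64 KiB tile, so a huge texture costs only what you sample.
# The 16K shadow map and 5% residency figure are illustrative assumptions.

TILE_BYTES = 64 * 1024  # D3D tiled resources use 64 KiB tiles

def resident_mib(tex_w, tex_h, bytes_per_texel, resident_fraction):
    total_bytes = tex_w * tex_h * bytes_per_texel
    tiles = -(-total_bytes // TILE_BYTES)        # ceiling division
    resident_tiles = int(tiles * resident_fraction)
    return resident_tiles * TILE_BYTES / (1024 ** 2)

print(resident_mib(16384, 16384, 4, 1.0))   # 1024.0 MiB fully resident
print(resident_mib(16384, 16384, 4, 0.05))  # ~51.2 MiB at 5% residency
```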

            “‘200gb/s’ is an imaginary bull PR number not achievable in sustained width on real world games.”

            Do you have any proof to back that up, or are you just calling it bull and expecting someone to accept it at face value? If it’s the latter, you’re a joke. A memory bandwidth of 102 GB/s would be true if the read/write operations occurred only one at a time. If the reads/writes are simultaneous, it could (theoretically) go up to 204 GB/s. However, to the best of my knowledge, 133 GB/s seems to be the practical bandwidth. So no, 200 GB/s is not ruled out. Either Microsoft is lying through their teeth or they may actually be on to something.

            “PS4 will have better performing games for the entire generation as it has more powerful hardware.”

            I agree…for now.

          • Xtreme Derp

            200 GB/s is napkin math assuming a perfect simultaneous read/write each cycle; the sustained rate in real-world games is 130-150. Read the Digital Foundry interviews, they give some benchmark numbers. Project Cars devs have admitted PS4′s memory bandwidth is faster.

            Tiled Resources (aka PRT, something PS4 can also do at the hardware level) reduces the size of textures, it doesn’t shrink the framebuffer. It’s part of AMD’s GCN architecture, it’s not magical secret sauce that will make Xbox’s GPU perform better than PS4’s.

            At least you agree “for now” (stop drinking the misterx kool-aid please).

          • jeremyg85

            This is true… look what Microsoft can do with just 16 MB of RAM: http://channel9.msdn.com/Events/Build/2013/4-063

          • Ship

            Lmao, all PS4 fanboys can do is grasp onto their resolution, because they have no games to play on their “game system”; they keep trying to justify their purchase. The 1080p thing is such a non-issue when it comes to having fun games to play. The cloud and developers will figure out how to make the Xbone 1080p without the frame rate dips you get on the PS4. I like both systems, but I will not buy a PS4 until they get some games. For now I will play Titanfall, Dead Rising 3, Ryse, and Forza and get some use out of my system. It seems foolish to me to brag about 1080p when your games are terrible and not fun.

          • Negi Springfield

            Yeah , Keep telling that to yourself , Fool.

            No games? What nonsense are you talking about? InFamous: Second Son and Metal Gear Solid V just recently came out and I’m enjoying them. Titanfall? Yeah, played that on PC. But enjoy playing it on your Xbox One at 792p with dips into 20-30 FPS and heavy screen tearing.

            ” The cloud and developers will figure it out to make Xbox One 1080P without the framerate dips you get on PS4 ”

            Really? When will that happen? Probably when you people get your heads out of the clouds.

            You talk as if PS4 is the one suffering from framerate drops, when you have Ryse at 900p dipping into the teens around 15-20 FPS, ew! And Titanfall at 792p dipping into 25-30 FPS with some extra screen tearing, ew!

          • jeremyg85

            Both have weak hardware, but one has games lol..

        • Michael Norris

          OpenGL is getting a 15x boost, so I really don’t see the big deal about DX12.

        • e92m3

          It would be strange if it didn’t at least have true hardware support for DX12, and I don’t mean NVidia “support”, either. Since it does, I’m sure Microsoft is attempting to improve performance.
          Most likely, the Xbox is already running something similar to DX12, but some of DX12’s developments have yet to be added.

      • Axe99

        Don’t forget that the dev tools for XB1, while based on DX11, go much deeper into the hardware than straight DX11 on PC. DX12 on PC will make the PC more competitive with consoles, but unless MS shanked its dev tools, it shouldn’t have as large an effect in terms of utilising the XB1. It will, however, make it much easier for devs to just code in DX12 and have something pretty close to what they’ll need for XB1, making game dev cheaper and easier, which is only ever a good thing :).

        • Temjin001

          yes, the benefit of DX12 on the console side of things is nowhere near as significant as it is on the Windows side.

    • WTFBBQ

      Yeah, and another thing I find weird is that the Radeon R9 2xx series doesn’t support DX12 in hardware, only in software, while the Xbox One supports it in hardware.

      • jeremyg85

        You are surprised that a Microsoft machine running modified AMD hardware supports a Microsoft API?

  • Ritsujun

    Dat outdated Miclownsoft.

  • Xtreme Derp

    Once again these API changes to DirectX and OpenGL mainly benefit PCs. Consoles already have low level API access.

    Also, you’re still going to need new graphics cards (or a new console) to support any hardware level features added to DirectX 12, just like previous versions. No, there’s no unused special sauce hardware in the Xbox waiting for DirectX 12 code to activate it.

    If Xbox One’s API is already very low level (which it likely is), any DirectX 12 features added to it won’t do much. If the current API is high level then maybe it’ll have a greater effect.

    • Reason Freeman

      I’m impressed with your internet-armchair DX12/dev expertise on the subject, but I’ll take Phil Spencer’s word over yours. Thanks: “DX12 will have impact on XBOX One games written for DX12. Some DX12 features are already in XBOX One but FULL DX12 coming.”

      • Xtreme Derp

        Sure there will be software code improvements, that’s normal in any modern console’s lifetime. How much improvement depends on how low level the Xbox API already is. That doesn’t disagree with what I typed.

      • PCS4-Box U

        riiiight, cause Microsoft is so honest and forthcoming…

        • WTFBBQ
          • Aldo Fornasiero

            So because Sony hasn’t got 100% reliability Microsoft does?

            You haven’t actually countered his point…

          • LGK

            I wasn’t trying to. I just explained to him that Sony can lie too.

        • JerkDaNERD7

          Because it makes sense and is far from unicorns and fairies…

          Anyone can do their own due diligence and find these are relevant facts.

      • You are flat out wrong

        Heh, believing a hype monkey. Good one!

      • datdude

        Suurrrreeee….just like dem clooouuudddzzz, right? How’s that working out for ya???

        • jeremyg85

            Quite well actually, thanks for asking: 0 lag, 32 ms ping, matches that get sorted for me while I’m in Netflix. How’s that PS Vita remote thing doing?

          • datdude

            Vita is doing quite well with the ps4, thanks for asking. It’s a much better second screen than smartglass, and let us know when you can stream your games to your second screen on that xbone like you can for vita. Oh, and did I mention? The vita had more exclusives in 2013 that scored higher ratings than did Microsoft exclusives. Case closed. Dat cloud. Is that why games take so long to install and load? Is that why the biggest game for the system is 6v6, no destruction in the environment, bots as dumb as rocks, multiplayer only? And still huffing and wheezing to reach 792p? Sure guy, sure. Those clouuudddzzzz are doing “quite well” indeed. Somehow dedicated servers, which a great many games have, is something only clouuudddzzz can do. LOL!!!!!! You have to feel for these poor folks. Underperforming and overpriced makes them very cranky.

          • jeremyg85

            Lol, the Vita is a rock, sinking fast to the bottom of the ocean along with Sony. I can’t even fathom how you could defend it, lol. Did I mention the sun is actually a giant ball of yarn and that it is smaller than a nickel? No proof, but case closed. See how silly you sound, lol. You have timed how long it takes a game to load on a system you don’t have? Thought so. The bots are cannon fodder; their purpose is to die to give you experience. Having not played the game, I would not expect you to understand, though. What is wrong with multiplayer only? 6v6 is perfect for this game, but once again, having not played it, you would not know, lol. It was a gameplay decision made by the devs; if it wasn’t, PC would have more players, wouldn’t it? You make this too easy. You can refuse to acknowledge the advantages of the strongest server infrastructure on the planet, but that does not change the massive advantage Microsoft gets from it, lol. Killzone multiplayer runs at 540p; is the PS4 struggling to keep up? Too bad it doesn’t have a strong cloud to power it. There is a difference between some server running a game lobby and the Azure that powers Microsoft’s cloud, lol, but again I would not expect you to understand; you have a hard time saying cloud, lol. You are too easy, lol.

          • datdude

            Here’s your cloud, clown… http://www.igameresponsibly.com/2014/03/23/xbox-xbox-live-compute-is-more-of-a-preview-at-this-point/
            Complete and utter bullshot, exposed… enjoy the smackdown from a Microsoft representative who admits the cloud is absolutely nothing special. Good night, you’ve been neutered after crowing about dedicated servers. Done and dusted. That’s called getting buttoned up, and yes, it just happened to you. #Facefacts

          • jeremyg85

            You must have given me the wrong link, lol, because nowhere on there does it say the cloud power isn’t real. It says they currently only allow full-fledged developers to use it, lol. That was pitiful, haha. Here is a real link with proof of the power:

            http://www.lazygamer.net/xbox-360/what-the-cloud-means-for-titanfall/

            Bam, buttoned up, because it doesn’t matter what your name is dude.. the smack down has been put on you bro..

            So sayonara , try tomorrow nice to know yah lol

            EZ, you never had nuts to begin with, daughter.

          • datdude

            I don’t think this turd troll understands how it works… he posts an article that says exactly what I said, that the cloud is nothing more than dedicated servers, and yet he prattles on about some other nonsense the cloud is doing….6v6, bots dumb as rocks, case closed critter. Better luck next time. Nice when these dopes enhance my argument and undercut their own ballz.

          • jeremyg85

            What else could a cloud be? Servers are proxy computers that compute things, lol. What, did you believe there was actually some magical cloud lingering somewhere, lol? The power in Microsoft’s server cloud is massive, case closed, critter. Better luck next time, bro. Nice when a guy gets owned and comes back with no more facts, lol. The cloud is the server, son; get taught, bro.

          • datdude

            And the troll of turd quits as he has no facts to support his backwards logic. Nighty night critter.

          • Xtreme Derp

            Remote servers turned into magical and infinitely powerful “clouds”. Must be nice to be a delusional xbot.

          • JustGaming

            Bro…. EZ….. Son/daughter? You do realise that you are coming across as an utter ass clown and reinforcing your stereotype, don’t you? And you also realise that, even if we have the infrastructure in place within 10 years to make whatever it was you said up there possible, it will mean MSoft realising their vision of an always-connected, DRM-riddled world (or the equivalent of one), don’t you?

            Also, Microsoft Studios corporate vice president Phil Spencer himself has clearly stated the hard truth:

            “Cloud is a really interesting place to invest and we’re investing a ton in the cloud,” Spencer told IGN. “Whether cloud rendering ends up being the killer cloud feature, I’m a little skeptical that will be where game designers will actually see the promise of the cloud paying off in their games. We’re seeing a lot of our studios putting that power toward the experience that’s running locally on the Xbox One and how the combined capabilities of the device and servers in the sky create a more immersive experience.”

            It’s a while off from what you are implying, and while no doubt it will get better over time, it will be years before ‘the cloud’ can capably offer everyone the experience that you are talking about. It’s likely that the online capability of XBone will be improved before you see ‘the cloud’ handling any sort of graphical processing.

            It’s also worth noting that this isn’t anything Sony couldn’t attempt with Gaikai (which reaches 90 countries at present), and they more than likely are in a way. Whatever the case, both could result in offloading costs to consumers, less-than-favourable business practices resuming (I’m looking at you, 2013 reveal) and general haves-and-have-nots woe across the board for people whose internet speeds are not exactly efficient. It will be an estimated six years before that happens.

            Now, can everyone just be happy with their f***ing machine and enjoy their games? Thanks.

      • Prime157

        Oh yes, believe the person with the biggest PR bias ever. Is that how you get your info?

      • Jecht_Sin

        So what? To have an impact doesn’t necessarily mean to improve performance. It may allow Xbox developers to write the same games in less time, for example. Or it may have some APIs with direct access to some HW features not present in DX11.

        But the performance gain is in the low-level implementation. Both SDKs are already supposed to have stripped out some portability layers and optimized the library code for the specific single HW. They don’t need to be generic for every platform like the standard DX or OpenGL.

      • async2013

        Of course it will have an impact, but Philly boy didn’t say a good or bad impact ;) Coming from the PC scene, I really do laugh at the average xboner’s view of how DX12 will be the magic sauce. IT WON’T HAPPEN.
        Hmm, maybe my old GeForce 256 could be hacked to incorporate DX12 and gain 50% performance, LMAO.

    • Temjin001

      yah, the reason general-purpose OSes like Windows kept resource access limited is to help protect the user from really damaging attacks.
      I have a few concerns, sure. The benefit of increased performance is there, but what does this say about security?
      And what does this say about MS’s future plans?
      The reason consoles don’t worry about this is because they’re CLOSED platforms. Everything created comes from their marketplace (by any orthodox means).
      I think this move says more about MS’s continued efforts to make Windows a more closed system. This was the exact projection, and the reason why Gabe Newell rejected Windows 8 and spearheaded SteamOS.
      This will get interesting….
      Perhaps the reason they abandoned Games for Windows Live was because of these future plans. Everything may be under their control soon enough…..

  • Nicholas Perry

    “For those who did not get it, #DX12 will increase the need for single threaded performance dramatically to feed the GPU faster. #leadership”

    I hope this is sarcasm. This is not what we need. Things are already so bound by single-threaded performance on PC, and AMD CPUs just don’t have the IPC for single-threaded performance.

    I guess the bigger question is whether it will actually support AF (/sarcasm), unlike most PS4 games these days, for whatever reason.

  • Dirkster_Dude

    The DX12 announcement can only help the XB1 and PS4. It will probably help the XB1 more because all of Microsoft’s platforms will natively support DX12. Being “closer to the metal” means fewer instructions when your code is compiled to tell the actual hardware what to do. In the coding arena Microsoft has a distinct advantage because video games on a PC typically use DirectX, not OpenGL.

    • Orion

      It will only help the XB1. Not the PS4.

      • Dirkster_Dude

        Once again someone who doesn’t know a thing about programming has shared there thoughts.

        • Trebla Remark

          Their

          • shinitaru

            There, Their, They’re

          • Gamez Rule

            Thier?

          • shinitaru

            refresh, I corrected it immediately. I’m dyslexic.

          • Dirkster_Dude

            Yea I always seem to get those 2 words messed up.

  • Temjin001

    Looks like DX12 is something more than the usual lackluster MS effort. Perhaps with spring break next week I’ll actually install this license of Windows 8.1 I’ve had lying around =)

  • James

    I’ll just nod along and hope like hell nobody notices I haven’t got a clue what all this means…

    *nods in an agreeable manner*

    • Dennis Djoenz

      I don’t know anything about this shit either so I won’t pretend I do lmao.

  • brianc6234

    Just more hype. Like the cloud. None of it will make the Xbone more powerful though.

  • Trim Dose

    So this is the secret sauce then ?

  • RealityCheck2013

    More BS :D + wasn’t the MS Cloud going to make Titanfall on the Xbox ONE look amazing??? It just looks like an Xbox 360 game :D

  • async2013

    Fact is, DX12 is forever chasing OpenGL’s butt, hence the ironic quotes. I’ll be glad when DX is consigned to the bin of stagnation forever. It does nothing but help Microsoft’s bottom line and limit game designers and porting through proprietary means.
    Do NOT let Microsoft get a grip on anything; when they do, it will stagnate.

  • Kamille

    Considering how little CPU most PC games use, both Mantle and DirectX seem so very useless….

    • Gannicus

      Just because a PC game doesn’t use all that much CPU doesn’t mean it’s not a bottleneck for the draw calls of an API. If you’ve seen how Mantle is doing, especially with multi-GPU setups, it is needed. You get free performance out of nowhere because it’s actually using the GPU’s power properly.
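
      The draw-call point can be put in a toy cost model (all numbers here are invented for illustration): the CPU pays a roughly fixed overhead per API call, so batching many objects per call, which is what Mantle/DX12-style APIs make cheap, is what keeps the GPU fed.

```python
# Toy draw-call cost model: a fixed CPU overhead per API call, so fewer,
# fatter calls leave more frame time for everything else. Numbers are
# invented for illustration, not measurements of any real driver.

def frame_cpu_ms(objects, per_call_us, objects_per_call=1):
    calls = -(-objects // objects_per_call)  # ceiling division
    return calls * per_call_us / 1000.0

print(frame_cpu_ms(10_000, per_call_us=5))                       # 50.0 ms: CPU-bound
print(frame_cpu_ms(10_000, per_call_us=5, objects_per_call=64))  # 0.785 ms
```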

  • Jecht_Sin

    I don’t know how to take this, but it is from Nvidia itself, so hardly biased in favor of OpenGL. It seems that OpenGL in 2014 will reduce driver overhead by up to 15x:

    http://blogs.nvidia.com/blog/2014/03/20/opengl-gdc2014/

  • vcarvega

    How quickly you all forget… Microsoft also claimed the 360 would benefit from DX11… never happened. DX11 had profound effects on PC gaming and none on the Xbox. It’ll be a year before developers are even introduced to it… meanwhile, developers will have been making advances in OpenGL for two years on PS4. Any way you look at it… the X1 will always be behind.

  • Jamal

    Good news

  • Gath Gealaich

    “It solves the biggest issues with Mantle: specific HW and no console support”

    Since when does Mantle require specific HW?

    • Stranger On The Road

      It is only supported by AMD’s GPUs; no other GPU supports it. I believe nVidia was working on their own API as well; I don’t know what has come of it.

      • Gath Gealaich

        That’s like saying in the beginning of the 1970s that Unix is PDP-only. And then, it suddenly got easily ported to Interdata systems. Mantle itself does not *require* specific HW. It’s difficult to imagine an API that starts with multiple implementations simultaneously, though. OpenGL was SGI-only for a while, and look where we’re now.

  • hesoyamdonMonster

    Guys, I feel a bit of shame to ask, but what does DX12 (or any other DX) really do? How does it improve our games or help developers?

    • shinitaru

      A bit hard to explain. First, the GPU in your comp or console has a processor; it’s like a CPU, but it has graphics-specific instructions burned into it. DirectX, OpenGL, and Mantle provide developers with ways of accessing those instructions within their code. As new features are built into new GPUs, new versions of DX are made to exploit them. There’s a lot more to it, but that’s it in a nutshell. Without it, each game would first have to establish its own way to access the hardware, which is how they used to do it before these standards were created.
      I’m not very good at explaining things; I was hoping that someone better would step up *shrugs*

    • Stranger On The Road

      Let me add to what @shinitaru:disqus wrote; the GPU is created by a 3rd party [AMD, nVidia, Intel and others in the mobile space]. Each one of those has its own way of receiving and processing data, which means they are incompatible with each other, and in fact different GPUs from the same vendor might be incompatible as well.

      To make graphics programmers’ lives easier, a group of companies came together and created a graphics API that would be common to all GPUs; they called it OpenGL. Microsoft wanted to lock developers to its platform, so they dropped support for OpenGL and created DirectX, but they allowed the GPU vendors to add the OpenGL API to Windows.

      Now the GPU vendors create drivers for their GPUs that support (‘understand’) the common graphics API. So developers instead create their applications to target OpenGL or DirectX, and then let those APIs communicate with the driver, which in turn communicates with the GPU.

      Application -> Graphics API -> Driver -> GPU [note, the first 3 are all done on the CPU].

      As you might imagine, this adds a significant overhead to utilizing the GPU, which is why consoles had a greater advantage: the vendor could make the overhead thinner because they have fixed hardware to target. Mantle, and now DX12, attempt to improve the performance of the communication between the API and the driver, something OpenGL might be better at, with a recent claim by nVidia and partners [inc. AMD and Intel] that OpenGL has the potential of a 15x speed boost.

      But all of the above is just one aspect of the API; another is that it simplifies many of the graphics programmers’ requirements by providing them with a common API that supports them, for things such as shaders, drawing, and applying textures.

      P.S. a small correction to @shinitaru: the GPU follows the API’s new releases, not the other way around.
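
      The chain described above (Application -> Graphics API -> Driver -> GPU) can be sketched as a toy layering; every class and string here is invented purely to illustrate the idea that one common API hides incompatible vendor command formats.

```python
# Toy sketch of the Application -> API -> Driver -> GPU chain: the app
# codes against one common API, and vendor drivers translate into their
# own (mutually incompatible) command formats. All names are illustrative.

class AMDDriver:
    def submit(self, cmd):
        return f"GCN packet: {cmd}"       # pretend hardware command stream

class NvidiaDriver:
    def submit(self, cmd):
        return f"NV pushbuffer: {cmd}"    # a different, incompatible format

class GraphicsAPI:
    """The common layer (think OpenGL/DirectX) applications target."""
    def __init__(self, driver):
        self.driver = driver

    def draw(self, mesh):
        # Same application-level call regardless of the GPU underneath.
        return self.driver.submit(f"draw {mesh}")

print(GraphicsAPI(AMDDriver()).draw("triangle"))
print(GraphicsAPI(NvidiaDriver()).draw("triangle"))
```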

      • shinitaru

        Thanks, a much better explanation.

      • hesoyamdonMonster

        Thank you for that well-explained information about DirectX.

  • Stranger On The Road

    “Please America. When you’re talking to a worldwide audience, stop using ‘Holiday’ as a release date. That means nothing outside the USA. #DX12” – Konaju Games

    Thank you; I always hated those ‘holiday’ release dates. Just tell us when the bloody thing is going to happen and be done with it.

  • Dale

    Meanwhile, in PC gaming land…

  • real gamer pc

    Well piss off 4 doesn’t seem concerned.

  • real gamer pc

    And Piss Off 4 is like “Dang it not again”!

  • e92m3

    Francois Piednoel, Intel Principal Engineer and Performance Architect:

    “For those who did not get it, #DX12 will increase the need for single threaded performance dramatically to feed the GPU faster. #leadership”
    Okay Francois, explain to me how vastly superior threading is going to increase reliance on single-threaded performance in anything except burst performance. After all, changes in latency are bad if the baseline performance doesn’t increase; it just introduces variability. According to all the data collected so far, you are wrong.
    There’s some real NVidia nonsense with regard to DX12, which in itself is a reaction from when AMD told them about Mantle ~ a year ago.
    It’s really interesting to see how stirred up nvidia and intel have become.
    To be clear, I think DX12 had to happen, and developers (I mean the real developers, not the UT-engine copy-paste script kids) are going to get used to porting to DX12 in the meantime with Mantle.
    In the end, I would hope DX12 outperforms Mantle; Microsoft has much more freedom to increase performance.

