In the last few days a rumor has surfaced suggesting that Microsoft is working on increasing the GPU clock of the Xbox One in response to the PS4's allegedly more powerful specs. The rumor seems to come from a source with a relatively credible track record.

The basic idea is pretty straightforward: bump up the clock of the GPU in order to raise the horsepower of the console with it. But what does that mean in practice?

First of all, it's pretty unlikely that, at this stage of development, Microsoft is actually changing the design of the console to house an entirely new GPU. That would simply be a logistical nightmare, and possibly not even feasible, considering that production has probably been underway at AMD for a good while.

Most probably we're talking about something akin to the factory overclocks applied to many third party PC GPUs in order to offer customers more power for a small increase in price. This doesn't require modifying the card itself, but simply pushing it to work at a higher frequency than it was initially rated for. It's a fairly easy process that enthusiast PC gamers are very familiar with, but it's not without risks.

Most factory overclocked PC GPUs come with a customized cooling solution, more advanced and powerful than the stock one from AMD or Nvidia, for the simple reason that a higher operating frequency generates more heat. To be precise, at a fixed voltage a chip's heat output rises linearly with its clock frequency.
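
This follows from the standard first-order approximation for a chip's dynamic (switching) power, P ≈ C·V²·f. Here's a minimal sketch of that relationship in Python; the capacitance and voltage figures are illustrative assumptions, not actual Xbox One values:

```python
def dynamic_power(capacitance, voltage, frequency):
    """First-order approximation of a chip's dynamic power: P = C * V^2 * f."""
    return capacitance * voltage ** 2 * frequency

# Illustrative values only -- not actual Xbox One figures.
C, V = 1.0e-7, 1.0  # assumed effective switched capacitance (farads) and core voltage (volts)
for f_mhz in (800, 840, 880):
    print(f"{f_mhz} MHz -> {dynamic_power(C, V, f_mhz * 1e6):.0f} W")
# 800 MHz -> 80 W, 840 MHz -> 84 W, 880 MHz -> 88 W: heat tracks the clock linearly.
```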

It's normally possible to overclock a card by a relatively small amount without any further intervention, but that kind of overclock has really minimal results, and in order to push the card further without causing critical instability it's necessary to raise its voltage. A voltage increase in turn generates even more heat, in a quadratic proportion.
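
Pushing the same approximation further shows why a voltage bump is so much costlier than a clock bump alone. Again, the numbers are illustrative assumptions:

```python
def dynamic_power(capacitance, voltage, frequency):
    """First-order approximation of a chip's dynamic power: P = C * V^2 * f."""
    return capacitance * voltage ** 2 * frequency

# All values are illustrative assumptions, not actual Xbox One figures.
C = 1.0e-7                                   # assumed effective capacitance (farads)
baseline   = dynamic_power(C, 1.00, 800e6)   # stock clock and voltage
clock_only = dynamic_power(C, 1.00, 880e6)   # +10% clock, same voltage
clock_volt = dynamic_power(C, 1.10, 880e6)   # +10% clock and +10% voltage

print(f"clock bump alone:     {clock_only / baseline - 1:+.0%} heat")  # +10%
print(f"clock + voltage bump: {clock_volt / baseline - 1:+.0%} heat")  # +33%
```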

Whether Microsoft bumps up the GPU's clock by a small amount without raising the voltage, or in a more radical fashion accompanied by a voltage increase, the inevitable physical consequence is that the chip is going to run hotter, possibly a lot hotter. Anyone who has ever done any overclocking knows this very well.

While this quite obviously won't cause the immediate failure of the components, increased heat does cause increased wear and tear, and it's a simple matter of physics that more wear and tear means a shorter lifespan for the components.

Of course there are a few things Microsoft can do to mitigate the problem: redesigning the cooling solution and replacing it with a more efficient one would be the most effective course of action, but at this stage of development that's unlikely. It's also quite probable that the adopted cooling solution already comes equipped with a variable-RPM (revolutions per minute) fan, meaning that the fan will dynamically react to increased temperature by spinning faster and generating a stronger airflow to carry the heat away.
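
As a rough idea of what such dynamic fan control looks like, here's a minimal fan-curve sketch; the temperature thresholds and duty-cycle values are purely illustrative assumptions, not anything Microsoft has confirmed:

```python
def fan_duty(temp_c: float) -> float:
    """Map GPU temperature to a fan duty cycle between 0.0 and 1.0.

    Thresholds and duty levels are illustrative assumptions.
    """
    if temp_c <= 50:
        return 0.30  # quiet floor at idle temperatures
    if temp_c >= 85:
        return 1.00  # full speed near the thermal limit
    # Linear ramp between the two thresholds.
    return 0.30 + 0.70 * (temp_c - 50) / (85 - 50)

for t in (45, 60, 75, 90):
    print(f"{t} C -> {fan_duty(t):.0%} fan duty")
```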

This, of course, comes with a risk of its own: in order to make the fan spin faster you need to raise its voltage, and doing so for extended periods of time increases the wear and tear on the fan itself. It may very well be able to keep the GPU from heating up more than it would at its baseline clock, but it would do so at the price of a portion of its own lifespan.

While a burned-out fan is definitely not nearly as bad as a damaged GPU, consoles are pretty monolithic pieces of kit, and unless you're extremely tech savvy, this kind of damage would still mean a trip to the repairman. That's without even mentioning that, if the proper fail-safes aren't in place, a burned-out fan can very well cause fatal overheating of the components themselves.

Are we looking at a potential "Red Ring of Death" situation all over again? Most probably not quite that bad, as the infamous RROD was caused by severe engineering issues, but Microsoft had better be very confident in its cooling solution if it wants to go down this route.

In any case, some reduction in the lifespan of parts of the hardware is very possible. Whether the burden of the additional heat is shouldered by the GPU itself or by the cooling solution (or spread between both, which is the most likely scenario), more power means more heat, and more heat means more wear and tear. Physics cannot be tricked. Of course we can't know the magnitude of the effect, and hopefully we never will.

Maybe the real question we should ask ourselves is: is it worth it?

There's a reason why overclocking on PC is normally done only by a small percentage of enthusiasts: even pretty steep clock frequency bumps don't result in extreme increases in performance. Relatively safe GPU overclocks sit between 5% and 10% of the original clock, and even that doesn't bring a proportional performance improvement, since other bottlenecks like memory bandwidth don't move with the core clock.

Even if Microsoft managed to bump up the clock by 100 MHz (quite an extreme increase going by the most credible leaked specs for the machine, and one that would most definitely generate a considerable amount of additional heat), it would only bring the processing power of the console from 1.23 teraflops up to 1.38 teraflops. That's not really much, and it would still be largely inferior to the PS4's 1.84 teraflops.
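
For reference, those figures follow from the leaked configurations most commonly cited at the time (768 shader cores for the Xbox One and 1152 for the PS4, both at 800 MHz, with two floating-point operations per core per cycle); a quick sketch of the arithmetic:

```python
def gpu_tflops(shader_cores: int, clock_mhz: float, ops_per_cycle: int = 2) -> float:
    """Theoretical peak throughput in teraflops: cores * ops/cycle * clock."""
    return shader_cores * ops_per_cycle * clock_mhz * 1e6 / 1e12

print(f"Xbox One @ 800 MHz: {gpu_tflops(768, 800):.2f} TFLOPS")   # ~1.23
print(f"Xbox One @ 900 MHz: {gpu_tflops(768, 900):.2f} TFLOPS")   # ~1.38
print(f"PS4      @ 800 MHz: {gpu_tflops(1152, 800):.2f} TFLOPS")  # ~1.84
```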

Personally, I don't really know if it's worth it, and I tend to lean towards the "no" camp, especially if it means reducing the lifespan of the console's components. Ultimately, if we're to believe the leaked specs, direct competition with the PS4 in processing horsepower doesn't seem to be on the menu, so it doesn't seem very wise to strain the system further without even getting close to processing parity.

Of course the increase in clock speed is just a rumor for now, and we'll have to see whether Microsoft really goes down this route. If they do, we'll have to hope that their cooling solution will be up to the task. After all, they're supposed to have learned this lesson the hard way with the Xbox 360.

The pictures used in this article are courtesy of Wired.