Nowadays, before the release of every major game on the new generation of consoles and PC, and with every new trailer revealing fresh gameplay, those I’ll call “downgrade witch hunters” come out to play. Dozens of forum threads and even articles in the media pop up showing screenshots that allegedly prove a “downgrade” of the game’s graphics from a previous showing.
There’s basically no escaping it. No matter how good a game looks in its final incarnation, someone will decide that they’ve found the “smoking gun” of a downgrade, and will spam the internet with it, often sparking large waves of negativity. The latest victim of this trend is The Witcher 3: Wild Hunt, which incidentally happens to be a visual masterpiece, but there have been many cases before.
The causes of this trend are manifold. Some media outlets do it because “downgrade” posts inevitably turn into a wagonload of hits. Negativity nowadays sells better than sex, and gamers are drawn to it like moths to a flame, like modern-day flagellants relishing the whip by ruining their own perception of their hobby. Social media users do it for a variety of reasons, ranging from attention seeking to the desire to rain down “justice” on those “lying” developers who dared to “downgrade” their games after showing us something potentially different.
I can’t even claim to be innocent. I did it once as well, with the lighting of Dark Souls II at launch. While that was a pretty blatant case, written before this whole “downgrade” fiasco had exploded in its full virulence, I still feel dirty inside every time I see that article. It definitely isn’t a high point of my time as a writer (actually, it’s one of the lowest), and you can be sure that nothing like it will ever appear here again.
The biggest problem with the “downgrade” agenda is not necessarily that it’s pushed maliciously – I’m pretty sure that most of the time those who write about it actually believe what they’re writing – but that it’s often inaccurate and misleading, based on an extremely superficial analysis that doesn’t even come close to representing reality.
Game development is in a constant state of flux, from when the first line of code is written, to when the game is shipped, and even after that. Developers constantly find new solutions and swap design elements in and out to min-max performance and visuals, squeezing every bit of juice from a platform.
Polygon counts, textures, compression, shaders, special effects, lighting… Everything is iterated on again and again over the years of development, in thousands of adjustments large and small that share a single objective: delivering the best product possible.
Does anyone really believe that developers are intentionally aiming for a worse product? The better the final game is, the smoother it runs and the shinier it looks, the higher the chances that they’ll earn strong sales and critical praise, leading to multiple advantages, ranging from bonuses to simply keeping their jobs for one more game. When their livelihood is at stake, people tend to do their best.
When a game is first shown, often years before release, most of the elements I listed above are not final. Placeholder assets are used all over the place in order to create something to display. Effects and shaders are often dragged in from previous games to complete a scene, gameplay elements are barebones if they’re there at all, and only a small part of the world is implemented.
It’s simply not realistic to demand that final games look identical to what was shown near the beginning of their development. Not only are most of the elements unfinished at the time of a game’s first showing, but all platforms evolve rather radically in the one-to-three-year span between a title’s reveal and its release. New APIs are introduced, new features are made available to developers, and so forth. Engines themselves evolve, adding a further degree of complexity to the issue.
Some will probably say, “then developers shouldn’t shoot so high with their reveals!” But where do you draw the line? How do you predict where the platform(s) will be in three years? How do you assess what kind of technical solutions your coders will devise over 365 to 1,095 days at their desks?
What developers can do is attempt an approximation, in order to give us a rough idea of what a game will look like. But approximations are, by definition, not exact, and we should keep that firmly in mind when we watch a game’s reveal.
The core of the problem, though, isn’t that “downgrades” are justified by the provisional nature of the assets used, but that most of the time they aren’t even actual downgrades, and the “evidence” provided as a “smoking gun” simply isn’t evidence at all.
If you examine the typical article or forum post about downgrades, you’ll notice that it almost always follows the same formula: it includes a few screenshots (often ripped from heavily compressed videos and livestreams) showcasing some elements that look arguably worse compared to previous showings of the game, then the author declares a downgrade and calls it a day.
That way of approaching the issue is simply inaccurate, as it cherry-picks just a few elements of an image that fit the “downgrade” agenda, without considering the whole picture.
A game’s screen isn’t composed of just those elements, but of the combination of all the models, textures, effects, lighting, shaders and so forth. Without analyzing the entirety of those elements, pointing fingers at a few and screaming “downgrade!” is simply misleading.
During the min-maxing and optimization process of a game, developers often notice that an effect is too costly (in terms of performance) compared to what it actually offers, or maybe it simply doesn’t fit the art direction of the game.
That effect is removed and replaced with another, which may or may not look a bit worse. If the new effect requires fewer resources, those resources are often spent on other areas of the game’s visuals, to improve elements that might be sub-par, or that simply offer more gain for the cost.
The usual “downgrade hunter” will spot the first effect, arguably changed for the worse, photoshop a nice circle around it on the screenshot, and start screaming “downgrade!”
Of course, he won’t notice the elements of the picture that have been improved, or he’ll neglect to point them out. But that’s not a downgrade. It’s a trade-off. And trade-offs are at the very core of game development. Thousands are made over the course of a title’s coding, but what people point out are almost always the changes (more often than not incorrectly) labeled “downgrades.”
Things get even more misleading with games that feature varied lighting and weather conditions. Lighting (and weather affects lighting as well) can massively alter the appearance of certain elements. For instance, shaders dictate how materials react to light, and can look entirely different at different times of day. Many posts about “downgrades” don’t even take this into account, and allege that, for instance, less dramatic shadows are due to worse effects, rather than simply to the sun being in a different position.
Of course, anyone can decide that a game’s visuals look subjectively better or worse in his or her eyes, but in order to properly gauge whether they’ve been “downgraded” or not, simply cherry-picking a few bits of a few screenshots isn’t sufficient. One would have to know every single effect and element churned out by the graphics card, and we simply don’t. There are no two ways about it.
The whole “downgrade” hubbub, though, isn’t just inaccurate; it’s also obnoxious and toxic. It creates often-unwarranted negativity that affects the perception of a game in a way that most of the time isn’t justified.
A lot of the games that have received accusations of “downgrades” – and The Witcher 3: Wild Hunt is just the latest example of many – look absolutely beautiful, but the perception of that beauty has been altered in the minds of many by the allegation that a downgrade took place, and that gamers somehow received a sub-par product. That simply couldn’t be further from the truth.
It’s quite common to hear lines like “it looks great, but the E3 201X trailer looked better.” That “but” is the common result of all the recent focus on hunting for downgrades. It reduces people’s appreciation and enjoyment of a game, regardless of its actual quality, because they feel they’ve been somehow “tricked,” despite the fact that the product they purchased (if they bought it at all) looks awesome in its own right.
Even setting aside the fact that the focus on graphics is often excessive, and that whether a game is fun or not risks being neglected, a title’s visuals should be judged on what they are, not on what some trailer broadcast years ago showcased.
Does the game look great or not? It’s a binary question: yes or no. You choose.
If it looks great, who cares whether some trailer arguably looked a little better?
We should focus on the here and now, and judge a product on its own quality, and on what is actually running on our screens. I can pretty much guarantee that this kind of attitude will help you enjoy games more, and recover that warm sense of wonder that many have lost amid the “downgrade” media and social media bombardment.