The Common Sense Gamer put up a rather absolutist view on the concept of “future proofing” a game. In my usual style, I started writing a response that soon went long enough to become a post of its own. So I left him the first paragraph of my comment, and skulked back to my lair to finish up my reply.
Future proofing, at least in gaming, is often confused with pushing the envelope when it comes to system requirements. What it really should be is making sure a game will continue to be competitive in the market for some time to come. You want to make a game that looks and plays well today and tomorrow. On the flip side, you do not want to make a game that looks good on the day you start work on it, but looks dated the day it ships.
Vanguard: Saga of Heroes is, of course, the current MMO poster boy for badly planned future proofing. To be fair, much of the desire to do this stems from how things played out with EverQuest. While EQ seemed a bit daring at the time for requiring a 3D-accelerated video card, there were already games out there driving the 3D market. All you had to do was add a 3Dfx Voodoo2 card to your current system and you were set. (I had a friend who put a pair of Voodoo2s in his system because, back in the day, you could configure them to work in SLI mode.)
But even after helping to push the 3D market, EverQuest ended up looking dated graphically in only a couple of years. This led to the regular routine of upgrading the graphics engine and various models in EQ, something that seems to have left a mark on Brad and company. So they decided to get ahead of the curve. Unfortunately, they made two mistakes.
First, as has been well discussed, they got themselves too far ahead of the curve. The system requirements on the box do not even tell the story, as Brad has posted that you really need a modern motherboard and components to be able to play the game smoothly. Those of us still in the land of AGP need not apply.
The second mistake they made is that while Vanguard may have pushed the envelope for graphics processing requirements, the payoff wasn’t there. Sure, there are very nice views of distant locations and fine detail in much of the environment, but things that make a game feel alive, like how characters move, are still stuck back in the EQ days. Awkward character animations combined with using the same damn font as in EQ to hover over everybody’s head allowed Sigil to create a game with advanced requirements that felt dated on day one.
So here we have future proofing done wrong. Sigil reduced their potential player base without making the game compelling enough for those who could run it to generate the kind of buzz that would make others want to upgrade just to play.
Does that mean system requirements should be kept in check?
Maybe. Or maybe not.
A lot of games hit the market with system requirements that really need the next generation of computers to allow them to be experienced as the designers imagined. One of my favorite RTS games of all time, Total Annihilation, comes to mind. That game was a bear to run back in 1998. You had to keep your battles small. Similarly, I would argue that the entire Civilization line of games (including Alpha Centauri) required the next generation of processing hardware to appreciate fully.
Those games were financial and critical successes. So pushing the requirements envelope is not necessarily a death blow to a good enough game, though it still undoubtedly limits the audience. But those games both delivered a payoff that was worth waiting for, or upgrading for.
Sigil, when it came to Vanguard, went too high on settings and too low on payoff. The lesson here is that merely turning the system requirements knob to “11” is not enough. If a game demands high end equipment, it needs to deliver a high end experience.
SOE was more conservative than Sigil when it came to EverQuest II. The system requirements were steep for the day. You really needed 1GB of RAM to play the game and 2GB was even better, but at least video requirements were a bit more flexible. While there was clearly an 11 setting (extreme high quality!) I still managed to run the game initially on my nVidia 5700 when it came out and could run it on a box at the office with a sub-$100 ATI 9550 video card without sacrificing huge amounts of quality. Sure, Zek wasn’t fun and I couldn’t run a lot of other things on the box at the same time, but I did not have to go down to the “blurred face” quality setting.
And the payoff for the system requirements was decent. EQ2 is nice to look at, the character models are distinct and attractive, and the animations, especially in combat, are a huge advance over those seen in EQ and may be the best of any US produced MMO.
And then there is Blizzard and World of Warcraft. WoW is, of course, future proofed in its own way. Stylized graphics and a lot of very effective texture tricks make WoW look very full and rich without taxing system resources. The specific “look” that Blizzard has given WoW will keep it from looking dated in the way that more photo-realistic games tend to over time. Some say WoW looks cartoon-like as though that were a negative, but that very element helps it. A cartoon, after all, is supposed to look like a cartoon.
Blizzard has its own formula for deciding what system requirements are going to be acceptable by the time a game ships. One benefit of their past success (there I go again) is that they seem to be able to pick the right spot to land a couple of years in advance when it comes to system requirements. I remember when they initially announced Diablo II. One of the early requirements to experience the full potential of the game was a 3Dfx-based video card. This caused something of a stir long in advance of the game coming out. Of course, that also provided feedback to Blizzard. In the end, that requirement fell away, in no small part due to the fact that 3Dfx was clearly falling out of favor in the market. A more widely available standard, DirectX, was chosen. When the game shipped, it was in a sweet spot for system requirements and sold very well.
So future proofing is many things. Some companies do it well, some do it okay, and some fail. There are lessons to be drawn from the failures, such as Vanguard, but “Future Proofing = BS” is not one of them.