In the early days of the PS4 and Xbox One, I was full of optimism for almost every big new game release, just as I had been with the 360 and before that. But as these machines matured, my enthusiasm waned – beaten down by all the titles that over-promised and under-delivered. Nowadays, unless a release has been showered in awards and has a lot of positive sentiment around it, I ignore it. Even if the reviews are good, I will still look for reasons not to buy.
Have I really changed that much? Or are there other reasons? Firstly, it’s certain that I’ve changed. There is no doubt that I have less time available. And after decades of internet use, my attention span could be better (although it is improving – more on that in another article)! I have more non-gaming interests too, so in the short chunks of spare time I have, I’m more likely to spend them on things like watching YouTube videos about cameras, old tech and other nerdy forms of procrastination.
Even so, I don’t think I’m alone in feeling underwhelmed by modern gaming. And it’s not so much that I’m getting left behind; it’s that a number of trends have had a negative effect on the pure fun of gaming.
Search the web for ‘worst console generation’ and you’ll find other people suggesting the same thing. In each of those discussions there’s a range of views, but there’s definitely evidence to support the argument.
So what’s up with these machines? Well, hardware-wise, very little. It could be argued that the move to more generic PC-style hardware made them less exciting in the run-up to launch. With the PS3 we got to debate how mind-blowing the Cell processor would be, but this time round it was much easier to gauge how the PS4 and XB1 would perform because they used PC components that were only lightly modified.
While hardware isn’t really the problem, there is one big change from the previous generation that has had an effect on game design and the priorities of studios – the amount of RAM available to games. Both machines have 8GB of RAM, and even allowing for OS overhead and the lack of dedicated video RAM, that’s still a massive increase over the 512MB of the Xbox 360 – far bigger than the CPU or GPU gains.
Game developers now had to build for machines that could handle far more complex environments, with bigger maps, more detail and more realism. This sounds great for the gamer, but it hasn’t quite worked out that way. Game development suddenly got much more expensive – head counts went up, timescales increased, and it seems to me that priorities got mixed up. Instead of starting with a game design and putting it into the most realistic environment they could build, it looks a lot like studios are starting with the goal of building an ultra-realistic environment, then trying to figure out how to get gameplay into it. Teams went global, with game devs in the west having to build a game inside environments created in countries like China, where labour costs are cheaper.
To compensate for the lack of pure gameplay, and in some cases even a story, we were sold new paradigms like always-online, with the promise of emergent behaviour and endless variety. Completion times increased, so the value for money initially seems good – until you realise that a lot of this time is spent travelling through an open world or working some grind system to get better gear. Not much of this sounds like gameplay to my old-school sensibilities.
One of the most offensive side-effects of this ‘environment-first’ approach to game development is the number of times we pay a lot of money for a work-in-progress game. The list of unfinished or underdeveloped games is long – early on we had Destiny and Watch_Dogs. Then there was The Division. Even with new consoles around the corner, game devs still haven’t got to grips with this issue – recently we’ve had Sea of Thieves and Anthem. Check out the final paragraph of this Polygon review of The Crew 2. It’s really something to see the reviewer admitting it’s a mess but that you should buy it anyway in case it gets good two years later. Ubisoft must have begged or bribed Polygon to say something positive and try to generate a few sales for another game that cost millions to develop but finished up half-baked.
Another major issue that I plan to visit in more depth shortly is the ongoing shift towards online-only or hybrid-online games. There are a few big downsides to this sort of game that can alienate a decent chunk of the gaming audience, and it’s not done out of generosity – it’s much cheaper than writing complex storylines and paying actors and voice artists.
But it’s not all doom and gloom – this generation has produced some stunning AAA single-player, story-led titles like Uncharted 4, God of War 4, and Horizon: Zero Dawn (a personal favourite). On the Xbox side, Gears of War 4 was pretty good too, and Microsoft knocked it out of the park with racing games thanks to the Forza Motorsport and Forza Horizon series. We’ve had brilliant smaller games like Resogun, Rocket League, Ori and the Blind Forest and more.
The question now is whether the next generation of consoles makes things better or worse. The shift to full 4K visuals will push development costs even higher, but increases in GPU power might instead be used to hit higher frame rates and improve responsiveness.
Do you still approach game releases with the same excitement as you did in the past? Do you think the next generation of machines will be a welcome change of direction? Let me know in the comments!