Saturday, May 6, 2023

An Essay on Why Broken PC Games Are Not New


 Jedi Survivor, one of the aforementioned broken games.

Gather round the table, young folks, and let me tell you a tale. There's been a bunch of controversy about how every big triple-A release this year has been broken to some extent on PC. Dead Space had its incessant traversal stutters, The Last of Us ate GPUs with 8 gigs of VRAM for breakfast, and Jedi Survivor is CPU bottlenecked in certain areas while also plagued with stutters. Redfall, one of Microsoft's big releases this year, launched in a completely broken state on all platforms. For every Hi-Fi Rush (which ran and looked great) there's been a Forspoken or three. After years of inflated GPU pricing, gamers are fed up. I don't blame them; I'm tired of every Unreal Engine game having shader compilation stutter, and since over half of the industry seems to be moving to Unreal, it's becoming a real problem. But it's not a new problem, folks; Reddit is acting as though publishers have only just started releasing broken games. We had almost a decade of decent releases, and that was an anomaly caused by the conservative tech in the PS4 generation of consoles. PC CPUs were so far ahead of the ancient Jaguar CPUs powering the PS4 and Xbox One that I rocked an i5-2500K for nearly eight years and could still play games at roughly console quality by the end.

Hi-Fi Rush, a polished release.

Yet if we go back to the nineties, when I first started gaming, we'll see a different story. Back then, multiplatform games weren't really a thing: series like Half-Life and Diablo were developed exclusively for the PC, while each console had its Zeldas and Final Fantasies, which were seldom ever ported to personal computers. Nowadays, people often blame multiplatform releases for the sorry state of a game's PC version, citing developers' insistence on prioritizing the console versions at the expense of PC optimization. There's likely some truth in that--games are becoming incredibly expensive to develop, and as budgets bloat, cuts have to be made--but even in the halcyon days of dedicated PC releases, things weren't anywhere near as rosy as people imagine. Unreal, the classic first-person shooter developed by Epic that featured the original iteration of its now ubiquitous graphics engine, had minutes-long load times and ran like molasses on midrange hardware. Thief 2, Looking Glass Studios' classic stealth sequel, required a day one patch that was over 200 megabytes in the era of dial-up internet. You think a shader compilation microstutter is bad? Half-Life had loading screens every fifteen minutes of gameplay, and they took several seconds on the hard drives of the period. Nobody ran Quake at a steady 30 frames per second at release; you were lucky to get twenty. Fast forward to 2004, and id Software's much-anticipated Doom 3, which featured cutting-edge graphics far beyond what was normal for the time, ran at around 30 fps on the hardware of the day. A few years later, Crysis also struggled to break 30 fps on a high-end PC. I remember all of this because I lived through it, and unfortunately I can't find any articles to back up my memories, so you'll have to trust me or watch a playthrough of some of these titles on period-appropriate hardware, a la Digital Foundry Retro.

A Plague Tale: Requiem has next-gen graphics with decent PC performance.

What is happening now has happened before, usually at the start of a new console generation. The PS5 and Xbox Series X have hardware similar to midrange PCs (a Ryzen 5 3600 and an RTX 2070 Super are roughly comparable parts) while also enjoying unique advantages over personal computers, such as shared system memory and dedicated APIs for things like texture decompression. Without an older generation of consoles to optimize for, some of the work that would've helped a PC release run better isn't being done anymore, and time-strapped developers have started relying on less-than-optimal techniques, such as using the CPU to decompress assets on the fly, which is why The Last of Us requires such a beefy CPU compared to the PS5. The consoles also have a shared pool of memory, at least 12 gigs of which can be used as VRAM, which means GPUs with less memory might struggle to run console-equivalent settings. All of this sucks if you just upgraded your PC during the pandemic, when graphics cards were ridiculously inflated in price. Your RTX 3070 is likely 30 percent faster than a PS5, but with only 8 gigs of VRAM, it might not match a 500-dollar console in texture settings on the newest games.

Jedi Survivor may have too many frame drops, but it still looks gorgeous.

Another thing worth pointing out is how much standards changed during the PS4 generation. Triple-digit frame rates and resolutions higher than 1080p were not the standard back in the day. Until 2020, I never played a game at over 60 frames per second, and most of my gaming was done at frame rates between 30 and 60 fps. Until I bought a good SSD, loading stutters were common and expected. What I'm saying is that a game like Jedi Survivor would've been considered performant by a gamer back in the early 2000s. I certainly played The Witcher 2 at lower frame rates. This is not to excuse the state Jedi Survivor released in--it obviously needed more time in development--but not every game is going to be a locked 60 fps now, never mind a smooth 120. Games like A Plague Tale: Requiem are locked at 30 fps on consoles, and that will become increasingly common as this generation progresses. We shouldn't expect PC releases to scale like they did in yesteryear. All I'm saying is: stop the doom and gloom, and hold off on purchasing unoptimized PC games if you can't stomach unstable performance. It will get better eventually, people. It always has.

