God of War Ragnarök has six different graphics modes on PS5 alone

Depending on what features your TV supports, there are as many as six different graphical modes you can use to play Sony’s upcoming God of War Ragnarök on PS5. That’s according to a promotional image released by its developer, Santa Monica Studio, which shows the different options available to you depending on whether you prefer a higher frame rate or a higher resolution and whether you have a 120Hz TV that supports variable refresh rates (VRR).

So if you want your gameplay to be as detailed as possible, then you’ll want the 4K / 30fps mode, but if you want it to be smooth, then there’s a 60fps option that can drop its resolution down as far as 1440p. But if you have a 120Hz television, then the quality mode changes to a 40fps target with a dynamic resolution that can drop as low as 1800p, and the performance mode locks its resolution at 1440p (while unlocking its 60fps frame rate). The calculus changes once more if you have a VRR-compatible TV or if you’re playing on an original PS4 or PS4 Pro. There are a lot of options, is what I’m saying. 
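The decision tree the promotional image describes can be sketched roughly as follows. This is my paraphrase of the modes as reported above, not the game's actual menu or API; the function name and labels are hypothetical.

```python
def suggest_mode(prefer_fps: bool, tv_120hz: bool, tv_vrr: bool) -> str:
    """Rough paraphrase of the PS5 mode matrix described in the article.

    Mode names and resolution figures follow the text above; this is an
    illustrative sketch, not Santa Monica Studio's implementation.
    """
    if not tv_120hz:
        # Standard 60Hz TV: the classic quality-vs-performance split.
        if prefer_fps:
            return "Performance: 60fps, dynamic resolution down to 1440p"
        return "Quality: 4K / 30fps"
    # 120Hz TV: the quality mode retargets 40fps, performance unlocks fps.
    if prefer_fps:
        mode = "Performance HFR: locked 1440p, unlocked frame rate"
    else:
        mode = "Quality HFR: 40fps target, dynamic resolution down to 1800p"
    if tv_vrr:
        mode += " (VRR smooths any frame-rate dips)"
    return mode
```

For example, `suggest_mode(prefer_fps=False, tv_120hz=True, tv_vrr=False)` lands on the 40fps quality mode; six total combinations fall out of the three capability flags, which is exactly why the options multiply so quickly.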

Historically, at least part of the appeal of console gaming has been its simplicity. You buy a game, boot it up, and can often start playing without even having to touch a settings menu. 

But that’s been changing in recent years. I remember seeing these kinds of graphics mode choices growing in popularity when Sony and Microsoft launched their mid-generation PS4 Pro and Xbox One X console refreshes. 2018’s God of War, for example, offered a choice of running at 30fps with an upscaled 4K image or 60fps at 1080p, and it was a similar story for Rise of the Tomb Raider on the Xbox One X.

And now, with the advent of 120Hz and variable refresh rate TVs, as well as current-gen consoles that can take full advantage of them, the number of options offered by the latest triple-A games is ballooning. You don’t just get to pick between 30 and 60fps anymore. A game like Uncharted: Legacy of Thieves Collection offered a choice of three different frame rates at launch (30, 60, or 120fps) and, with a recent update, gained an additional 40fps mode alongside VRR support for even more flexibility. Digital Foundry now counts six graphics modes in total, each with a different combination of frame rate, resolution, and scaling. And other first-party Sony games like Ratchet and Clank and Horizon Forbidden West have been patched with a similarly broad range of options. 
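The reason the 40fps modes mentioned above are gated behind 120Hz displays comes down to simple arithmetic: frames pace evenly only when the panel's refresh rate is an integer multiple of the frame rate, so each frame persists for a whole number of refresh cycles. A quick sketch (the function name is mine, for illustration):

```python
def paces_evenly(fps: int, refresh_hz: int) -> bool:
    """True if every frame can be held for the same whole number of
    display refreshes -- the condition for judder-free fixed-rate output."""
    return refresh_hz % fps == 0

for fps in (30, 40, 60):
    for hz in (60, 120):
        label = "even" if paces_evenly(fps, hz) else "uneven"
        print(f"{fps}fps on a {hz}Hz panel: {label} pacing")

# 40fps on a 60Hz panel is uneven (60 / 40 = 1.5 refreshes per frame),
# but on a 120Hz panel each frame spans exactly 3 refreshes (25ms apiece),
# splitting the difference between 30fps (33.3ms) and 60fps (16.7ms).
```

VRR relaxes this constraint entirely, since the display simply waits for each frame, which is why VRR-equipped TVs get yet another set of mode variants.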

Putting these kinds of choices in the hands of players is hardly a bad thing. Personally, I’m more than happy to put up with a lower resolution if it means I can play a game at 60fps or higher. But I wouldn’t begrudge anyone who wants as detailed an image as possible and isn’t bothered if that means they have to settle for 30fps. I’d hate for that decision to be taken out of my hands.

Is this transitional or the new normal?

But the sheer quantity of different options is making these decisions complicated, especially when some of these graphics modes work better than others. Take The Last of Us Part 1, which shipped with five different graphics modes depending on whether you favor performance or fidelity and whether you have a 120Hz or VRR display. In theory, you should just be able to choose between prioritizing a higher resolution or a higher frame rate and then pick the highest-specced option your hardware supports.

But an analysis by Digital Foundry pointed out that the game’s 40fps mode struggled to actually hit that frame rate, hovering around 35fps instead. If you really need a full 4K resolution (and don’t have a VRR display), you’re arguably better off with the more consistent locked 30fps mode. You could argue that giving players the choice means they have the option of avoiding this poorly performing graphics mode; I’d argue that a badly optimized mode shouldn’t have been included in the first place. 

It would be silly to claim that these settings are anywhere near as exhaustive as what’s offered by the average PC game, where you can tweak everything from texture details to what kind of anti-aliasing a game uses. But it’s a lot more complicated than what we’ve come to expect from console games in the past.

Kratos and Atreus in Muspelheim.
Screenshot: God of War Ragnarök

Hopefully, this era of bloated console graphics settings is temporary. After all, TVs are in a real transitional period at the moment: there are still a lot of older 1080p sets out there, and 4K 120Hz models with variable refresh rate support are still in their infancy. Serving both groups inevitably means building games flexible enough to run on mainstream 60Hz TVs while also offering newer features to customers with compatible hardware. Over time, 4K / 120Hz / VRR TVs should gradually become the rule rather than the exception, making toggles for their features less necessary. Or, even better, I’d love a future where games are able to run at both 4K and 60fps, no tradeoffs required.

When my colleague Tom Warren reviewed the Xbox Series X in 2020, his main takeaway was how PC-like Microsoft’s new console felt, thanks in part to the graphical choices that the console offered. As the generation has progressed, this feels truer than ever. The big question, now, is how much of this change is temporary and how much reflects console gaming’s new normal.


