Of course it is. As always a choice between visual quality and framerate.
DF did a pretty decent video on the whole 30fps question. https://www.youtube.com/watch?v=i9ikne_9iEI
30fps for top end consoles in 2023 is absolutely pathetic.
Not really, developers always push hardware more as the generation progresses and these machines are three years old now. That’s almost half a generational cycle.
What irks me is when game developers tie the physics engine to the framerate. We all know this will cause issues down the road, could we just… not?
Not doing it also causes issues, in the form of micro-stutters when some frames have an updated physics state and others don’t. Frame pacing is hard, and locking everything together happens to be the only sure-fire way to completely eliminate display issues. But then, of course, you have a locked frame rate.
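For the curious, the standard compromise is a fixed physics timestep fed by an accumulator, so physics stays deterministic while rendering runs as fast as it can. This is a generic Python sketch of the pattern, not any particular engine’s code:

```python
# Fixed-timestep loop: physics advances in constant steps no matter
# how fast or uneven the rendered frames are.
PHYSICS_DT = 1.0 / 60.0  # physics always steps at 60 Hz

def run(frame_times):
    """frame_times: wall-clock duration of each rendered frame, in seconds."""
    accumulator = 0.0
    physics_steps = 0
    for frame_dt in frame_times:
        accumulator += frame_dt
        # Run as many fixed-size physics steps as fit in the elapsed time;
        # any leftover fraction carries over to the next frame.
        while accumulator >= PHYSICS_DT:
            physics_steps += 1          # a real loop would step the sim here
            accumulator -= PHYSICS_DT
        # rendering would interpolate by accumulator / PHYSICS_DT
    return physics_steps

# One second of game time rendered at 30 fps and at 120 fps:
print(run([1.0 / 30.0] * 30))    # 60 physics steps
print(run([1.0 / 120.0] * 120))  # 60 physics steps either way
```

The simulation runs the same number of steps regardless of render rate, which is exactly what decouples physics from fps. The cost is the interpolation step at render time, which is where the frame-pacing subtleties live.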
They’d better deliver that “visual fidelity” if they’re already capping at 30fps on a current-gen console.
I imagine there is some reason we still see this. Any devs in the industry lurking?
I’m not in the industry, but I’ve dabbled in Unity and that’s just kind of how it works by default. You create a game object and it gets an Update() function that is called once per frame. You’re encouraged to perform calculations and update its position in that callback.
You’re supposed to use Time.deltaTime to scale your calculations based on how long it’s been since the last frame. But that takes effort, and it’s very easy to just not do that, and your game will still work fine in most cases.
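The effect of skipping that scaling is easy to show outside Unity too. Here’s a toy Python model (made-up speed constants, not Unity’s API) of moving an object for one second of frames:

```python
# Toy model of frame-rate-dependent vs. deltaTime-scaled movement.
# Constants are invented for illustration; imagine they were tuned "by eye"
# while testing at 60 fps.
PER_FRAME_STEP = 1.0     # naive: move a fixed amount every frame
UNITS_PER_SECOND = 60.0  # correct: the same speed, expressed per second

def distance_after_one_second(fps, use_delta_time):
    dt = 1.0 / fps        # what an engine would hand you as the frame delta
    position = 0.0
    for _ in range(fps):  # one second's worth of frames
        if use_delta_time:
            position += UNITS_PER_SECOND * dt  # frame-rate independent
        else:
            position += PER_FRAME_STEP         # frame-rate DEPENDENT
    return position

print(distance_after_one_second(60, use_delta_time=False))  # 60.0, as tuned
print(distance_after_one_second(30, use_delta_time=False))  # 30.0: half speed!
print(distance_after_one_second(30, use_delta_time=True))   # ~60.0 regardless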
I think they fixed this with Fallout 76, so here’s hoping those changes also made their way into their future projects.
Assuming that it will be ported to PC then I’m sure they’ve resolved those issues.
Is this really a thing lately? Maybe on some Switch games, but I think most modern games can have a dynamic framerate.
Skyrim famously did this. So the concern that Starfield could have similar issues is not unfounded.
On one hand, Skyrim was released in 2011; on the other hand, Skyrim was also released in 2022.
Remember Red Dead Redemption 2? On PC, your stats depleted faster the more FPS you had so with 60FPS you’d get hungry twice as fast as with 30FPS. Iirc even the sun moved faster so a day was only half as long.
In Dark Souls 2 the weapon durability was cut in half at 60 fps compared to 30 fps as well. They did eventually patch it though.
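That class of bug is simple to model: if durability drains a fixed amount per frame of contact instead of per second, doubling the frame rate halves the weapon’s real-time lifetime. A toy illustration with invented numbers (not the game’s actual values):

```python
# Toy model: durability drains a fixed amount per FRAME of contact,
# so the wall-clock lifetime depends on the frame rate. Values are made up.
def seconds_until_broken(durability, drain_per_frame, fps):
    frames_to_break = durability / drain_per_frame  # frames of contact needed
    return frames_to_break / fps                    # convert to wall-clock time

# Same weapon, same per-frame drain constant:
print(seconds_until_broken(600, 1.0, 30))  # 20.0 s of contact at 30 fps
print(seconds_until_broken(600, 1.0, 60))  # 10.0 s at 60 fps: half the life
```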
How does that work in multiplayer
Yeah, that’s a weird choice. Todd Howard said that the game runs upwards of 60fps in some places, but they made the choice to lock it to 30fps on console for full graphical fidelity.
I get that not everyone has a TV that supports VRR, but they should be able to programmatically check what the Xbox currently supports. If it’s a Series X and does support VRR, they should be able to unlock FPS up to 60. I mean, even 40fps on the Steam Deck is surprisingly good, whereas 30 can be really jarring. Or give the choice to the user: 4K@30 or 1440p@60 with VRR.
I don’t understand it. If it’s a problem on console, why not have a full-fidelity “quality” mode but also offer a reduced-fidelity “performance” mode? Presumably there could be options like that similar to the PC build.
I just hope it plays at 60+ on PC; who knows, it could be pretty rough. I haven’t watched the Digital Foundry video on it yet; I assume it’s just the Xbox version.
If there is a ‘mods’ system like Skyrim on Xbox, it should be possible to remove the frame rate cap. People managed it with Xbox before they added FPS Boost to Skyrim, using INI tweaks and a dummy ESP plugin. That’s without VRR, though.
Ya, I’d also like to see a 40fps mode. Really adds to the smoothness. Todd Howard suggested that they still had a decent amount of overhead, just not enough to hit 60 consistently. Would be nice if 40 became a new standard option, at least.
No, lock it at 24 and add a film effect
“Game dev here,” Carlone writes, adding that they are a “big fan” of Dreamcast Guy. “Wanted to clarify: it’s not a sign of an unfinished game. It’s a choice. 60fps on this scale would be a large hit to the visual fidelity. My guess is they want to go for a seamless look and less ‘pop in.’ And of course, [it’s] your right to dislike the choice.”
Sure. Maybe. It could be this. Or…
Armchair babbling idiot who plays too many video games here: I am one hundred percent convinced that it has nothing to do with visual fidelity and everything to do with that asthmatic engine they’ve been dragging since Morrowind. Can’t prove it, but… you know. Just a hunch I get from playing their games.
People constantly complain about the engine they use, but no other game engine is as flexible when it comes to modding, and no other engine has the same level of complexity when it comes to picking stuff up and moving it around. You can take items off a shelf or desk in Skyrim and Fallout and stack them somewhere else. You can, if you want, hoard a bunch of garbage you stole and stack it into a pyramid in your home base area.
Are there quirks? Sure: the physics tied to framerate in Skyrim was a problem, the games are always buggy, and they aren’t usually the prettiest games out there (though Skyrim looked decent when it first came out, and graphical fidelity mods can work magic).
As for the premise, does it have to do with fidelity? Of course it does. Setting a frame cap on consoles means they’re able to use higher-resolution assets, better lighting effects, and more complex models. I understand the preference for giving up fidelity for some smoothness and frames, but 30fps isn’t totally uncommon in console spaces, and this is a Bethesda game, not a twitch shooter or a 2D fighter.
Outside the PC space, gamers hardly ever talk or think about frame rate. Graphical effects, details, and visual fidelity are a higher priority and more important in a game where you generally just walk around and explore.
It would be nice if they had an option for a lower-res or less detailed mode with a 60fps target, but I get why they made the choice they did, and ideally I’m sure it’ll run at a normal framerate on PC.
Now if it runs poorly on PC then we can riot.
It’s also a personal choice of Bethesda not to rename their engine. Many other studios do this same thing and reuse engines, but they often rename them after significant rewrites. Bethesda just doesn’t do that.
Also they aren’t worried about how the game will be released. Their games have legs. So a 60fps version will eventually come out. Then they’ll release it 5 more times.
But they did? For Oblivion it was Gamebryo, for Skyrim it was the Creation Engine
I mean that they haven’t changed it from the Creation Engine, which has been used since Skyrim despite some big rewrites for Fallout, and I’m sure more big rewrites or additions for Starfield.
But it’s only been 2 games since Skyrim, right? And for Starfield it’s being renamed Creation Engine 2. Either way, the statement “Bethesda just doesn’t do that” doesn’t seem accurate when they have done it multiple times.
Huh okay yeah that’s fair. I guess I’m thinking more about the time span since that game engine is now well over a decade old whereas the previous examples are separated by a handful of years. And I didn’t know about them putting a ‘2’ in front of it for Starfield.
I also agree with that. I love the modding aspect of it and I fear it’ll go away with a new engine.
If only players could make that choice themselves, perhaps through some sort of graphics settings menu. No, that’s crazy and unprecedented, it could never work.
No, it’s most definitely a choice. You can make any engine run at 60 FPS if you sacrifice something else for it. The RE engine runs beautiful games at 60 FPS, but they had to make all sorts of sacrifices to fidelity to get World Tour in Street Fighter 6 to run at all, let alone at 60 FPS on current gen consoles.
I mean sure but give us the choice, damn it! :(
I no longer trust triple-A games on PC, and I’d be amazed if the game isn’t ridiculously busted optimisation-wise at release.
That’s exactly why I mainly play on PC nowadays. I didn’t like PC gaming 10-15 years ago, but now I love being able to play at 4K60 / 1440p60 by turning down settings I don’t care about.
Call of Duty still runs on the Quake 3 engine, if we go off of the logic people uncharitably use for Bethesda’s games specifically.
Armchair babbling idiot who plays too many video games here: I am one hundred percent convinced that it has nothing to do with visual fidelity and everything to do with that asthmatic engine they’ve been dragging since Morrowind.
Code doesn’t go bad with time; that’s not really how it works. And game engines tend to be a Ship of Theseus situation, where just because it’s still the same “engine” in theory doesn’t mean that large parts (or all of it) haven’t eventually been replaced or refactored over the years.
Unreal Engine has been around for 30 years at this point, would you also consider that an “asthmatic engine”?
No, what I mean is that this engine has always had a cobbled-together-with-duct-tape feel to it. That’s also the beauty of it.
Some engines get better and some just get more and more spaghetti duct tape.
that asthmatic engine they’ve been dragging since Morrowind
I don’t believe that’s true at all, though. At least by Wikipedia, Morrowind was NetImmerse, Oblivion was Gamebryo (modified Havok), and Skyrim was Creation. And I remember in the announcements for Skyrim that they remade the engine for the game. And Starfield is an updated engine, Creation 2
Gamebryo was called NetImmerse until 2003. Creation is a modified Gamebryo, so Creation 2 will also be based on it. So yes, they’ve used kind of the same engine since Morrowind. Bethesda will not move away from it because Gamebryo is a large reason why the modding community is as strong as it is for Skyrim etc. And the modding community sells a lot of copies!
The engine also started as an engine for MMOs, which allowed them rich scripting for every NPC, as well as an inventory for every NPC.
The world fidelity that Bethesda builds, on a technical and simulation level, is unmatched — yeah, something like The Witcher 3 might look better, but it also doesn’t let you interact with basically every item in the world or pickpocket every NPC’s weapon as a way to neutralize them in combat.
Reminds me of Horizon Zero Dawn running at 30fps but it felt silky smooth because the FPS was rock solid.
30fps is normally alright for 3rd person adventure games, but shooting, especially first person, might feel different. Idk, I still don’t know how to feel about this one. The digital foundry guys seemed to be supportive, but I still just don’t trust Bethesda lately
Personally, frame rate is much more important to me than most other factors. If there’s the option to, I’m going to be cutting a lot to get it to 60, especially since I’m somehow way below specs with a 6600 xt
I decided to play Jedi Survivor at 30 on the PS5 just to sort of get the feeling of 30, and as I began playing I’m like, alright, this is okay, I’m loving the graphics and how awesome everything looks. Played for about 30 minutes like this at 30fps, then decided alright, I’m gonna toggle performance mode on and see how it compares now that I’ve experienced 30, and whew… it was a night and day difference. It felt so silky smooth; despite the fps drops in Survivor, it still felt 100000% better than 30. Just the smoothness and fluidity is insanely good. It was like going from slow motion to real life when I made the switch.
I really hope Starfield can feel good, but man being first person at 30 is gonna be rough I bet. I really hope I’m wrong and it’ll be decent though.
Ya, when switching between the modes, you can really feel the difference. Funnily enough, Jedi Survivor was one of the few games I actually preferred the 30fps mode. Even with VRR, the performance mode was just too inconsistent for me.
I would really like to see a balanced 40fps mode on Starfield. If they can’t consistently maintain a 60fps mode, at least offer some choice for people who want a higher framerate, even if there are other concessions (i.e. inconsistent performance).
It will be no different than playing Fallout 4 at 30fps on Xbox One in the past, so I don’t mind too much. I also play a lot of games on Steam Deck though, so I’m used to lower framerates. Will be playing Starfield on Xbox Series X though, with Game Pass.
yeah… if you haven’t ever played anything at 60fps or more.
FPS itself is not as important as consistent, low frametimes.
If the frametime graph is pretty much flat, stuttering will be low and the overall experience is nice; but if it’s janky, you’ll want to drop the game or decrease quality settings pretty fast.
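This is also why average fps alone can mislead: two captures with the same mean frametime can feel completely different. A quick Python illustration with synthetic, invented frametimes (in milliseconds):

```python
# Two synthetic one-second captures with nearly identical average fps
# but very different frame-to-frame consistency. Numbers are made up.
flat  = [33.3] * 30          # a locked, evenly paced 30 fps
janky = [16.7, 50.0] * 15    # alternating fast and slow frames

def average_fps(frametimes_ms):
    return 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

def worst_frametime(frametimes_ms):
    return max(frametimes_ms)  # the spike you actually feel

print(round(average_fps(flat), 1), round(average_fps(janky), 1))  # both ~30 fps
print(worst_frametime(flat), worst_frametime(janky))  # 33.3 ms vs 50.0 ms spikes
```

A benchmark overlay showing only the first line would call these equivalent; the second line (or a 1% low / percentile metric) is what separates them.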
This incessant nagging about fps is the most tiresome thing in gaming since gamergate.
Yeah how dare consumers expect their products to be good
Good ≠ a single metric.
Sure, but a game is objectively better if it can run at a higher framerate.
Bloodborne is excellent, but it would 100% be better if it ran on solid 60 FPS.
Computers (including consoles) have limited resources, at some point you need to deal with tradeoffs, for example do you prioritize graphics quality or do you prioritize FPS? Do you want/need to have more resources available for the physics engine? That eats on the maximum possible FPS. Do you want to do real time procedural generation? Do you want to use the GPU to run some kind of AI? All this are design considerations and there’s no one size fits all prioritization decision for all videogames. Clearly the people working on Starfield believe that for their intended game experience graphic fidelity is more important than FPS, and this is a perfectly valid design choice even if you don’t agree with it.
It’s a matter of optimization and Bethesda games have all had pretty poor optimization. They could get it running at a higher framerate but there’s no need because people will buy it even if it runs 30fps.
If it was only a matter of optimization we would all still be playing games on the original NES.
Every video game and every TV program for DECADES ran at 30fps. 29.97, actually. Nobody was motion sick or got eye strain.
Just because you’re okay with 30FPS doesn’t make it “fine” or “good” either. Higher FPS is objectively better. Period. That means 30FPS is bad when the other option is 60FPS (or higher, because the console is being DIRECTLY MARKETED to consumers as a 60FPS-120FPS console).
Nobody was motion sick or got eye strain.
Wow, I didn’t realize you could speak on behalf of everyone’s personal reaction to FPS
The difference is that TV and movies have a consistent delay between frames. That is often not the case with video games.
Most games of the NES, Genesis, and SNES era ran at 240p, 60fps (in the NTSC regions).
I agree up to a point. If a game is at 30 and feels good to play, then I’m OK. For example, Zelda feels great. Controlling Link is tight and snappy.
On the other hand, if the game has bad frame pacing (like Bloodborne), playing at 30 feels real bad.
I try not to get too crazy about frames, but sometimes some games just don’t feel good.
I will say, though, that while I really like channels like Digital Foundry, I sometimes wonder if them picking apart games to show the most minor frame dips is slowly teaching us to see these things, and as a result we kind of subconsciously will be like, “Well now I noticed this game had some moments where the frames dropped during an explosion. Obviously it’s a bad game.” I know that’s some hyperbole, but still.
It’s also heavily dependent on what you’re used to. There are games with a quality and a performance mode where I sometimes start to think I’m at 60FPS, until I switch to the actual 60FPS mode and realize it’s a completely different feeling. Switching back makes those 30FPS seem pretty bad. But if I didn’t have the option of switching between the two, I would’ve been happy with the 30.
But as you said, it has to be rock stable. I played GoW Ragnarök on my PS5 and that quality 30FPS mode was just terrible and felt like 20FPS. The Final Fantasy 16 demo’s is better, but there it’s the overdone motion blur that bugs me enough to want to switch to the 45-60FPS mode where the blur is weaker.
“…causing worry about the space sim on PC, but a God of War…”
wat?
My guess is that gunplay in any version of “creation engine” is going to be janky as …
30 FPS for this kind of game shouldn’t matter too much in the moment, but I agree that it’s pretty disappointing. I’m extremely skeptical about the whole release anyway, though.
No Man’s Sky runs at a very stable 60fps, I personally know people who have wrangled it up to 120fps. I know they don’t have the same underlying tech, but they’re very similar in terms of gameplay (from what we’ve seen)
They’re a wildly different level of detail, though. The NMS physics engine is pretty simplistic, mostly affecting NPCs and a very few physics objects. Starfield is like other Bethesda games: tons of little items and junk that all have their own physics and interactions.