• 0 Posts
  • 29 Comments
Joined 1 year ago
Cake day: July 3rd, 2023

  • jaycifer@kbin.social to Gaming@lemmy.world · "Help" · 2 points · 8 months ago

    I think it would be tough to nail down one thing. There are clear comparisons to Victoria 2, which I haven’t played, but my understanding is that 2 is more “detailed” in its simulation of some things, and there will always be people who don’t like changes from the last game. The military aspect is a lot less engaging than something like Hearts of Iron, but I think the intent there was to keep the focus on the economic and political sides of things. Warfare received a minor overhaul around the time I first tried the game, which I’ve heard improved things, but it can still be a little frustrating at times.

    Most of the complaints about the economic side that’s meant to take center stage are that your economy’s success boils down to how many construction points you can have going at once. That’s true, but I do like that you can’t pour everything into construction without balancing the foundation needed to support that increase, and focusing only on construction can limit growth in other areas, like improving your citizens’ lives, which can complicate your political affairs.

    I feel like I’ve gotten a little lost in the weeds here. Overall, I think it has mixed reviews because Victoria 3 is still a work in progress. It’s a work in progress that I enjoy very much, but there is still room for improvement. I fell off Stellaris between the Nemesis and Overlord expansions because it felt bloated and repetitive, and I wasn’t wondering what kind of civilization I could play anymore. Victoria 3 has been successful at making me contemplate how I can manipulate its mechanics to achieve a specific outcome, even when I’m not playing.


  • jaycifer@kbin.social to Gaming@lemmy.world · "Help" · 2 points · 8 months ago

    With menu games like the ones Paradox makes, you gotta learn by playing the game. And by playing the game, I of course mean pausing the game every minute or two to spend way more minutes reading the tooltips, the tooltips within those tooltips, and then finding your way to a new menu you didn’t know existed, referenced by those tooltips, so you can read more tooltips!

    It’s a beautiful cycle, and Victoria 3 has sucked me in as much as Stellaris did 7 years ago. If you have any questions or thoughts, I’d love to hear them!


  • What they didn’t mention is that Baldur’s Gate is a Dungeons and Dragons franchise. DnD is orders of magnitude more popular than it was when BG2 released, to the point of being nearly mainstream. What has sold people on BG3 is being able to play their tabletop game in video game form.

    I do think Larian’s pedigree and the Baldur’s Gate name contributed to its success, but if there was one driving factor, it’s the brand recognition of DnD combined with the marketing of an AA-to-AAA game.


  • Saying you were 13 or 14 when horse armor came out doesn’t help your case against their comment. It just means you were at prime gaming age when DLC, map packs, and smaller content drops were replacing larger expansions. The acceptance of those (which, based on your demographic, you probably did accept) made it easier to transition to more and more egregious microtransactions.

    There used to be (and maybe still are) complete games released on mobile. They usually cost $6.99 and didn’t need more. If they want Elden Ring on mobile without tarnishing its reputation, they could sell a complete experience for $10 or $15, since it’s been a decade since those $6.99 prices. That’s what Elden Ring itself was, and it was widely praised; that’s what the rest of From Soft’s games have done, and it has turned out well for them.

    There may be servers for the multiplayer, but given that none of the other From Soft games charged for it, the cost must be minimal.


  • I remember playing Assassin’s Creed II on PC with a 9500 GT and getting sub-20 fps constantly, to the point that I had to wait for character animations to catch up with the dialogue so the next person could talk. Halfway through the game I upgraded to a GTX 560 and was astounded that everything was in sync and oh so smooth. I always remember that when I start getting annoyed that I can’t get over 90 fps in a game. As long as it’s playable!


  • For me it depends on the game. A menu game from Paradox like Crusader Kings? 4K at 60 fps. A competitive shooter? Ideally max resolution (for pinpoint accuracy) and 144 fps, but between the two I’d take maximum fps for the reaction speed and responsiveness. A pretty game like Ori and the Will of the Wisps? Crank the graphics up and I’m happy with 60 fps.


  • That article states people can perceive images as rapidly as once every 13 milliseconds, which they math out to 75 fps, 25% higher than 60.
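
    For reference, here’s the raw conversion behind that figure (my own arithmetic, not the article’s): one image per 13 ms works out to

    $$\text{fps} = \frac{1000\ \text{ms/s}}{13\ \text{ms/image}} \approx 77,$$

    which the article evidently rounds down to the 75 fps it cites (75 being exactly 25% above 60).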

    Looking at the study itself, they were testing whether participants could pick out a picture displayed for 13 to 80 ms when “masked” by other brief pictures, with a focus on whether it made a difference if the participant was told what image they were looking for before or after seeing the images. What they found was that participants could pick out the image even at the 13 ms mark (albeit with less accuracy) and could generally do so better if told what to look for beforehand.

    What this tells me is that your source has nothing to say about anything over 75 fps. It was also testing in a fundamentally different environment from a video game, where your brain constantly expects an image similar to and stemming from the one before it rather than a completely different image. If you were to draw conclusions from the study despite those differences, it would suggest that knowing what to look for, as your brain does while gaming, makes you better able to pick out individual frames. This leads me to think that your source does not support your assertion, and that in a game you could perceive frame rates higher than 75 fps at a minimum.

    From my own knowledge, there’s also a fundamental difference between perceiving reality and computer screens in the form of motion blur. Objects moving in real life leave a faint blur behind in your perception that your brain can use to fill in any gaps it may have missed, making reality appear smoother than it is. For an example of this, wobble a pencil back and forth to make it “bend.” Movies filmed at 24 fps capture this minute motion blur as they film, which makes it easier for our brains to watch them despite the lower frame rate. Real-time rendered video games do not have this effect, as there are no afterimages to fill in the blanks (unless you turn on motion blur, which doesn’t do a good job of emulating it).

    This means video games need to compensate, and the best way to do that is more frames per second, so your brain doesn’t need to fill in the blanks with the motion blur it’s used to seeing in the real world. You’ll obviously get diminishing returns from each successive increase, but there will still be returns.