If FSR and DLSS are any indication, new technology has the capability to improve performance on lower-end hardware. The problem we're seeing now is that games are using DLSS and FSR to "optimize" instead of optimizing the game first and then adding those technologies afterwards. I'm not saying every dev is doing this, but there are clear standouts where FSR and DLSS are being used to pick up the slack for optimization, Starfield being the latest culprit.
This is really good information, but I just can't get behind it, I'm sorry. Things started to go downhill once DLSS and FSR really took off, and it shows in damn near every major PC port. Remnant 2 is another culprit: the game ran like trash even when I was running it on an RTX 4080 through GeForce Now. I couldn't get it above 45 fps in Ward 13, even at 1440p on high settings without DLSS. This is becoming more commonplace with every release; Baldur's Gate 3 seems to be the only exception.