Yeah, although I am always reluctant to quantify visual quality like that. What is "65% better" in terms of a game playing smoothly or looking good?
The PS5 Pro reveal was a disaster, partially because if you're trying to demonstrate how much nicer a higher resolution, higher framerate experience is, a heavily compressed, low bitrate YouTube video that most people are going to watch at 1080p or lower is not going to do it. I have no doubt that you can tell how much smoother or less aliased an image is on the Pro. But that doesn't mean the returns scale linearly, you're right about that. I can tell a 4K picture from a 1080p one, but I can REALLY tell a 480p image from a 1080p one. And it's one thing to add soft shadows to a picture and another to add textures to a flat polygon.
If anything, gaming as a hobby has been a tech thing for so long that we're not ready for it to shift to being limited by money and artistic quality rather than processing power. Arguably this entire conversation is pointless in that the best looking game of 2024 is Thank Goodness You're Here, and it's not even close.
Yep. The thing is, even if you're on high end hardware doing offline CGI you're using these techniques for denoising. If you're doing academic research you're probably upscaling with machine learning.
People get stuck on the "AI" nonsense, but ultimately you need upscaling and denoising of some sort to render a certain tier of visuals. You want the highest quality version of that you can fit in your budgeted frame time. If that is using machine learning, great. If it isn't, great as well. It's all tensor math anyways, it's about using your GPU compute in the most efficient way you can.
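To put it concretely, here's a toy sketch (my own illustration, not any vendor's actual pipeline) of why "AI" and "non-AI" denoising land on the same hardware: a classic fixed-weight blur and a learned denoiser are both convolutions, the same multiply-accumulate tensor op with different weights.

```python
import numpy as np

def conv3x3(img, kernel):
    """Convolve a 2D image with a 3x3 kernel (edge-padded borders)."""
    padded = np.pad(img, 1, mode="edge")
    out = np.zeros_like(img)
    for dy in range(3):
        for dx in range(3):
            out += kernel[dy, dx] * padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out

noisy = np.random.rand(8, 8).astype(np.float32)
box_blur = np.full((3, 3), 1.0 / 9.0)  # classic denoise: hand-picked weights
denoised = conv3x3(noisy, box_blur)
# An ML denoiser runs the same multiply-accumulate, just with trained weights
# stacked into many layers, which is why it maps so neatly onto GPU tensor units.
```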
I mean... OK, but AMD just revealed a new set of AI-powered upscaling libraries along with Sony for the PS5 Pro and is on record saying they're backing out of high end gaming hardware to pivot to data center hardware, so... I hope you have more reasons than this, because I don't think they disagree.
Oh, now you're wrong. AI upscaling is demonstrably more accurate than plain old TAA, which is what we used to use in the previous generation. I am NOT offloading compelling NPC dialogue to a crappy chatbot. Every demo I've seen for that application has been absolutely terrible.
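For reference, this is the core of what TAA does, stripped way down (my simplification; real implementations also reproject the history buffer with motion vectors and clamp it to fight ghosting): blend each new jittered frame into an accumulated history, converging toward a supersampled result over time.

```python
import numpy as np

ALPHA = 0.1  # weight of the current frame; lower = smoother but more ghosting

def taa_resolve(history, current, alpha=ALPHA):
    """Exponential moving average of frames: the heart of temporal AA."""
    return (1.0 - alpha) * history + alpha * current

history = np.zeros((4, 4), dtype=np.float32)           # accumulated buffer
for _ in range(16):                                    # 16 jittered frames
    current = np.random.rand(4, 4).astype(np.float32)  # stand-in for a render
    history = taa_resolve(history, current)
```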
What do you mean "suddenly"? I was running path tracers back in 1994. It's just that they took minutes to hours to generate a 480p image.
The argument is that we've gotten to the point where new rendering features rely on a lot more path tracing and light simulation that used to not be feasible in real time. Pair that with the fact that displays have gone from 1080p60 vsync to 4K at arbitrarily high framerates and... yeah, I don't think you realize how much additional processing power we're requesting.
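Quick back-of-the-envelope math on just the resolution and refresh jump, ignoring that path tracing also makes each individual pixel more expensive:

```python
# Raw pixel throughput only; per-pixel shading cost has gone up on top of this.
old = 1920 * 1080 * 60   # 1080p60 ~= 124 million pixels/sec
new = 3840 * 2160 * 120  # 4K120   ~= 995 million pixels/sec
print(new / old)         # -> 8.0x the pixels per second
```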
But the good news is if you were happy with 1080p60 you can absolutely render modern games like that on a modern GPU without needing any upscaling.
That's fine, but definitely not a widespread stance. Like somebody pointed out above, most players are willing to lose some visual clarity for the sake of performance.
Look, I don't like the look of post-process AA at all. FXAA just seemed like a blur filter to me. But there was a whole generation of games out there where it was that or somehow finding enough performance to supersample a game and then endure the spotty compatibility of having to mess with custom unsupported resolutions and whatnot. It could definitely be done, particularly in older games, but for a mass market use case people would turn on SMAA or FXAA and be happy they didn't have to deal with endless jaggies on their mid-tier hardware.
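For anyone who never dealt with it, here's roughly what supersampling boils down to (a bare-bones sketch of my own, assuming a plain 2x2 box filter): you shade every pixel of an image at twice the resolution in each axis, then average it back down. Four times the shading work for the smoothness you get.

```python
import numpy as np

def downsample_2x(hi_res):
    """Box-filter a (2H, 2W) image down to (H, W) by averaging 2x2 blocks."""
    h, w = hi_res.shape[0] // 2, hi_res.shape[1] // 2
    return hi_res.reshape(h, 2, w, 2).mean(axis=(1, 3))

hi_res = np.random.rand(1080 * 2, 1920 * 2)  # shading 4x the pixels of 1080p
final = downsample_2x(hi_res)                # back to 1920x1080, but smooth
```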
This is the same thing, it's a remarkably small visual hit for a lot more performance, and particularly on higher resolution displays a lot of people are going to find it makes a lot of sense. Getting hung up on analyzing just "raw" performance as opposed to weighing the final results independently of the method used to get there makes no sense. Well, it makes no sense industry-wide; if you happen to prefer other ways to claw back that performance you're more than welcome to deal with bilinear upscaling, lower in-game settings or whatever you think your sweet spot is, at least on PC.
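And to be clear about what "bilinear upscaling" means in that comparison, here's a minimal sketch (my own, not any shipping implementation): no temporal data, no learned weights, just a weighted average of the four nearest low-res pixels.

```python
import numpy as np

def bilinear_upscale(img, scale):
    """Upscale a 2D image by `scale` using bilinear interpolation."""
    h, w = img.shape
    new_h, new_w = int(h * scale), int(w * scale)
    ys = np.linspace(0, h - 1, new_h)   # sample positions in source space
    xs = np.linspace(0, w - 1, new_w)
    y0 = np.floor(ys).astype(int); y1 = np.minimum(y0 + 1, h - 1)
    x0 = np.floor(xs).astype(int); x1 = np.minimum(x0 + 1, w - 1)
    wy = (ys - y0)[:, None]; wx = (xs - x0)[None, :]
    top = img[np.ix_(y0, x0)] * (1 - wx) + img[np.ix_(y0, x1)] * wx
    bot = img[np.ix_(y1, x0)] * (1 - wx) + img[np.ix_(y1, x1)] * wx
    return top * (1 - wy) + bot * wy

low = np.random.rand(720, 1280)  # internal render at 720p
up = bilinear_upscale(low, 1.5)  # naive stretch to 1080p, softness and all
```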
I mean, both times it was a right-wing Trump voter who was disappointed in him. I think we've identified that the real threat here is undecided voters.
If the visuals are performant and consistent, why do we care? I have always been baffled by the obsession with "real pixels" in some benchmarks and user commentary.
I don't see how that's the case. Most people prefer more fps over image quality, so minor artifacting from DLSS is preferable to the game running much slower with cleaner image quality. That is consistent with the PS data (which wasn't a poll, to my understanding).
I also dispute the other assumption, that "we suck at optimizing performance". The difference between now and the days of the 1080 Ti, when you could just max out games and call it a day, is that we're targeting 4K at 120fps and up, as opposed to every game maxing out at 1080p60. There is no target for performance on PC anymore, every game can be cranked higher. We are still using Counter-Strike for performance benchmarks, running at 400-1000fps. There will never be a set performance target again.
If anything, optimization now is sublime. It's insane that you can run most AAA games on both a Steam Deck and a 4090 out of the same set of drivers and executables. That is unheard of. Back in the day the types of games you could run on both a laptop and a gaming PC looked like WoW instead of Crysis. We've gotten so much better at scalability.
You can't just show pictures of me in the bathroom without permission. You'll be hearing from my lawyers.