Anonymous
9/15/2025, 5:13:17 PM
No.720732882
GPU Optimization / Not a Nintendo Thread
Honest question: how can a device that’s, at best, equivalent to a 3060 Ti run games at 1440p with DLSS on just 10 watts, while Nvidia’s latest 90-series card demands nearly 500W?
We’re talking about a 4900% increase in electricity use. Why are high-end cards consuming more power over time instead of less? Is this driven by developers, or by Nvidia itself? Could a 5090 eventually be optimized to run a game at 1080p on just 10W? What’s really going on with these cards’ power demands, and why isn’t anyone asking these questions?
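Quick napkin math on that 4900% figure, a sketch in Python using only the numbers quoted above (10 W for the handheld case, "nearly 500W" for the flagship):

handheld_w = 10      # ~10 W handheld draw quoted above
desktop_w = 500      # "nearly 500W" for the 90-series flagship
ratio = desktop_w / handheld_w                     # 50x the power draw
pct = (desktop_w - handheld_w) / handheld_w * 100  # percent increase
print(ratio, pct)    # 50.0 4900.0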
It’s insane: we’ve gone from 350W on a 3090 to 480–575W on a 5090. What the hell is going on? This increase in wattage is seen across the whole lineup except the 60-series, too.
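Same formula for the gen-over-gen jump, using the 350 W and 480–575 W board-power figures quoted above (both ends of the 5090 range):

tdp_3090 = 350                 # 3090 figure from the post
for tdp_5090 in (480, 575):    # low/high end of the quoted 5090 range
    pct = (tdp_5090 - tdp_3090) / tdp_3090 * 100
    print(f"+{pct:.0f}%")      # +37% and +64%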