>>105872256
>sending the dGPU's frames twice through PCIe so that it can come back out the iGPU to hit your monitor is just like.. a secret trick to increase lag
I play a lot of rhythm games. I tried Project Diva at 1440p and 4K, and in both cases, with the HDMI hooked to either the iGPU or the dGPU, I could still FC or nearly FC multiple Hard and Extreme songs. I didn't notice any relevant change in input lag. But that game already runs perfectly fine regardless; I tried this setup for games like Cyberpunk and Monster Hunter Rise where VRAM matters.
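Back-of-the-envelope math backs this up: the extra hop, copying a finished frame from the dGPU to the iGPU over PCIe, costs a fraction of one frame time. The link rate and the uncompressed 32-bit framebuffer below are assumptions (~15.75 GB/s usable on a PCIe 3.0 x16 link), just a sketch, not a measurement:

```python
# Rough estimate of the added latency from copying a rendered frame
# back over PCIe for iGPU display output. The link rate is an
# assumption: roughly 15.75 GB/s usable on a PCIe 3.0 x16 link.
PCIE3_X16_BPS = 15.75e9  # bytes per second, approximate

def copy_time_ms(width, height, bytes_per_pixel=4):
    """Time to move one uncompressed frame across the link, in milliseconds."""
    return width * height * bytes_per_pixel / PCIE3_X16_BPS * 1000

print(f"1440p: {copy_time_ms(2560, 1440):.2f} ms")  # ~0.94 ms
print(f"4K:    {copy_time_ms(3840, 2160):.2f} ms")  # ~2.11 ms, vs 16.7 ms per frame at 60fps
```

Even at 4K the copy is around an eighth of a 60fps frame time, which is why a rhythm game stays FC-able either way.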
>good job sacrificing bandwidth, latency, and performance just to "free up 300MB"
If that setup really fucked with bandwidth that much, I'd see noticeably more texture pop-in, lower FPS, and frame time spikes, which is simply not the case. I also said Windows can take upwards of 1-1.2GB of your VRAM, not just 300MB. Hell, some lower mid-range GPUs releasing TODAY still perform fine on PCIe 3.
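The sustained bandwidth cost is also easy to bound. A minimal sketch, assuming ~15.75 GB/s of usable bandwidth on a PCIe 3.0 x16 link and uncompressed 32-bit frames:

```python
# Fraction of a PCIe 3.0 x16 link eaten by continuously streaming
# uncompressed frames to the iGPU for display. The link bandwidth
# is an assumption (~15.75 GB/s usable).
LINK_BPS = 15.75e9

def stream_share(width, height, fps, bytes_per_pixel=4):
    """Fraction of link bandwidth used by a continuous frame stream."""
    return width * height * bytes_per_pixel * fps / LINK_BPS

print(f"4K60:    {stream_share(3840, 2160, 60):.1%}")  # ~12.6% of the link
print(f"1440p60: {stream_share(2560, 1440, 60):.1%}")  # ~5.6%
```

An eighth of the link at 4K60 is real but nowhere near enough to cause constant pop-in or frame time spikes on its own.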
>also, fucking 16gb ram with a 5700G and RTX 3060? your iGPU is using the same pool of ram as the rest of your system. actual retardation. only a jeet falls this hard for the unused ram meme. i cant imagine how much youre hitting the swap other than like... 100% of the time the moment you load up any game??
I don't think this is what you mean unless you're THAT stupid, but you're almost implying that 16GB of RAM isn't enough because there would be a huge spike in RAM usage if the iGPU is the one handling video output, which is again false. Picrel: I have Cyberpunk open, Firefox with 14 tabs, Discord, and other background processes, and that's the pagefile usage (Windows' swap).
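Rough arithmetic on why iGPU display output doesn't blow up system RAM: the scanout buffers it needs are tens of megabytes, not gigabytes. Triple buffering and 32-bit pixels are assumptions here:

```python
# Rough RAM cost of the iGPU scanning out the display: a handful of
# uncompressed framebuffers in shared system memory. Buffer count
# (triple buffering) and 32-bit pixels are assumptions.
def scanout_ram_mb(width, height, buffers=3, bytes_per_pixel=4):
    """System RAM consumed by the iGPU's display buffers, in MiB."""
    return width * height * bytes_per_pixel * buffers / 2**20

print(f"4K, triple-buffered: {scanout_ram_mb(3840, 2160):.0f} MiB")  # ~95 MiB
```

Under 100 MiB even at 4K, which is consistent with the pagefile staying quiet in the screenshot.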
The problem with /g/ is that they don't know the difference between a system that caches aggressively (macOS, Windows) and an individual program that is genuinely bloated and/or has a memory leak (in which case, no, more RAM won't fix the issue).
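The difference can be sketched in a few lines: a cache is bounded and evicts old entries, a leak just grows until something dies. All the names here are illustrative, not from any real program:

```python
from functools import lru_cache

# A cache: memory use is capped at maxsize, old entries get evicted.
@lru_cache(maxsize=128)
def render_tile(n):
    return n * n

# A leak: references pile up forever, nothing ever frees them.
leaked = []
def handle_request(n):
    leaked.append(bytearray(1024))  # forgotten on every call
    return render_tile(n)

for i in range(1000):
    handle_request(i % 10)

print(render_tile.cache_info().currsize)  # 10 -- cache stays bounded
print(len(leaked))                        # 1000 -- a leak; more RAM only delays the crash
```

A system-level cache gives memory back under pressure; the leak keeps it no matter how much RAM you add, which is the whole point.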