>>713352371
I'll just copy-paste the old post I wrote:
His Jedi Survivor video is a textbook example.
He focuses on the depth pre-pass, which takes up 0.63ms, only about 3% of the frame. Halving it by tuning down some sampling would (if you were previously running at 60fps) only bump the framerate to about 61fps. Jedi Survivor has way bigger performance problems than squeezing 3% more out of the GPU, a lot of them coming from overzealous and stupid use of Blueprints even for things as simple as character pupil contraction.
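The math, assuming the same 60fps (16.67ms) baseline he uses:
16.67ms - (0.63ms / 2) = ~16.35ms per frame
1000 / 16.35 = ~61fps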
Yet he pretends passes and drawcalls are somehow the game's big performance problem and rants about quad overdraw even though it's completely unrelated to the game's issues. He claims the 1700 drawcalls are killing performance, but without actually measuring their CPU cost the claim is empty; they take very little time to execute on the GPU, and hell, you can SEE in the graphs he shows in his own video that those 1700 drawcalls barely register, so he's showing no proof of any real cost. He's just relying on his angry-confident tone to sell the story.
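If he wanted an actual number, UE4's built-in stat commands would give it to him in seconds (assuming a build with the console exposed, which he clearly has since he's fucking around in it anyway):
stat unit (Game / Draw / GPU thread times; drawcall CPU cost shows up under Draw, the render thread)
stat SceneRendering (render-thread timings and drawcall counts per pass)
stat RHI (drawcalls and primitives actually submitted to the RHI)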
Jedi's actual performance issues on the consumer end come from UE4's poor asset streaming tools and its unfinished PSO caching, plus the game's awful use of Unreal's anti-merit Blueprints system. Almost none of that shows up as bad GPU-side optimization; it's needlessly heavy CPU load and PSO compilation stutter. But hey, that's a bit harder to dig into than just fucking around with the console or the UE editor or shitty engine.ini tweaks, so how would he know?
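A quick sanity check anyone can do (a sketch for telling the two apart, not a fix): during a traversal hitch, see whether GPU time stays flat while the game/render threads spike.
stat unit during a hitch (GPU flat + Game/Draw spiking = CPU-side streaming/PSO cost, not shading)
r.ShaderPipelineCache.Enabled (UE4's PSO precompile cache cvar; toggling it only matters if the game actually ships a recorded cache)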
His videos are like this all the time. His starting points are 100% correct (shit optimization due to bad development priorities and a lack of dev talent), but they fall apart when he tries to get into the nitty-gritty. His supposed "fixed TAA" is also silly, because he essentially just recreated FXAA: a single-frame post-process blur with none of the temporal accumulation that TAA exists for in the first place.