
Thread 714869996

52 posts 12 images /v/
Anonymous No.714869996 >>714870531 >>714871227 >>714871262 >>714877493 >>714878823
Have you been exaggerating your stutters lately?
Anonymous No.714870107
i will keep upgrading my cpu until windows 11 no longer stutters in vmware
Anonymous No.714870531 >>714870876 >>714871227
>>714869996 (OP)
i read the article and i think it's another one of those dudes that turns on all the smear technology (tm) and then runs games at 60 fps. at that point it's no wonder you're not hitting CPU bottlenecks. but if you're going for 144 fps then you will suffer microstutters in modern games if you don't have a high end CPU. also i don't think the author really appreciates that if you're CPU bottlenecked even 1% of the time, your FPS will fluctuate and hurt the feel of the game a lot
Anonymous No.714870876 >>714871417 >>714872190
>>714870531
There's no reason not to get a high-end CPU. They aren't as outrageously expensive as GPUs these days, they stay relevant for much longer, and they benefit you in day-to-day tasks. Even uncompressing your dolphin porn will be faster with a good CPU.
Anonymous No.714870895 >>714872257
>I don't play CPU-demanding games so that means they don't exist
behead game/tech "journalists"
Anonymous No.714871187 >>714871569
Stuttering isn't caused by a bottleneck, THOUGH.
Anonymous No.714871227
>>714869996 (OP)
>>714870531
It's an XDA article. It's AI-generated slop. The guy claims that stutter is due to your SSD and not turning on XMP, which is AI retard basics.
Anonymous No.714871262 >>714871349
>>714869996 (OP)
Sometimes I get like a sub-1-second stutter once after like 5 minutes of playing, then it never happens again. I don't get why
Anonymous No.714871349 >>714871540
>>714871262
windows and other programs like to execute background tasks without telling you. search indexing, defragging, trimming, autoupdating, syncing the clock, etc.
Anonymous No.714871417
>>714870876
yeah

i don't think you should ever really take money out of your CPU fund in order to pay for a GPU. buying a 2.5k GPU and a 200 dollar CPU is insanity when you could buy a 2k GPU and a 500 dollar CPU and have a considerably smoother experience in a huge chunk of badly programmed games (almost all of them nowadays)

however i think the main reason people don't do that is because
>people just stare at the nominal framerate, which hides inconsistent frametimes
>people can't identify the game is actually running badly, and instead think there's something intrinsically wrong with how the game plays (floaty aim, input lag, etc.)
>people identify the game is running badly, but they check their CPU usage and see it at 20% and conclude it's because of a GPU bottleneck, when in reality it's the main CPU thread being choked out by some retarded script
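not from the post above, just a rough python sketch of the first point: the numbers are invented, and in practice you'd feed it frametimes logged by a capture tool like PresentMon or CapFrameX instead of the hardcoded list.

```python
# toy example: average fps looks fine while the 1% lows expose the stutter.
# frametimes_ms is made up - 990 smooth ~4 ms frames plus 10 hitches at 40 ms,
# i.e. CPU-bottlenecked roughly 1% of the time.
frametimes_ms = [4.0] * 990 + [40.0] * 10

avg_fps = 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

# 1% low: fps computed over only the slowest 1% of frames
worst = sorted(frametimes_ms, reverse=True)[: max(1, len(frametimes_ms) // 100)]
low_1pct_fps = 1000.0 * len(worst) / sum(worst)

print(f"average fps: {avg_fps:.0f}")  # ~229 - this is all the fps counter shows
print(f"1% low fps:  {low_1pct_fps:.0f}")  # 25 - this is what you actually feel
```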
Anonymous No.714871540 >>714872347
>>714871349
I should say it happens every time I play a game. I guess it's just less noticeable on the desktop, if that's the case?
Anonymous No.714871569
>>714871187
stuttering is usually caused by the worst bottleneck of them all: disk access
Anonymous No.714871638 >>714872647
I upgraded to a 9800x3d solely because i got tired of fps dropping from 130-140 to 60ish on Illuminate city maps in Helldivers 2.
Anonymous No.714872090
fun fact: most games still do most of their work on a single core, and that's down to the developers and engines.
a rare few use four cores to their full potential and NONE use all 8 cores.
Anonymous No.714872190 >>714873441
>>714870876
eh, idk about this. i think the cpu that you get really should depend on your budget and your needs. the difference between a 9600 and a 9800x3d is $250. if you've never really cared about modern AAA stuff and really only game and browse the web, you're not gonna see $250 worth of difference and that money would be better put towards a better GPU. if you wanna step into the high end and look at stuff like the 7950x3d/9950x3d, there's about a $400 difference between those and the 9800x3d. unless you're really into productivity or regularly do things that actually need those cores there is zero reason for most people to get those chips.

on top of that, if you don't have a really good gpu you're not gonna get the full potential out of your high end cpu. i have a top of the line cpu but paired it with a mid range gpu (long story). it kicks ass in all the productivity shit i do, but compared to friends' systems or backup systems that have a much weaker cpu but a comparable gpu, the performance difference isn't really that staggering - certainly not enough to justify the msrp difference.
Anonymous No.714872257 >>714878932
>>714870895
Most CPU-demanding games are just badly optimized games honestly. You get like 30% CPU usage and the fps just refuses to go up.
Anonymous No.714872347
>>714871540
games are all borderless window now. when a background app takes focus away from the game it can appear as a stutter since that's freesync/gsync turning on/off.
Anonymous No.714872647 >>714872998 >>714873202 >>714876187
>>714871638
9800x3d makes a big difference in some games
Anonymous No.714872998
>>714872647
>Wow, a screenshot at a different point in the fight with different variables
Anonymous No.714873202 >>714876347
>>714872647
>turn off addons
>gain 100+ fps
that was hard
Anonymous No.714873441 >>714873743 >>714875168 >>714876234
>>714872190
all i have to really say here is that enough games regularly choke the main thread that not buying an overkill CPU can really screw you over. in many games you may have a 5600x and never go above a 5ms frametime, but in some that's not the case. a lot of those issues aren't even reported or properly identified.

for example marvel rivals has some unreal blueprint script that runs at a fixed 60 tick rate, because no matter how high my framerate was (120, 140, 160, whatever) it would always look like i actually had 60 to 90 fps. basically when your frametimes look like this:

>3ms
>3ms (monitor refresh)
>3ms
>3ms (monitor refresh)
>16ms (monitor refresh, monitor refresh, monitor refresh)

you end up with a bunch of "wasted" frames that are never displayed to you but still get counted towards the framerate. and the final 16ms frame gives you a microstutter due to 3 refreshes showing the same frame.

helldivers 2 is another game where it seems impossible to get above 110ish FPS because even a 9800x3d just can't push it above that due to all the AI scripting and stuff. and again, if you play helldivers 2 with a weaker CPU but a strong GPU, the game will show that you have 120 fps but you are actually getting more like 70-90 because of all the frames being discarded between monitor refreshes.
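not part of the post, but here's a toy simulation of that frametime pattern in python. all numbers are invented, and it assumes the monitor simply shows the newest finished frame at every refresh, ignoring buffering/vsync details.

```python
# hypothetical frametime pattern from the post: bursts of fast frames, then a 16 ms hitch
pattern_ms = [3.0, 3.0, 3.0, 3.0, 16.0]
refresh_hz = 144

# timestamps (ms) at which each rendered frame finishes, over ~1 second
frame_done = []
t = 0.0
while t < 1000.0:
    for ft in pattern_ms:
        t += ft
        frame_done.append(t)

# at each monitor refresh, assume the newest completed frame gets displayed
refresh_times = [i * 1000.0 / refresh_hz for i in range(refresh_hz)]
shown = set()
for r in refresh_times:
    latest = max((f for f in frame_done if f <= r), default=None)
    if latest is not None:
        shown.add(latest)

rendered = len([f for f in frame_done if f <= 1000.0])
print("frames rendered in 1 s (what the fps counter reports):", rendered)
print("distinct frames that actually hit the screen:", len(shown))
```

the gap between the two numbers is the "wasted" frames, and the repeats during the 16 ms frame are the microstutter.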
Anonymous No.714873743 >>714873817
>>714873441
You mean you can't see higher fps with your eyes. Ok.
Anonymous No.714873817 >>714874206
>>714873743
ok youre so out of your element here that i wont even bother
Anonymous No.714874206 >>714875101
>>714873817
You can just tell from the mouse movement in menus that the fps does in fact increase... Or spin your camera around. Just because the AI doesn't operate in 1000hz tickrate doesn't mean you don't get more visual fluidity from higher refresh rate.
Anonymous No.714875101 >>714875781
>>714874206
>Just because the AI doesn't operate in 1000hz tickrate doesn't mean you don't get more visual fluidity from higher refresh rate.
we're not talking about the same thing.

i'm explaining that if you render 119 frames in the first 0.5s, but then a very laborious AI script holds up a frame for another 0.5s, then you will "have 120 fps" but you will have only seen 60 different frames on your monitor, right? if you miniaturize that issue to a much smaller scale, you get chronic microstutter, which doesn't show up in your FPS counter but is visible if you look at the game itself (or install software that counts actual different monitor refreshes)
Anonymous No.714875168 >>714877035
>>714873441
and all i'm saying is it really isn't as big of a deal as you think it is. i've personally run a 1700 vs 2700 vs 3600 vs 5700 vs 5800x3d vs 7700x vs 7800x3d vs 7950x3d vs 9700x vs 9900x vs 9950x on the amd side. Yes, obviously the higher end chips perform better and obviously you start to see massive jumps in performance when you move up a few generations. That being said, sometimes those jumps aren't nearly as big as you think and it's the gpu that makes a significantly bigger difference. Going from the 5700x to the 7700x in gaming performance alone is not worth the cost of the platform change. The real world difference you'll see shelling out the extra cash for a 9800x3d vs a 7700x isn't worth it, and the extra $250 or so would give you much more by bumping you up a gpu class.

If you have the money then sure, buying a better cpu will never hurt and will potentially stave off future upgrades - but it's usually wasted money for a lot of people who don't actually need the extra horsepower. For fucks sake I was able to play starfield at a very playable 30-45fps (depending on the zone) on an i5-2500k + 3070. In almost every instance of gaming, unless you're really into grand strategy or shit like stellaris, the gpu is going to be doing way more for you.

Pair all that with the fact that the average gamer plays whatever is on sale on steam, browses the internet, and streams video - and does essentially nothing else with their PC - and going balls to the wall on a cpu is just not needed.
Anonymous No.714875781 >>714876436
>>714875101
I think that you could have explained that easier. I just recorded a 120 fps clip in Helldivers 2 and yes, just about every 3rd frame is a duplicate. It's still smoother than lower fps, but it's like there's diminishing returns because you don't gain the full fps, which is... weird.
Anonymous No.714876187
>>714872647
it makes a difference in every game that is programmed by monkeys, which is 98% of the games out there. It's always been like that.
Anonymous No.714876234
>>714873441
I had this in Hell Let Loose once, every counter I had was telling me it's running at ~200fps but it looked like 30.
then suddenly it would recover and look good again.
Anonymous No.714876347
>>714873202
but i need all those addons to raid
Anonymous No.714876436
>>714875781
>I think that you could have explained that easier.
yeah i have had a hard time mapping this issue out in my head so its kinda my bad
Anonymous No.714877035 >>714877828
>>714875168
nta but just fyi, with a 2700x marvel rivals drops my fps to 40-50 in the middle of big fights entirely because of the cpu logic - the cpu is the bottleneck in general
Anonymous No.714877493
>>714869996 (OP)
One game that doesn't work on my cpu is starfield. Everything else works fine.
R9 5900X
Anonymous No.714877828 >>714878328 >>714879271
>>714877035
sure, but that's also a 7 year old cpu at this point playing a modern, albeit less demanding game. 40-50fps is still in the realm of perfectly playable, and the person i was replying to was implying that someone like you would see significant benefit and absolutely should be shelling out the extra $200 for a 9800x3d instead of just grabbing a 9700x if you were to upgrade to am5 right now.

cpu bottlenecks are real. in my starfield example, when i ran it with that same gpu but on my 7950x3d the framerate was magically 60-90 and the bottleneck was the gpu at that point - but that was also jumping like 12 years in processor tech and going from 4 to 16 cores. comparing results from a 5800x3d to the 7950x3d mattered a lot less for gaming and was a lot more gpu dependent. the 2500k had no business trying to play starfield, but it was able to do it because the gpu was doing most of the work. that's why my point was you should get what you need, put the savings towards the gpu which will net you much more tangible gains, and only then upgrade to a higher cpu if you have the budget left over for it and really want to put it towards that. most people aren't going to get their money's worth by going overkill on their cpu if their primary focus is gaming.
Anonymous No.714878328 >>714878604
>>714877828
Do you understand what cpu logic is? There is nothing in marvel rivals that is more demanding logic wise than an online fps made 20 years ago. There are direct comparisons possible with games like team fortress classic.

It's time for you to come to terms with how this works on the dev side. Those incompetent faggots shit out terrible code then 1 month before release they correct a couple of the most egregious bugs until they hit 60 fps on a modern rig and call it a day.

I bet I can run ow1&2 with 3 or 4 times more average fps and no cpu bottleneck simply because blizzard usually has semi competent programmers
Anonymous No.714878604 >>714879048
>>714878328
yes, i understand what cpu logic is and that doesnt change my core point that saying people should get a top of the line cpu as a blanket recommendation is retarded.
Anonymous No.714878823 >>714880747
>>714869996 (OP)
I don't remember the last time a game stuttered.
>t. 3700x
Anonymous No.714878932 >>714879046
>>714872257
You have to look at it core by core. Games won't use all your cores, so you see 30% total load when in reality the cores the game actually uses are maxed out.
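not from the post, but if you want to see this yourself, a minimal python sketch using the psutil package (run it while the game is up; the 90% threshold and core numbering are arbitrary):

```python
import psutil  # third-party: pip install psutil

# sample per-core load over one second; the total % can sit around 30 while
# the one or two cores the game actually leans on are pinned near 100
per_core = psutil.cpu_percent(interval=1.0, percpu=True)
total = sum(per_core) / len(per_core)

print(f"total: {total:.0f}%")
for i, load in enumerate(per_core):
    flag = "  <-- probably your bottleneck" if load > 90 else ""
    print(f"core {i:2d}: {load:.0f}%{flag}")
```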
Anonymous No.714879046
>>714878932
= lazy optimization
Anonymous No.714879048 >>714879406
>>714878604
People should 100% get a high end cpu every time, as proven by the last 25+ years of video games on PC
Anonymous No.714879271
>>714877828
>40-50fps is still in the realm of perfectly playable
bro you will get your shoes taken from you in marvel rivals if you dont have at least 100 fps
Anonymous No.714879406
>>714879048
There's no reason to deal with housefires for insignificant performance gains over mid-high end.
Anonymous No.714880747 >>714881049 >>714881363
>>714878823
Then you don't play modern games. A 3700x can't handle MH Wilds, Rise of the Ronin or Assetto Corsa EVO.
Anonymous No.714881049 >>714881857
>>714880747
Nothing can handle Milds.
Anonymous No.714881363 >>714881950
>>714880747
I've put an unfortunate amount of time into Wilds and regret playing such dogshit. No stutters.
Anonymous No.714881857 >>714881926 >>714881940
>>714881049
I get 100fps average at 4K on my 9800X3D/5090 PC.
Anonymous No.714881926
>>714881857
>only 100fps on the best pc money can buy
Anonymous No.714881940 >>714882173
>>714881857
Now turn off frame gen, skin color of shit.
Anonymous No.714881950
>>714881363
What gpu and resolution, and how much fps did you get? I played it at 1440p when I had a 3700x coupled with a 2080 Ti and the performance was miserable. Switching to a 5700x3d and 3080 Ti improved it a lot, I'm hitting 60fps almost at all times now.
Anonymous No.714882173 >>714882253
>>714881940
No. Capcom says it's required, so I leave it on.
DLSS upscaling and framegen are mandatory for modern games. Get used to it.
Anonymous No.714882253
>>714882173
No one ever will, skin color of shit.
Anonymous No.714882305
>100 frames
>with framegen on