"CPU doesn't matter when gaming at 4k"

  55,444 views

Daniel Owen · A month ago

Check out Jawa, the place where gamers buy, build, and sell! bit.ly/JawaOwenApril24 Use code OWEN10 for $10 off your first purchase!
Does CPU matter when gaming at 4K? There has been common wisdom floating around the PC gaming space that CPUs don't matter much when gaming at higher resolutions. And while from a certain perspective this is true, I think that in realistic use cases it is much less true.
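The claim in question can be pictured with a toy model (illustrative Python with made-up numbers, not measurements): a game's framerate is roughly the minimum of what the CPU can simulate and what the GPU can render at the chosen resolution, which is why a CPU limit "disappears" at 4K but reappears at lower resolutions.

```python
# Toy bottleneck model: the CPU's achievable framerate is roughly
# resolution-independent, while the GPU's drops as pixel count grows.
# All numbers below are hypothetical.

CPU_FPS = 120  # hypothetical CPU limit for some game/scene

# Hypothetical GPU throughput at each output resolution.
GPU_FPS = {"1080p": 240, "1440p": 150, "2160p": 70}

for res, gpu_fps in GPU_FPS.items():
    fps = min(CPU_FPS, gpu_fps)  # the slower side sets the pace
    limiter = "CPU" if CPU_FPS < gpu_fps else "GPU"
    print(f"{res}: {fps} fps ({limiter}-bound)")
# → 1080p: 120 fps (CPU-bound)
# → 1440p: 120 fps (CPU-bound)
# → 2160p: 70 fps (GPU-bound)
```

Note how the same CPU that is invisible at 2160p is the hard ceiling at 1080p and 1440p.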
Sources:
HUB CPU video: • How Slow Is The Ryzen ...
GN Dragon's Dogma 2 video: • Dragon's Dogma 2 is a ...
What equipment do I use to make my videos?
Camera: Sony a6100 amzn.to/3wmDtR9
Camera Lens: Sigma 16mm f/1.4 amzn.to/36i0t9t
Camera Capture Card: Elgato CamLink 4K ‎amzn.to/3AEAPcH
PC Capture Card: amzn.to/3jwBjxF
Mic: My actual mic (AT 3035) is out of production but this is a similar mic (AT 2020) amzn.to/3jS6LEB
Portable Mic attached to camera: Rode Video Micro amzn.to/3yrT0R4
Audio Interface: Focusrite Scarlett 2i2 3rd Gen: amzn.to/3wjhlad
Greenscreen: Emart Collapsable amzn.to/3AGjQXx
Lights: Neewar Dimmable USB LED amzn.to/3yw4frD
RGB Strip Backlight on desk: amzn.to/2ZceAwC
Sponsor my channel monthly by clicking the "Join" button:
/ @danielowentech
Donate directly to the channel via PayPal:
www.paypal.com/donate?hosted_...
Disclaimer: I may earn money on qualifying purchases through affiliate links above.

COMMENTS: 849
@danielowentech · A month ago
Check out Jawa, the place where gamers buy, build, and sell! bit.ly/JawaOwenApril24 Use code OWEN10 for $10 off your first purchase!
@connor040606 · A month ago
As a Jawa Verified Seller, I will say there is some incredible value to be had in purchasing a prebuilt from a good seller on the platform.
@kaimojepaslt · A month ago
Never sell on Jawa; the prices are a joke, it's like giving your stuff away for free.
@korinogaro · A month ago
I like your videos, but this one is a complete miss without DLSS results to compare. You just make an educated assumption. But DLSS is not free performance; it still impacts the GPU's performance. I am not saying you are wrong, I just can't say you are right, and I can't test it myself as I don't have 2 CPUs. It would be nice, if you have 2 CPUs, to make a follow-up video where you take the same PC, swap the CPU, and test one scene native vs. upscaled. Then we could clearly see the proportional impact of using DLSS.
@THU31 · A month ago
@@ingamgoduka57 Late April Fools comment? UE and Unity are literally the two most CPU-limited engines out there, with tons of other issues like #stutterstruggle. 😄
@ingamgoduka57 · A month ago
But GPUs are miles better at running physics; just use Nvidia PhysX, which is awesome and cross-platform. Yet modern game developers are not using PhysX, which is a missed opportunity. Rockstar has its own GPU-based physics engine called Euphoria, which has been used from GTA 4 to their latest games. You can also use compute shaders for some physics, but modern game developers are lazy. The only thing that kills CPUs is high-poly animated characters and too many different materials and shaders. Good modern game developers use vertex shader animation for NPCs, which only uses the GPU, and switch to a skinned mesh when you start interacting with the NPC (GTA through Spider-Man NPCs). Cyberpunk implemented the same thing in its updates, which drastically improved performance. Capcom must fix or ditch their game engine.
@zozihn8047 · A month ago
Let me just pair a Pentium with a 4090. It will be awesome.
@Dajova · A month ago
I just checked the prices on those... oh boy, they aren't even cheaper than a 12400F, and they have lower clock speeds to boot xD
@khursaadt · A month ago
Amen brother
@doctorspook4414 · A month ago
As one wise Weird Al Yankovic once said, "It's all about the Pentiums!"
@TwinkleTutsies · A month ago
glorious 4K at 1 FPS
@plains69 · A month ago
😂
@Lackesis · A month ago
The CPU handles your: 1. render targets, 2. physics, 3. NPC AI, 4. data streaming. The CPU definitely matters even at 4K gaming; you will get stutter, or worse 1% lows, if the CPU can't handle those things well.
@saricubra2867 · A month ago
CPU = handles the gameplay or game feel. GPU = how pretty your game looks. I take gameplay over graphics any day, so I always overspend on CPUs.
@ingamgoduka57 · A month ago
But GPUs are miles better at running physics; just use Nvidia PhysX, which is awesome and cross-platform. Yet modern game developers are not using PhysX, which is a missed opportunity. Rockstar has its own GPU-based physics engine called Euphoria, which has been used from GTA 4 to their latest games. You can also use compute shaders for some physics, but modern game developers are lazy. The only thing that kills CPUs is high-poly animated characters and too many different materials and shaders. Good modern game developers use vertex shader animation for NPCs, which only uses the GPU, and switch to a skinned mesh when you start interacting with the NPC (GTA through Spider-Man NPCs). Cyberpunk implemented the same thing in its updates, which drastically improved performance.
@Argoon1981 · A month ago
@ingamgoduka57 Nvidia PhysX, even on Nvidia hardware, does not run everything on the GPU. I know this is hard to believe for many, but it's the truth. I know this for two reasons: I was an old Ageia PhysX user, before Nvidia bought it (meaning I had an Ageia PPU), and I've been a game developer (a modder, to be exact) myself for more than 20 years now, and I have worked with PhysX (and other physics engines).
Nvidia spread this wrong idea about what PhysX can really do on purpose, just to sell you more GPUs, and they won, because the majority of people/gamers are totally mistaken about what GPU PhysX acceleration really does in games. In reality, the only physics a GPU can accelerate in real games is the non-gameplay-impacting physics: soft-body physics like cloth movement, destruction, liquids, dynamic fog effects, particle-based physics (e.g. glass-breaking pieces or similar effects), and rigid-body objects that can't affect the AI or the player in any meaningful way.
Physics that can affect the AI or the player, meaning physics that can kill or block them, always runs on the CPU, no matter what. For example: rigid-body objects that you can pick up with a gravity gun like in HL2 and shoot at NPCs to kill them always run on the CPU. Bullet detection/collision for shooters is always on the CPU. Collision detection for AI navigation is on the CPU. Collision detection for audio is on the CPU. Ray tracing for gameplay is on the CPU, etc. There's plenty of physics in real games that runs exclusively on CPUs, even on Nvidia hardware, because there's no other way.
Yes, Nvidia has demos where GPUs run a bunch of rigid-body objects around that look just like those moving around in games, but those demos have zero AI; it's all visual flair. The minute you put AI in the scene and the physics objects need to hurt the AI, the physics has to switch to the CPU.
And for those who may wonder: this was true with the Ageia PPU as well. In reality the PPU was just a low-power ASIC with a custom processor with a bunch of tiny cores, very similar to a GPU.
@vindicator879 · A month ago
Also, the more unique NPCs in a game, the higher the CPU load.
@Adri9570 · A month ago
An analogy to understand the relation between CPU and GPU: the GPU is the orange, the CPU is the orange juicer, and the objective is to quickly get the most juice out of the orange (the highest possible FPS). Get the biggest orange you can (a bigger GPU), but don't forget that the best juicers (better CPUs) get more juice out of the same orange, and avoid the worst juicers, which can't even break into the orange (very weak CPUs).
@Louis_Bautista · A month ago
Yup, just upgraded from a 2070 Super to a 4070 Super with the intention of playing at 4K and after booting up Cyberpunk I realized that my Ryzen 3600 was clearly holding back my GPU. I decided to do a full upgrade to a 7600 system and now everything's alright. Turns out ray tracing takes a big hit on the CPU, even if you're playing at 4K
@lenscapes2755 · A month ago
I agree. I had a 3600 as well, and it's not a very good CPU for ray tracing effects. I'm still using a 2070 Super, and even it was getting bottlenecked by the 3600 with ray tracing. Now I'm on a 13900K and it's buttery smooth.
@EliteGamingG786 · A month ago
How is the 4070 Super holding up at 1440p? I am planning to buy one but I'm kind of nervous.
@X_irtz · A month ago
@@EliteGamingG786 You can just... look up benchmarks online...
@42cuba · A month ago
LOL even a 7600 isn't good enough
@wertyuiopasd6281 · 14 days ago
Yes, but the good news is that a 7600 can max out the GPU at 4K, while at 1440p or 1080p a 7800X3D will do a better job.
@ZackSNetwork · A month ago
I saw a person pair a Ryzen 5 1600X with an RX 7900 XTX and argue that there was no CPU bottleneck.
@kotztotz3530 · A month ago
Oof, Zen 1 was really bad for gaming. Zen 2 was better, but I think Zen 3 is when AMD actually started competing with Intel in gaming CPUs.
@atirta777 · A month ago
I mean it's very possible in a few games at 4K Native like Doom Eternal or even RDR2, but the combo isn't very reliable😅.
@nathangamble125 · A month ago
@@atirta777 I don't think those two games are very good examples. Neither of them is GPU-intensive enough relative to their CPU usage, and I expect an RX 7900 XTX would still be significantly bottlenecked by a Ryzen 5 1600X. I'd pick a different example like Cyberpunk 2077 with ray tracing enabled, which is so GPU-intensive at 4K (especially for AMD GPUs, which aren't great at ray tracing) that an RX 7900 XTX actually _wouldn't_ be bottlenecked by a Ryzen 5 1600X.
@atirta777 · A month ago
@nathangamble125 Doom and RDR2 are good examples, don't you think? The R5 1600X drives nearly 200 fps in Eternal, and I doubt the 7900 XTX does better than that with maxed RT on. Same with RDR2: about 90 fps, and the 7900 XTX doesn't reliably hit that at 4K max. Also, we're talking about playable experiences here, which isn't the case with Cyberpunk at 4K native + RT on the 7900 XTX.
@conyo985 · A month ago
Tell that person that he can upgrade to a 5700X3D for cheap.
@cks2020693 · A month ago
If you want to play any open-world RPG with NPCs like animals and townspeople, or strategy games with millions of units and calculations, the CPU is almost always the bottleneck.
@grynadevilgearxd2442 · A month ago
True. Yesterday I tried The Witcher 3 in both DX11 and DX12 on my i5-12600K and RTX 4070 at different settings and resolutions, and I still got large frame drops in Novigrad.
@Ravix0fFourHorn · A month ago
@grynadevilgearxd2442 The Witcher 3 next-gen runs really badly because they hacked together a DX12 layer instead of making it run natively. If you try the classic version you will have zero frame drops in busy areas with your system.
@saricubra2867 · A month ago
@Ravix0fFourHorn What if we use Vulkan instead?
@grynadevilgearxd2442 · A month ago
Yeah, I know 😀 because I'm playing on DX11. I just mean that Novigrad is more CPU-bound with that many NPCs.
@ogaimon3380 · A month ago
@grynadevilgearxd2442 Nah, Witcher 3 NPCs barely have any logic; they're legit just blocks of texture with one line that wander around (or never move). The Witcher gets high fps even on a 4-core CPU xd. RDR2 is a better test, and Days Gone is also heavily CPU-limited: in town my 4-core can barely output 30-40 fps.
@atirta777 · A month ago
Three good steps to use when building a *gaming* PC:
1. Pick a GPU according to the fps and resolution you like to play at in your games of choice.
2. Pick a CPU that achieves *at least* that same framerate.
3. Scale both to your budget while also considering longevity.
@mertince2310 · A month ago
Can you even explain what "a CPU according to the framerate you like to play at" means? What would you recommend to a person who "likes" 60 fps? Or 144 fps?
@aohjii · A month ago
Two steps when building a PC: 1. Buy the fastest CPU on the market. 2. Buy the fastest GPU on the market.
@QuadsMid · A month ago
@aohjii You forgot the first step. Step 1: Get a job.
@mickieg1994 · A month ago
Respectfully disagree. The frames per second you might see on a chart will almost certainly not be your real-world user experience. By the time you have a bunch of programs running in the system tray, a browser with 20 tabs open, Discord, and anything else you might have running in the background, that chart will not be your experience, especially if you're like me and watching a 1080p or even 4K movie on a second screen while gaming. Typically I would recommend buying a CPU that is more powerful than you think you need (at the moment I'd say 8 cores minimum) and scaling the GPU based on your budget after that. Temper your expectations, and if you're not happy with your parts choice for your budget, then wait and save more. It's always better to have a CPU that's more powerful than the GPU for so many reasons; the smoothness of operation is what you will notice day to day, gaming or not.
@Bballfan1992 · A month ago
@mertince2310 Basically he just means you have to balance your system. The GPU should almost always be your limiting factor in gaming, hence why you spend 50% of your budget on the GPU; then you choose the CPU that will not be the limiting factor. Obviously a 12400F and a 4090 is crazy, since the CPU is the limit there. Something like the 14700K is much more in line with the power of the 4090 and is a good balance.
@hollisbostick2872 · A month ago
Very well explained. Thank you very much for helping me understand this issue better. It's why I'm a member of your channel. I'm happy and proud to support such stellar work. I appreciate you🙂.
@zzavatski · A month ago
Especially the false narrative from NVIDIA that when you set the resolution to 4K and enable DLSS, you're still playing at 4K.
@hollisbostick2872 · A month ago
@zzavatski Well... yeah, but this is the fiction *everybody* is supporting, because (apparently) it is impossible to make hardware that will display games with any kind of effects in 4K at a decent speed for a reasonable price. If a 4090, agreed by all to be the most powerful hardware currently available, the "best of the best", can barely hit 60 fps at 4K Ultra in Alan Wake 2, can't even make 30 fps with RT on, and the only way to get decent framerates is through software "hacks", *yet* Nvidia _still_ charges €1800-€2100 for the hardware... then we have to recognize that for whatever reason the hardware is beyond its limit, and everyone is lying and saying it isn't to get as much money as possible out of the users. Because it's ridiculous to think that I would spend €2000 on hardware that literally cannot do what I'm paying for, unless you lied to me like a _boss_.
@XeqtrM1 · A month ago
@hollisbostick2872 Again, they do it on purpose, because in reality you don't need a super-high-end GPU to play 4K at high framerates; it's about game optimization. If you optimize a game well enough, you don't need the latest and greatest, but they do it on purpose to make you buy the latest GPU. In a sense they work together: if games were optimized well enough to run 4K 144 on a 1080 Ti, for example, people wouldn't have a reason to upgrade for those games. It's sad, but it's the truth.
@ggogaming7441 · A month ago
If only you could feel the shiver that went down my spine when you mentioned DLSS. That realization kicked my brain into high gear and I was like... holy timbers, he's right! If you're playing at 4K with DLSS Quality, you're actually playing at 1440p... You're more CPU-bound than you thought.
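The "4K with DLSS Quality is really 1440p" point falls straight out of NVIDIA's published per-axis scale factors for each DLSS mode; a small sketch makes the internal render resolution explicit:

```python
# Internal render resolution for common DLSS modes, using the per-axis
# scale factors NVIDIA publishes for each preset. At a 4K output,
# "Quality" renders internally at 1440p, which is why the CPU sees
# 1440p-like framerates.

DLSS_SCALE = {
    "Quality": 2 / 3,           # 66.7% per axis
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_res(out_w, out_h, mode):
    """Return the (width, height) the GPU actually renders before upscaling."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

print(internal_res(3840, 2160, "Quality"))      # → (2560, 1440)
print(internal_res(3840, 2160, "Performance"))  # → (1920, 1080)
```

So a "4K DLSS Performance" player is, from the CPU's point of view, effectively chasing 1080p-class framerates.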
@KryssN1 · A month ago
What is obvious to many is not obvious to others 🎉
@aberkae · A month ago
I've been saying this for years 😂.
@beemrmem3 · A month ago
Not to mention there is almost no reason not to run DLSS Quality at 4K. It's where it shines the most.
@Angel7black · A month ago
True, but you're good pairing a 4090 with even a 12600K or 5800X3D if you're going to use upscaling. It's why, with a 12700K, I'm not really caring what new CPUs come out and haven't cared about anything newer. I'm more than fine with my 4070 Ti at 1440p, and I'd be fine with anything rivaling a 4090 in the future, because I'd realistically be moving to 4K in that case. It's complicated. If I were going to buy a 4090-level GPU for 1440p in the future, because that's just how much harder it's getting to run 1440p max settings, I'd honestly just move to a used 14700K or 13700K, overclock and tune the RAM, and probably be fine. Not planning on buying a new CPU+motherboard combo realistically until AM6 and whatever Intel has to offer at the time.
@Relex_92 · A month ago
You've still got it wrong, at least kind of. Don't think in resolutions. The CPU doesn't care at all about resolution (only your GPU does); on the CPU side it is ALL just about FPS. If the GPU delivers more FPS, your CPU also has to work harder to keep up: it always has to be able to run the game logic at the required framerate. Usually the GPU delivers fewer FPS at higher resolutions and more FPS at lower resolutions, but if the game is not very GPU-demanding, you could also have like 1000 FPS at 4K and be CPU-bound at like 100 FPS. Resolution. Doesn't. Matter. For the CPU, only FPS matters. You will get CPU-limited at a certain framerate in a specific game, in a specific scene, with specific settings, no matter the resolution.
@FilthEffect · A month ago
Going from a 9400 to a 12700k doubled my fps in certain titles at 4k.
@saricubra2867 · A month ago
It's not a fair comparison; the 12700K is kind of a mistake by Intel, it's way too good.
@Dexx1s · 27 days ago
@@saricubra2867 Well, whenever you make such a jump in CPU power and then add 'in certain titles', the response is automatically: no shit Sherlock. The 9400 doesn't even have multithreading. It's 6 cores and the same 6 threads. Plugging a Ryzen 1600 into those Starfield charts would probably show the 7800X3D to be miles ahead as well.
@SinNumero_no41 · 27 days ago
@saricubra2867 It's not that amazing; the 12600K is way better in that regard, since it almost matches the 12700K in games for way less. In fact, he should've gone with that one instead, since he wasted money.
@NukaOrQuantum... · A month ago
Core 2 Duo should pair well with the 50 series.
@nathangamble125 · A month ago
The best CPU to pair with an RTX 5090 will obviously be an Intel 8080.
@Beeterfish · A month ago
Jokes aside, that CPU was the greatest I've had. It was a huge leap in x86 architecture and not overly expensive.
@sherr6847 · A month ago
I honestly love the way you do your videos. The moving Dan pointing at things is cute and humorous.
@krspy1337 · A month ago
real
@102728 · A month ago
In short:
- The CPU can achieve a mostly fixed framerate across all resolutions.
- The framerate the GPU can achieve scales inversely with the resolution.
- This is not set in stone, but depends on the game and the settings.
- The lower of the two is the framerate you'll end up with, which generally causes a GPU bottleneck at higher resolutions and a CPU bottleneck at lower resolutions.
- This can easily be seen in GPU benchmarks, where you'll see the GPU achieve the same framerates at 1080p and 1440p in some games, or not scale proportionally with the drop in resolution.
- A third limiter is the game engine, which in some cases, mostly older engines, can't achieve framerates above a certain cap.
- Upscalers like DLSS, FSR and XeSS render at lower resolutions than the target resolution, providing close to the framerate of that lower resolution.
- This makes it more likely you'll hit a CPU bottleneck when using upscaling, even at high monitor resolutions.
- BONUS: GeForce cards put driver overhead on the CPU, while Radeon cards keep scheduling on the GPU. This means that with lower-end CPUs there is a significant difference in performance between GeForce and Radeon GPUs, since the CPU bottleneck is more severe when paired with a GeForce card due to that driver overhead.
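The summary above can be condensed into a toy formula (all numbers hypothetical): the final framerate is the minimum of the CPU limit, any engine cap, and the GPU limit at the *internal* render resolution.

```python
# Toy model combining the three limiters: CPU, GPU (at the internal
# render resolution, which upscaling lowers), and an optional engine cap.
# All numbers are hypothetical, not measurements.

def final_fps(cpu_fps, gpu_fps_at_internal_res, engine_cap=None):
    fps = min(cpu_fps, gpu_fps_at_internal_res)
    if engine_cap is not None:
        fps = min(fps, engine_cap)
    return fps

# 4K native: GPU-bound.
print(final_fps(cpu_fps=110, gpu_fps_at_internal_res=65))   # → 65
# 4K + "Quality" upscaling (internal 1440p): now CPU-bound.
print(final_fps(cpu_fps=110, gpu_fps_at_internal_res=140))  # → 110
# Older engine with a 60 fps cap wins regardless.
print(final_fps(110, 140, engine_cap=60))                   # → 60
```

The middle case is the video's point: upscaling raises the GPU's framerate until the CPU becomes the limiter again, even at a "4K" output.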
@slimeman2374 · A month ago
TY bro
@klauserwin9860 · A month ago
I heard that the NVIDIA GPU driver offloads all its work onto a single CPU core, while the AMD GPU driver can distribute the driver workload across multiple CPU cores. Thus, you especially need a CPU with high single-core performance when you use an NVIDIA GPU. Can someone confirm what I said?
@artiromanenko · A month ago
- Remember, if you have a 60 Hz DISPLAY, then the DISPLAY is your 60 fps bottleneck, no matter how many fps your CPU and GPU can produce.
@devonmoreau · A month ago
excellent post! 🎉
@mecyanned · 27 days ago
Very short 😂😂😂😂😂😂😂
@FromDesertTown · 28 days ago
Just think of it as two different pipelines: CPU instructions and GPU instructions. Depending on the rendering engine and how the game was programmed, the balance of instructions sent to the CPU and GPU varies. For most games, changing resolution will mostly impact GPU load.
@arc00ta · A month ago
Great video. As a hardware enthusiast I've known about this for many years, but it's really hard to articulate it as well as you did here. Going to bookmark this one for future use so I don't have to try and explain it.
@WrexBF · A month ago
When a game is very CPU-intensive, that applies at any resolution. Higher resolutions have close to no impact on CPU performance. I say "close to no impact" because as the resolution increases, the framebuffer size increases, and that requires more memory bandwidth for transferring data between the CPU and VRAM. That increased data transfer can "technically" lead to slightly higher CPU usage. However, the increase is generally not significant compared to the overall workload of the GPU, which handles the bulk of the rendering.
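To put rough numbers on the framebuffer growth mentioned above, here is a back-of-envelope sketch for a single 32-bit RGBA color target (real render pipelines add depth and auxiliary buffers on top of this):

```python
# Size of one 32-bit RGBA color target at common resolutions.
# This is a rough illustration of how framebuffer size scales with
# pixel count, not an accounting of a real engine's memory use.

def framebuffer_mib(width, height, bytes_per_pixel=4):
    return width * height * bytes_per_pixel / 2**20

RESOLUTIONS = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}

for name, (w, h) in RESOLUTIONS.items():
    print(f"{name}: {framebuffer_mib(w, h):.1f} MiB")
# → 1080p: 7.9 MiB
# → 1440p: 14.1 MiB
# → 4K: 31.6 MiB
```

4K is 4x the pixels of 1080p, so each such buffer is 4x the size; the per-frame CPU cost of that extra traffic is still tiny next to the GPU's rendering work, which is the comment's point.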
@christophermullins7163 · A month ago
Applies more so to lower resolutions***
@meurer13daniel · A month ago
@christophermullins7163 Not at all. There is no difference with resolution. If you are playing at 60 fps, the CPU requirement will be the same regardless of resolution. People just say things like this because they think "lower res = more fps = more CPU demand", but they fail to realize the CPU requirement increases due to higher fps, not lower resolution. They are not the same thing.
@saricubra2867 · A month ago
@meurer13daniel Higher FPS means the CPU has to handle a shorter game loop, therefore processing inputs and outputs more quickly.
@christophermullins7163 · A month ago
@saricubra2867 I said that "CPU matters less at 4K than at lower resolutions", and after watching this video, Daniel confirmed that my statement is accurate. The title is clickbaity AF. The point of the video is that when we say "play at 4K" we are more likely upscaling, but that does not invalidate the original statement at all. "Slower CPUs have less of a performance hit at 4K" and "CPU performance matters less at higher resolution" are irrefutably factual statements, and Daniel himself knows that, without any doubt in my mind. It is all about the way we word the question; once we assume upscaling and lower graphics settings etc., it cannot be argued that anything I said is wrong. You'd greatly benefit from trying to understand why that is the case if you still haven't realized that it is true. Go ahead and argue something irrelevant like Daniel's video... no one said anything about upscaling, guys... it's called clickbait.
@Shadowninja1200 · 28 days ago
@christophermullins7163 But you are making a completely different statement, and he didn't confirm it. This entire video digs into the nuance of why CPU performance *looks* (emphasis mine) like it doesn't matter at 4K when it actually does. The issue isn't that slower CPUs don't matter at 4K; it's that you're GPU-limited if you just slap on max settings and call it a day. The CPU will matter if you adjust the settings down and use upscaling to reach your potential max framerate, which is determined by what CPU you have. Most people aren't playing at max settings in 4K. It's like you heard the phrase "you're kind of right" and just stopped watching after that. You can argue all you want about how you're right, but you actually aren't, because you're arguing from a faulty conclusion. The data in the video literally shows that you can be CPU-limited at 4K, and it does matter what CPU you have, especially if you're reducing your GPU bottleneck by paring down the settings and resolution. No one in their right mind would recommend pairing a 4080 with something like a 2600X "because CPU doesn't matter at 4K", because we all know that it does bottleneck.
@MrXaniss · 26 days ago
Well, I have a 4090 and went from a 5800X3D to a 7800X3D; my average fps at 4K stayed pretty much the same, but it was MUCH more stable.
@Cheeseypoofs85 · A month ago
An outlier to the 4K argument is MMOs. In MMOs, where a bunch of players are around each other, the CPU definitely matters, and 3D V-Cache chips really shine in those scenarios. My minimum fps in WoW in Valdrakken at 4K doubled when I switched from a 3700X to a 5800X3D; I get 80 fps at 4K native max settings in the city.
@Billyshatner88 · A month ago
You should fine-tune that even more. With a 14600K and a 7900 XTX I get 150 to 180 fps in Valdrakken at 4K, but the thing with Valdrakken is that it artificially limits GPU and CPU usage; I'm always sitting at like 60% GPU and 20% CPU usage. In the Emerald Dream I get 90%+ GPU usage and 400+ fps at 4K.
@infiniteblaz3416 · A month ago
@Billyshatner88 Just get a 7800X3D, which would blow both of them out of the water. Lol. In all seriousness, if the hardware you're playing on is serving you well, there's no need to change it. The PC community STRUGGLES with this notion and continues to wonder why they're in debt.
@khaledassaf6356 · A month ago
Yeah, had a similar experience from the exact same upgrade in flight sim titles.
@Krannski · A month ago
@Billyshatner88 Psst... you're CPU-bound. That's what being CPU-bound looks like: if your framerate isn't hitting the cap and your GPU utilization isn't at 100%, you're CPU-bound.
@kerotomas1 · A month ago
Yes, CPU performance is all about raw data, especially in multiplayer: having to calculate 15-20 players in a WoW raid tanks the CPU hard, while in a singleplayer game everything is predetermined and the CPU doesn't have as much to do.
@androidx99 · 29 days ago
Awesome content as always, @Daniel Owen. I build a lot of PCs and hadn't considered this perspective, so thank you for bringing it to my attention!!
@jesusfreakster101 · A month ago
Thank you for this explanation. I am one of many who struggle with settings and tweaks; more education is needed on topics like this.
@Jason-ol1ty · A month ago
it all depends on how much RGB you have
@brunoutechkaheeros1182 · A month ago
none, which means my 5800x3d is dogshit
@whitygoose · A month ago
​@@brunoutechkaheeros1182 put RGB on your cooler.
@sanzyboy3952 · A month ago
Remember: if you are GPU-bottlenecked, you can always use upscalers and lower graphics settings. The same doesn't apply to a CPU bottleneck.
@christophermullins7163 · 26 days ago
I decrease the CPU render resolution to put more of the resolution on the GPU instead.
@DarkSoul-pb6dv · 26 days ago
@christophermullins7163 Are you talking about draw distance? I've never heard of a "CPU render resolution" before. Draw distance helps a lot for the CPU and GPU, but you will still get lots of lag spikes because of the pop-in from the reduced draw distance.
@obeliskt1024 · A month ago
Very informative and practical. I generally tell my friends that the CPU doesn't have to be the best for their build, but I always ask them what their resolution is, their target "smoothness" (most people are not familiar with fps), and the games they want to play. So most of the time I recommend a mid-range CPU like a Ryzen 5 or i5 plus a decent mid-range GPU for 1080p/1440p. Sometimes I'll even sell them on the upscaling tech available on each side, as well as show them that some games have beautiful ray tracing, BUT I always make sure to let them know that it's not always a good idea to crank up the settings, and get them to consider optimizing settings.
@vladislavkaras491 · A month ago
Thanks for the great explanation!
@hyperfocal2002 · A month ago
It's great to see a selection of videos explaining the connection between CPU/GPU and framerate at different resolutions come out in the past few days. While it works to test CPUs with nothing but a 4090 for comparison, and likewise GPUs with a 7800X3D, for the sake of consistent benchmarks, it doesn't help people trying to match components in an upgrade or on a strict budget, where a midrange CPU wastes GPU power and vice versa.
@joeljohnson3490 · A month ago
Excellent video. Great insights.
@jasonroberts8416 · A month ago
Thanks for doing this. I've seen way too many "CPU doesn't matter at 4K" comments over the past few years. Even at 1080p I noticed a big improvement in 1% lows alone when upgrading from an i7-8700K to a 13600K (carrying over my existing RTX 3070).
@DarkSoul-pb6dv · 26 days ago
I'm still on an i7-8700K running at 4.8 GHz with a 3060 Ti, and I have people telling me all the time that my CPU is still fine for gaming. I play Battlefield games all the time, plus Helldivers 2, and it's always my CPU that's the limit, even when I play Metro Exodus. This CPU is just too old now; I cannot stay at 60 fps.
@jaquestraw1 · A month ago
Very important vid, Daniel, and well explained 👍
@niko220491 · A month ago
Nice video... again. :) I can absolutely recommend the videos from Gamers Nexus regarding "GPU Busy", "Animation Error", input lag (the latency pipeline), optimization, and how GPU drivers work, discussed with engineers from Intel and Nvidia.
@theinsfrijonds · A month ago
I'm glad that this isn't in response to my post on one of these videos! 😅 I did say that CPU doesn't matter at 4K resolution, but what I really meant was the difference between CPUs of a particular segment. I was considering a 14900K at one point, and there's the 7800X3D, but I went with a 7950X3D. The graphs for those can show different frame rates, but as you mentioned, it's highly dependent on the area you're in in the game. Also, I doubt I would actually be able to appreciate any uplift of less than 5 or 10%.
@michaelcarson8375 · A month ago
@theinsfrijonds Why not upgrade to a CPU that has more than dual-channel memory? When I made the switch I noticed the difference in frame pacing, but I was also able to truly multitask. It amazes me that streamers don't use workstations to game on. Yes, there's a limit to what a CPU can do at 4K, but no one has put the quad-, hex- and octa-channel memory systems to the test in a public video. The cool thing is you can work and GAME at the same time on a workstation. You have enough PCIe lanes that you can run two VMs at the same time and use more than one GPU. Gaming is limited to a single GPU since SLI and CrossFire bit the dust, but if you have another GPU you can pass it through to a VM with VT-d and run rendering, professional workloads, or a streaming box for YouTube/Twitch/whatever on the same system. The existence of unlocked Threadripper and unlocked Xeon parts is not well known by those who focus only on gaming.
@saricubra2867
@saricubra2867 1 month ago
@@michaelcarson8375 You save a lot more money buying a cheap X3D instead of an overpriced platform with quad-channel support. The extra cache is equivalent to a giant memory OC.
@theinsfrijonds
@theinsfrijonds 1 month ago
@@michaelcarson8375 Those processors were out of my budget. I mostly bought AMD because it's more power efficient. I do hate the fact that the memory controller in their processors is definitely worse than Intel's (limited to 3000 megatransfers per channel). Also, I'm not sure there is a motherboard where I could access the second PCI Express slot with my 4090 graphics card; it's just that huge (four physical slots taken up in space, but three on the back of the case). Good to know, though, that there are uses for quad channel and up. I can only imagine channels haven't covered that due to the limited number of buyers, plus the fact that it would be very expensive for them to review.
@Shadowninja1200
@Shadowninja1200 28 days ago
@@michaelcarson8375 Because workstation CPUs are focused more on doing a bunch of things well rather than having the high clock speeds some games really need. You could potentially limit your performance if you go for something like a Threadripper or a Xeon. Also, the price difference doesn't really make sense for a streamer when they could put that money into a second PC that they use for capture and rendering, keeping an eye on chat, and so on. They cut out the streaming overhead without sacrificing performance that way. *edit* Also, using a VM while gaming could potentially get you banned if you play online, if you can even launch the game at all, due to anti-cheat detection. A lot of cheaters used to use VMs to run their cheats.
@michaelcarson8375
@michaelcarson8375 28 days ago
@@Shadowninja1200 Excuses, excuses. There are unlocked CPUs from both Intel and AMD. Those systems are the true high-end desktops. Dual-channel CPUs, no matter the number of cores, are not high-end desktops.
@Makavelli2127
@Makavelli2127 1 month ago
Thank you for making this video. It's so annoying seeing so many people assume you can pair a 4090 with a 5600 and expect the same results as a 7800X3D at 1440p+ 🤣
@ZackSNetwork
@ZackSNetwork 1 month ago
I saw a person pair a Zen 1 Ryzen 5 1600X with an RX 7900 XTX. They argued there is no bottleneck.
@ezechieldzimeyor4541
@ezechieldzimeyor4541 1 month ago
@@ZackSNetwork Wow, that is insane. I can understand a 5800X3D or something, but a 1600 isn't even going to have PCIe 4 unless you go get a new motherboard that costs 3x the CPU 💀
@aboveaveragebayleaf9216
@aboveaveragebayleaf9216 1 month ago
Why would you drop $1,000 on a GPU and not at least get a 5600 or something for $130 more lol. Even if there is "no bottleneck"
@chy.0190
@chy.0190 1 month ago
They're the people crying about why their framerates are the same on both low and max settings in certain games, and then blaming it on bad optimisation lmao
@V4zz33
@V4zz33 1 month ago
People think that?
@neinnein3972
@neinnein3972 1 month ago
This came exactly at the right time. Thank you!
@arseniogarcia8631
@arseniogarcia8631 1 month ago
I mainly game at 4K besides esports titles, and I just upgraded from the 5600X to the 5800X3D and it's been a big uplift. The average is only 10% to 20% higher, but the 1% lows are so much higher it's great. Even if the GPU is maxed out it still helps a lot!
@ChoppyChof
@ChoppyChof 1 month ago
Did the same change and got the same results as you; those better 1% lows make such a difference. Really wanted a 7800X3D, but factoring in the price of the new RAM and motherboard, it was 3-4 times the cost of just dropping in the 5800X3D. Shall stick with this for a while.
@pengu6335
@pengu6335 26 days ago
@@ChoppyChof Yeah, the 1% lows matter just as much. Most people think average fps is all that matters.
@harlanmechling
@harlanmechling 1 month ago
Another great video my man. You remain one of the best YouTubers in this space.
@Torso6131
@Torso6131 1 month ago
My favorite case of this is the original Crysis (and the remaster), where on top of being super GPU heavy, it also increases some CPU-driven rendering budgets with an increase in resolution. On top of that, it's also not very well multi-threaded, so you basically need a 7800X3D to run a 2007 (right?) game at 4K 60fps maxed-out settings. I think it ends up pushing out the LOD or something super weird the higher your resolution, so you get more CPU calls. Digital Foundry did a pretty good video on it, and in the clip where they got Alex (their PC-focused member) a 7800X3D to replace his 12900K, they discuss Crysis specifically. But yeah, basically at 4K your CPU is less likely to be a bottleneck, but it still very much can be, and you want something at least pretty good to make sure you don't get slammed with crap frametime spikes, as CPU-limited gameplay will almost always be a stuttery mess where GPU-limited will be better.
@jhowle9475
@jhowle9475 1 month ago
0:01 The face you are making while pointing to what looks like a red dog rocket covering a name is priceless.
@kr00tman
@kr00tman 1 month ago
Great video... I actually changed my opinion on this a while back. In CP77 with a 5950X and a 4090, with RT maxed and DLSS off at 4K, I was getting about 40 fps; when I upgraded to a 7800X3D it went up to about 48 fps, so there was a 20% increase. I was actually pretty shocked.
@johndoh5182
@johndoh5182 1 month ago
I got to about 10:00 and you were going on and on, and maybe you made this point later? I tuned out. The bottom line for ANY resolution is what a person wants for fps. If I'm playing Starfield with a 4080, well, I have a 2K rig so it's great, or good enough. I prefer an average above 90 and a 1% low above 60 fps. I find that gives me a stutter-free experience in open-world games.

But looking at a YouTuber benchmark doesn't tell me much, because they stick to certain settings just for consistency. There's no way I'm playing Starfield at 4K with a 4080 UNLESS I scale down the quality to where, once again, my average fps is above 90 and my 1% lows are above 60, or I'm using DLSS; and with RT, DLSS is almost always going to be enabled. YES, because I'm pushing the fps back up, the CPU is going to matter. I'm not gaming at the settings HUB had for 4K in Starfield. I'm not going to let a GPU struggle like that and give stutters from time to time.

And THIS is the point most people get wrong when they look at YouTube benchmarks and hear someone say "at 4K they're the same." Frankly, part of this belief has been SPREAD by different YouTubers. I don't need to get into what the CPU does and what the GPU does. What I can tell you is that when you play a game at high fps at 4K, the CPU is CERTAINLY going to matter. The CPU has to pass data to the GPU before the GPU can render a different frame; otherwise it's painting the same frame, because the GPU doesn't know movement, so there is change data the CPU has to send to get a different frame. The CPU tracks the game. You also have to keep updating RAM. And if YOU personally like playing at 4K with 1% lows at 40 fps, well, have fun with it.
@Roland_Deschain_of_Gilead19
@Roland_Deschain_of_Gilead19 1 month ago
@danielowentech breaks out the teacher side of him! Always great to hear your explanations! 👍🏻
@ShaneH5150
@ShaneH5150 1 month ago
This does make sense, so I want to say thank you for sharing the knowledge! My biggest PC bottleneck currently is my checking balance
@rodriguezkev
@rodriguezkev 1 month ago
So basically as nearly everyone plays games at upscaled 4k instead of native, the CPU is just as important as playing at a native lower resolution.
@tourmaline07
@tourmaline07 1 month ago
As someone with an 8700K and 4080 Super combo, CPU performance absolutely matters. I'm even CPU bound in Time Spy of all things 😂. A bit imbalanced, but my old 2080 Ti died recently, so I put this GPU into my old build. Planning to upgrade to Zen 5 soon enough ;)
@Marco-bq3wc
@Marco-bq3wc 25 days ago
One more aspect where the CPU is relevant for 4K / higher frame rates, which Digital Foundry found (mentioned in passing in their Dragon's Dogma 2 tech review, 19:00 - 19:27): even when CPU limited, the frame rate in the same scene still takes a (small) hit at higher resolutions (~7 FPS difference between 1080p and 4K, and ~2 FPS between 1080p and 1440p). They didn't really test this in detail (it wasn't relevant for that review) and just mentioned they suspect it's due to this / some game(s) drawing more things in the distance at higher resolution.
@nikdi2001
@nikdi2001 1 month ago
You made it very clear
@teddyholiday8038
@teddyholiday8038 1 month ago
Of course it matters. A better CPU can provide smoother frame times
@mickieg1994
@mickieg1994 1 month ago
It will also handle background tasks better, plus more and more games can now take advantage of 12 or more cores. Why hamstring yourself with 8?
@paul2609
@paul2609 1 month ago
@@mickieg1994 Because budget?
@teddyholiday8038
@teddyholiday8038 1 month ago
@@mickieg1994 The only reason I would go with 8 is if I'm hellbent on an x800X3D chip, but otherwise yeah, the more cores the better
@mickieg1994
@mickieg1994 1 month ago
@@paul2609 I disagree. If the budget is so tight that you can't afford the extra £100 or so to go from 6 cores to 8-12, then you should seriously reconsider whether you can afford a PC at all.
@saricubra2867
@saricubra2867 1 month ago
@@teddyholiday8038 It depends; for example, the 5900X has 12 big cores and the 12700K is hybrid. The only reason I chose the 12700K, besides the better IPC, is the lack of CPU overhead (Intel Thread Director cleans up interruptions to the main threads for any program). I wanted an X800X3D chip, but they are 8-core only and would feel like a slog when the scheduling isn't as good. The Ryzen 9 5900X gets lower averages than these newer chips, but 1% lows and frametimes are very, very smooth when the game is well optimized (like Cyberpunk 2077).
@muppetpoppet216
@muppetpoppet216 1 month ago
Great explanation
@Bezzerkus
@Bezzerkus 1 month ago
Thank you for addressing this! I would see so many comments just like the ones you were talking about. Hard to argue with misinformation
@V4zz33
@V4zz33 1 month ago
Well done, Sir!
@KRawatXP2003
@KRawatXP2003 1 month ago
Wow I never thought like this before. Very interesting.
@rayanmalik5744
@rayanmalik5744 1 month ago
Spider-Man Remastered with ray tracing enabled is the perfect example of this. It's still very CPU-influenced even at 4K.
@gavink7194
@gavink7194 27 days ago
You're 100% on it, man. I use a 7900 XTX and a 5800X and play at 4K native. I've noticed in Cyberpunk that my 5800X bottlenecks the 7900 XTX when a lot is going on, like in combat situations and whatnot. I noticed it just like you're saying: my 1% lows tank down to the 30s while my regular fps is still in the 80s. I think the new AI for the police and NPCs has a lot to do with it, especially when you turn the crowd density to low. I usually have to play with crowd density on medium. But even watching benchmark videos of a 7800X3D with a 7900 XTX, the 1% lows are still just as bad, which has made me hesitant to upgrade my CPU.
@Stinger2578
@Stinger2578 1 month ago
My last PC was put together around 2011, based around a Phenom II X6 1100T (6-core CPU) paired with a GTX 560, later upgraded to a 980. I could play something like Just Cause 3 at 1080p at decent settings at higher-than-console framerates, between 40 and 60. Some time later I ended up with a GTX 1080, but not that much improvement. By the end of 2018 I built a new machine based around an i7 8700K, and with the same GTX 1080, Just Cause 3's framerate was much closer to a solid 120fps at the same 1080p settings. So with the new platform and new CPU I basically doubled or tripled the performance. As of now I've replaced the GTX 1080 with a 4070 Super, and I can get that same 120fps but also running 2160p, also called 4K.
@proudyy
@proudyy 1 month ago
I just wanted to say thank you for the detailed and informative content. I don't mind watching 15+ min videos at all if they're informative. Besides that, it's content worth watching before going to bed 😂☝👍 Thanks for your effort! 🤝
@erictayet
@erictayet 1 month ago
This is why I change my GPU once every 2.5 years but my CPU every 5-6 years. My monitor and the game I'm playing are actually the key reasons I upgrade my PC components. You have to find out what your CPU/GPU are actually doing before you upgrade; there are plenty of tools like MSI Afterburner, AMD Adrenalin software monitoring, Intel PresentMon, Task Manager, etc.
@dianaalyssa8726
@dianaalyssa8726 1 month ago
Great topic.
@svendtang5432
@svendtang5432 1 month ago
Great point on upscaling… 🎉
@SingularitySource
@SingularitySource 1 month ago
Would love to see someone do a video of CPU comparisons at different DLSS/FSR presets, to see the performance difference between internal resolutions upscaled by the GPU vs. internal resolutions matched to the display resolution. Most might think it's 1:1, but the extra processing done by the GPU may affect overall in-game performance.
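On the internal-resolution point the comment raises: the per-axis render-scale factors for the standard DLSS 2 quality modes are publicly documented (FSR 2 uses the same ratios for its matching presets), so a small sketch, not from the video, can show what resolution the GPU actually renders before upscaling, and therefore which CPU-test resolution an upscaled-4K result roughly corresponds to:

```python
# Per-axis render-scale factors for the standard DLSS 2 / FSR 2
# quality modes: Quality = 1/1.5, Balanced = 1/1.72,
# Performance = 1/2, Ultra Performance = 1/3.
SCALE = {
    "Quality":           1 / 1.5,
    "Balanced":          1 / 1.72,
    "Performance":       1 / 2.0,
    "Ultra Performance": 1 / 3.0,
}

def internal_res(out_w: int, out_h: int, mode: str) -> tuple[int, int]:
    """Resolution the GPU actually renders before upscaling to out_w x out_h."""
    s = SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in SCALE:
    w, h = internal_res(3840, 2160, mode)
    print(f"4K {mode}: renders at {w}x{h}")
```

4K Quality mode renders at 2560x1440, so for CPU-scaling purposes an upscaled-4K run behaves much more like a native 1440p test than a native 4K one, which is the video's core argument.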
@ronny332
@ronny332 1 month ago
The point in (my own) short words: for 4K gaming, CPU performance is often not important, just because the framerates are lower and the GPU is maxed out. That's all. When the framerates rise, the CPU has to work a lot more again. To be futureproof, an unlimitedly fast CPU and GPU are needed %-). Nobody really knows whether a new game will "waste" more power on the GPU or the CPU. Too much doesn't exist, but saving money by skipping a top-level CPU while inserting a 4090 is just wrong. Especially with the UE5 engine doing much of its ray tracing work on the CPU, we can all find ourselves in a state where a faster CPU could easily be needed. That all said, add the need for intense AI NPCs too, just because... why not?
@Drumlinerolla
@Drumlinerolla 1 month ago
When would be a good time to upgrade the CPU? I have a 13900K; just thinking ahead, I feel like I shouldn't have to upgrade until 16th gen Intel. Still learning about PCs and parts. Love the videos! This one helped a lot!
@walter274
@walter274 29 days ago
Good points! In many cases, you feel the difference between CPUs in the 1% and 0.1% lows. Also, most people don't play on a clean test bench, so in reality the CPU is probably going to be hit with random requests while gaming. The better your CPU, the better it can cope with stuff like that.
@fordio1979
@fordio1979 1 month ago
Really interesting video, been waiting for one of these. I game at 4K with a 4070 Super and a 5600X; it's always been in my mind to upgrade the CPU, but I get such good performance I have chosen not to. My GPU is always around 98% ish and CPU mid 70%. I run DLSS Quality in every game at max settings and get 100-120 fps. I'm in heaven and just don't see a 5800X3D making my experience better; if I did, I'd just buy one.
@British_Dragon-Simulations
@British_Dragon-Simulations 27 days ago
Fly at low altitude in Microsoft Flight Simulator or DCS with an i9-12900K and an RTX 4090 and you will see your frames sink below 30fps at 4K. This also happens in racing sims when you have a full grid of cars. I find it hard enough to maintain just 72fps in a Pimax Crystal VR headset in DCS and MSFS 2020.
@facegamefps6725
@facegamefps6725 1 month ago
Good job Daniel! I brute forced Dogma so no issues. I recorded it on my channel with 4 pawns in the city.
@zerorequiem42
@zerorequiem42 1 month ago
"They know just enough to get them in trouble." I feel personally attacked.
@GregA884
@GregA884 1 month ago
Well explained
@perlichtman1562
@perlichtman1562 1 month ago
Or to put it another way: when do you start to be GPU limited? One of the best tools for visualizing this is the Forza Horizon 5 benchmark, because it shows what percentage of the time you're limited by one vs. the other, and how many FPS the CPU is managing in a given run vs. the GPU.
@str8chillaxin
@str8chillaxin 17 days ago
Great video
@Iamkaden
@Iamkaden 26 days ago
I learned about this on my own, tinkering with my PC like 6 years ago. A lot of people get it wrong. Here's a simple test you can do as well: if you want to know the highest FPS you can get in any game, first drop to the lowest resolution the game supports, then see what your FPS is uncapped. You can also add an upscaling setting such as DLSS to further increase your FPS if you are somehow still GPU bound. 👌
@jonogrimmer6013
@jonogrimmer6013 1 month ago
Every day is a school day! A lot of people including me often think they know more than they actually do. Being wrong about something isn’t a bad thing as long as you learn from it. Great video
@EliteGamingG786
@EliteGamingG786 1 month ago
Awesome video, like always. Hey, one question: should I upgrade from an RX 6750 XT to a 4070 Super? I have a 1440p 144Hz monitor, 32 GB DDR5, an Intel Core i5-13400F, and a 650W PSU.
@TwirlingFern
@TwirlingFern 26 days ago
Great video.
@c-dub8639
@c-dub8639 27 days ago
Not to mention that the CPU has a major impact on system snappiness and load times even if you're 100% GPU bound while gaming. I have a Titan Xp (1080 Ti equivalent) paired with an R7 5800X, and I'm itching for a CPU upgrade even though my GPU is 3 years older than my CPU. Additionally, the CPU can get interrupted by background tasks, or the game may have to load in assets, and that can come through as stuttering even in GPU-bound scenarios.
@themalanden
@themalanden 29 days ago
This actually happened to me. I had a 32-core Threadripper 3970X and I upgraded my GPU to a 4090. In Forza Horizon 5 and The Witcher 3 in particular, no matter what settings I used, the framerate was about the same: 1080p, 4K, and 4K with any DLSS were all about the same. Because of this I had to sell my 3970X for a more current CPU.
@Qhimadi
@Qhimadi 1 month ago
Once you factor in VR and sim racing, it's different. I have a Reverb G2, which in theory has 4K res, but having a better CPU would still improve framerate in games like Assetto Corsa Competizione and AMS2.
@LucidThought
@LucidThought 1 month ago
Just paired a 4070 Ti Super with a 10-year-old Xeon E5-2695 v3 ('all-core' turbo unlock BIOS mod) and a 7-year-old 1440p G-Sync Acer XB270HU, and MW3 at max settings is 130-140, so a perfectly balanced 'triad'. (*64GB ECC, 1TB NVMe & 1.2TB FusionIO.) When the 5090 arrives in a few months, my 7800X3D & 4090 will go to the missus, and her (*this) PC will go to my 11-year-old, mainly for Minecraft RTX :)
@novantha1
@novantha1 20 days ago
Usually I look at 1% lows when judging a PC build, not the average frame rate. To an extent that’s personal preference, but anyway: I’ve noticed that a lot of higher end CPUs have better 1% lows even given the same average frame rate. Sometimes you also get a boost from faster memory, storage, and higher VRAM GPUs, and those tend to heavily influence my purchasing decisions.
@Nekudza
@Nekudza 26 days ago
Also, graphics settings scale well in most games, while CPU-load settings are far more limited. And, most important to me, a good CPU will usually provide better 1% and 0.1% lows, which are the most important metrics perception-wise.
@seebarry4068
@seebarry4068 1 month ago
My 4790K is doing alright, outside of VR. It doesn't like doing VR, but it can, though not flawlessly by any stretch. I think being stuck on DDR3 doesn't help. It pairs OK with my 3070 Ti, which is perhaps a little overpowered for the CPU. I can get a solid 60fps at 1200p on everything barring VR, which is likely the reason I haven't chased a CPU upgrade too hard.
@SB-mr2nk
@SB-mr2nk 22 days ago
Nice video Daniel. I was finna come tell you you’re wrong just reading the title but then I watched the whole video and it’s like yeah okay that’s fair.
@mrdappernature8861
@mrdappernature8861 26 days ago
I've got a 12700K paired with a 4080 and I am very happy playing at 4K. I get about 90 to 100+ fps in certain games. It all depends on what kind of frames you want and what kind of cooling you have.
@kriet7445
@kriet7445 24 days ago
YESSSSSSSSSSSS THANK YOU PREACHHHHHH
@ruisaraiva4742
@ruisaraiva4742 1 month ago
Waiting for new CPUs to update my board, RAM, and CPU. Playing 4K with an RTX 4090 paired with a Ryzen 5700X on AM4, I definitely feel some stutters in some of the heavier games! Still a pleasant experience, but a CPU upgrade will improve it for sure!
@o4karik41
@o4karik41 24 days ago
The easiest way to explain how a "bottleneck" works is to understand that you have two possible fps limiters: the CPU and the GPU (if we aren't talking about RAM). They are fairly independent of each other. Each of them can only reach a certain number of frames it can call or draw; that's all. If your CPU can't call more than 60 fps, you simply can't go above that without upgrading the CPU itself. The same applies in GPU cases. I believe it's the best piece of knowledge I got about 8 years ago; it drove away my confusion about PC-building theory.
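The two-limiter view in the comment above can be put into a toy formula: if the CPU can prepare frames at one rate and the GPU can render them at another, the delivered framerate is roughly the slower of the two. A minimal sketch; the fps numbers are invented purely for illustration:

```python
def delivered_fps(cpu_fps: float, gpu_fps: float) -> float:
    """Delivered framerate is capped by the slower of the two limiters."""
    return min(cpu_fps, gpu_fps)

# Same hypothetical CPU (able to prepare ~90 frames per second),
# with GPU load rising as resolution goes up. Numbers are made up.
cpu_fps = 90
for res, gpu_fps in [("1080p", 200), ("1440p", 140), ("4K", 70)]:
    bound = "CPU" if cpu_fps < gpu_fps else "GPU"
    print(f"{res}: {delivered_fps(cpu_fps, gpu_fps):.0f} fps ({bound}-bound)")
```

In this made-up example the build is CPU-bound at 1080p and 1440p (90 fps either way) and only becomes GPU-bound at 4K, which is exactly why the same CPU difference that shows at 1080p can vanish in a native-4K benchmark.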
@machoortv
@machoortv 25 days ago
I recently upgraded from a Ryzen 5 5600X to a Ryzen 7 7800X3D paired with a 3080, because I also play 4K with DLSS, thinking it would improve the frame rates a lot. But it actually didn't change anything, and it feels like a waste of money.
@garethperks7032
@garethperks7032 1 month ago
Very nice explanation. Yeah, a CPU can only make so many draw calls per second given its workload. A faster CPU means more draw calls per second. The GPU is always waiting on draw calls from the CPU, except with frame generation, or if it's under 100% load and has to queue or drop them.
@pengu6335
@pengu6335 26 days ago
How does it not with frame generation enabled? You still get "more frames" when you're completely GPU bound right before you enable frame generation.
@garethperks7032
@garethperks7032 24 days ago
@@pengu6335 Yes the extra frames are interpolated on the GPU. They are not based on draw calls from the CPU.
@AbhishekG2000
@AbhishekG2000 1 month ago
There are a lot of misunderstandings among gamers about how things work. I also came across some posts that suggest turning on ultra textures even on a weak GPU just because you have high VRAM. According to them it does not affect performance, but it absolutely does; mid-range GPUs come with limited memory bandwidth.
@batuhanozkanli
@batuhanozkanli 28 days ago
Great video! I know this because I have a Ryzen 5500 and an RTX 4070 Ti Super :DD
@valentin3186
@valentin3186 1 month ago
Is DD2 really CPU limited, or just badly optimized? The industry dropped the ball when even the highest-end CPUs dip to 20-30 fps.
@TheDaoistheway
@TheDaoistheway 1 month ago
I went with an i7 7700K with a 7900 XT. Sure, I can play all games on high settings, but it's not smooth; the 1% lows are noticeable, with tearing and stutter. Upgraded to a 14600K, frame rate went up like 20-40%, and it feels much smoother.
@theMetallico666
@theMetallico666 28 days ago
I don't understand it fully, but I remember I had an i5 7500 with a 1060 6G. When I got an i7 10700K, my 1060 felt like a phoenix, able to play more games with better frames and stability. Afterwards I got my current 3070, able to play smoothly now at 2K for a while. If I'm upgrading, I will go for the CPU first once again, LOL. (Aiming for an i9 13th gen, or an i7 13th if I see a good sale going on.)
@fups1
@fups1 22 days ago
Racing simulators like iRacing/Assetto also tend to be very CPU limited, even with triple screens
@skellig5867
@skellig5867 1 month ago
My gaming system pairs a Ryzen 5800X3D, 32 GB of DDR4 running at 3600 MT/s CL16, and an RTX 4090. I'm driving a 4K OLED at 120fps.
@saricubra2867
@saricubra2867 1 month ago
I play Counter-Strike 2 at 80fps on a high-end CRT from 2001 (that can handle pseudo-1440p at 4:3) on an i7-4700MQ Haswell laptop from 11 years ago. OLED motion clarity is still terrible; having VGA is still a blessing, and my CRT doesn't produce aliasing (phosphor dots >>> square pixels). If I could find the most powerful graphics card with VGA for my current tower PC, I would use that monitor and play everything at native pseudo-1440p (1856x1344).
@skellig5867
@skellig5867 1 month ago
@@saricubra2867 cool.
@felderup
@felderup 1 month ago
You could say it doesn't scale linearly; some things will be the same load on the CPU regardless of resolution.
@gloth
@gloth 21 days ago
It's all about understanding where (and what) the bottleneck is. Increase GPU load, with resolution or higher settings, to the point that the GPU becomes the slowest component of your system, and at that point the CPU doesn't matter. That point is obviously different on a 7800X3D vs. a 5600, or on a 3090 vs. a 1060. And of course that point is not a global value; it depends on how a specific game utilizes the CPU and GPU (or RAM and VRAM, or even drive speed). I usually find the opposite to be true: people see CPU gaming benchmarks (correctly done at low resolution with the lowest graphics settings) and think CPU X is going to give them that much fps gain at 4K RT ultra, when they would gain a lot more from a GPU upgrade.
@Jackjack1978.
@Jackjack1978. 1 month ago
That was literally the best explanation of how CPUs do and don't affect performance I've ever heard. I don't know why all the big YouTubers can't break it down simply like that so people can actually understand.
@Janjao1984
@Janjao1984 1 month ago
You're totally right
@spicynoodle7419
@spicynoodle7419 1 month ago
Using your face camera and finger as a cursor is funny and great. Please do it more often :D
@jaredangell5017
@jaredangell5017 1 month ago
I paired a Xeon 2680 v4 with an RX 6800. I play everything at max settings in ultrawide 1440p: 45 FPS in Cyberpunk with FSR 2 (25 FPS without it), 35 FPS in Immortals of Aveum without any upscaling, and 40 FPS in Forbidden West without upscaling. I think I'll be able to go 5 years before I need to upgrade hardware. Little to no stutters. My build cost $500 and could smoke a PS5+.
@kidmagic9x
@kidmagic9x 1 month ago
In gaming terms: CPU increases Maximum FPS/FPS Cap. GPU increases FPS (but can’t pass FPS cap set by CPU).
@Omega_21XX
@Omega_21XX 29 days ago
Went from a 6950X at 4.2GHz all-core to a 7800X3D build with the same GPU, an RTX 3080. The 4K stuff does hold true 9 times out of 10. There are certainly more games today that NEED SSD storage to perform well, and most of those are the same ones struggling with older CPUs. I personally think for most people targeting 4K, the GPU is still going to be the main component. Outliers like Dragon's Dogma 2 are not the norm. I honestly haven't felt any noticeable difference with my upgrade.