Why VRAM Is So Important For Gaming: 4GB vs. 8GB

145,495 views

Hardware Unboxed

1 day ago

Thermal Grizzly: www.thermal-grizzly.com/en/kr...
Support us on Patreon: / hardwareunboxed
Join us on Floatplane: www.floatplane.com/channel/Ha...
Disclosure: As an Amazon Associate we earn from qualifying purchases. We may also earn a commission on some sales made through other store links
Buy relevant products from Amazon, Newegg and others below:
GeForce RTX 4090 - geni.us/puJry
GeForce RTX 4080 - geni.us/wpg4zl
GeForce RTX 4070 Ti - geni.us/AVijBg
GeForce RTX 4070 - geni.us/8dn6Bt
GeForce RTX 4060 Ti 16GB - geni.us/o5Q0O
GeForce RTX 4060 Ti 8GB - geni.us/YxYYX
GeForce RTX 4060 - geni.us/7QKyyLM
GeForce RTX 3070 - geni.us/Kfso1
GeForce RTX 3060 Ti - geni.us/yqtTGn3
GeForce RTX 3060 - geni.us/MQT2VG
GeForce RTX 3050 - geni.us/fF9YeC
Radeon RX 7900 XTX - geni.us/OKTo
Radeon RX 7900 XT - geni.us/iMi32
Radeon RX 7800 XT - geni.us/Jagv
Radeon RX 7700 XT - geni.us/vzzndOB
Radeon RX 7600 - geni.us/j2BgwXv
Radeon RX 6950 XT - geni.us/nasW
Radeon RX 6800 XT - geni.us/yxrJUJm
Radeon RX 6800 - geni.us/Ps1fpex
Radeon RX 6750 XT - geni.us/53sUN7
Radeon RX 6700 XT - geni.us/3b7PJub
Radeon RX 6650 XT - geni.us/8Awx3
Radeon RX 6600 XT - geni.us/aPMwG
Radeon RX 6600 - geni.us/cCrY
00:00 - Welcome to Hardware Unboxed
00:24 - Ad Spot
01:04 - A brief description of VRAM
06:13 - Baldur’s Gate 3
06:51 - Cyberpunk 2077: Phantom Liberty
07:32 - Dying Light 2: Stay Human
08:01 - Forza Motorsport
08:17 - Immortals of Aveum
09:08 - Ratchet and Clank: Rift Apart
09:51 - Marvel’s Spider-Man Remastered
10:23 - Investigating Texture Presets
10:36 - Assassin’s Creed Mirage [Visual Comparison]
12:51 - Banishers: Ghosts of New Eden
14:02 - Hogwarts Legacy [Visual Comparison]
17:47 - Skull and Bones
18:30 - Star Wars: Jedi Survivor [Visual Comparison]
19:43 - The Last of Us Part I [Visual Comparison]
22:10 - Total War: Warhammer III [Visual Comparison]
23:49 - Final Thoughts
Read this feature on TechSpot: www.techspot.com/article/2815...
Why VRAM’s So Important For Gaming: 4GB vs. 8GB
Disclaimer: Any pricing information shown or mentioned in this video was accurate at the time of video production, and may have since changed
FOLLOW US IN THESE PLACES FOR UPDATES
Twitter - / hardwareunboxed
Facebook - / hardwareunboxed
Instagram - / hardwareunboxed
Outro music by David Vonk/DaJaVo

COMMENTS: 1,500
@JarrodsTech a month ago
It's a good thing you can always just download more VRAM.
@EyesOfByes a month ago
Yeah, I remember when we used to buy RAM on CD, but then Napster came along... (I'm also joking)
@JarrodsTech a month ago
@EyesOfByes Now RAM is all subscription!
@Remi_Jansen a month ago
Usually for free too! No idea why people spent money on physical RAM, it's ridiculous.
@Splarkszter a month ago
Nvidia is worse than Apple.
@chadbizeau5997 a month ago
@JarrodsTech RAM as a Service!
@AthanImmortal a month ago
I never understood why Hardware Unboxed originally caught so much flak for suggesting that GPU VRAM was not moving on as fast as the gaming industry was, and that 8GB cards were going to age a lot faster. Instead, customers should be angry that Nvidia was selling us the same 8GB of VRAM on the 1070 (2016), 2070 (2018) and 3070 (2020), and STILL wanted $400 for an 8GB card in 2023 in the form of the 4060 Ti. Go back 4 years from 2016 and in 2012 you have the 670 with 2GB. Memory *quadrupled* in the same time frame in the mid range. Yet everyone was barking "game devs need to optimise". Sure, if you want graphics to stay the same, but then why don't they optimise for 2GB? Because at some point we need to move on. The fact is that someone who bought a 1070 in 2016 can still play at the same level they did 7 years ago, but at the point they run into VRAM issues, those same issues are going to affect someone who bought a 3070 just 3 years ago, regardless of the difference in 3D capability between the cards. I'm glad to see HUB still championing this point.
@lookitsrain9552 a month ago
They bought cards with 8GB of RAM for $800 and have to argue about it to make their purchase seem good.
@vmafarah9473 a month ago
1070 8GB, 2070 should have been 10GB, 3070 should have been 12GB, 4070 should have been 16GB, in my opinion.
@RobBCactive a month ago
Ironically, I remember HUB doing the opposite when many of us pointed out Nvidia was skimping on VRAM in 2020/21 and valued the 12/16GB configurations RDNA2 offered. All you needed to do was listen to game devs.
@highlanderknight a month ago
I agree about NVIDIA not putting enough VRAM on their cards. However, we are not forced to buy those cards. If you buy one, you have little right to complain.
@gamingunboxed5130 a month ago
@RobBCactive that didn't happen 😅
@Phil_529 a month ago
The first 8GB card was the Sapphire Vapor-X R9 290X in Nov. 2014.
@magikarp2063 a month ago
MSI had a 6GB version of the 280X, a midrange card from 2013.
@lennartj.8072 a month ago
Sapphire da goat fr
@Pixel_FX a month ago
Sapphire also created the first blow-through cooler. Then almost a decade later Jensen made nvidiots believe they invented it with the 30 series cooler lmao, it only took some 10 minutes of bullshitting about AI, aerodynamics, thermal dynamics, yada yada.
@mitsuhh a month ago
@magikarp2063 The 280X was high end, not midrange.
@tomstech4390 a month ago
@magikarp2063 A 6GB HD 7970 (same card) existed before that. There was no architectural change from the HD 7000 to the RX 200 series (except the 260X's Bonaire, which added TrueAudio). In fact, the shaders were unchanged from the HD 7000 through Polaris RX 400, and even then it was a switch to 2x FP16 instead of 1x FP32 units; Vega added rapid packed math, but again not *that much* changed. From the Fermi 200 series to RTX 20... GCN was amazing.
@807800 a month ago
Any GPU over $300 should have at least 16GB now.
@TTM1895 a month ago
ikr?
@Radek494 a month ago
The 4060 Ti 16GB should be $350, the 8GB version should not exist, and the 4060 8GB should be $250.
@MasoMathiou a month ago
12GB is acceptable in my opinion, but definitely 16GB above $400.
@N.K.--- a month ago
Even 12GB would make sense.
@Eleganttf2 a month ago
lol stop being DELUSIONAL wanting 16GB of VRAM on a measly $300 card. Besides, why would they need to put a massive 16GB of VRAM on it? Just take a look at how the RX 7600 XT, Arc A770 16GB or 4060 Ti 16GB do, especially at just $300, where I don't expect the GPU to perform well even at 1080p. Why would they put in 16GB? For "future proofing" BS? You do realize that if your GPU is getting weaker as it gets older, having more VRAM won't help, right? 😂 What we need is the right VRAM for the right PERFORMANCE SEGMENT, but for a $350-400 GPU I would agree that it needs 12GB at bare minimum.
@sergiopablo6555 a month ago
I don't know why, but this is the only channel where I click "like" even before the videos start. It may be the absolute lack of clickbait in the titles, how useful all of them are, or how relaxing it is to watch someone speak without yelling or jumping around.
@GankAlpaca a month ago
Haha, same. Just checked and I had already liked the vid instinctively. Maybe it's because the topic is so important to me and generally a thing that a lot of people think about.
@theslimeylimey a month ago
Sitting here still happy with my "ancient" 1080 Ti with 11GB of RAM and "only" 484 GB/s of memory bandwidth.
@N.K.--- a month ago
C'mon bruh, that's pretty legendary for 1080p and works decently at 1440p in any game except for games with horrible optimization.
@shagohodds a month ago
@N.K.--- That's BS. I own a 1080 Ti, a 3080 and a 4080; the 1080 Ti is pretty much dead in the water for most recent games at 1080p and certainly at 1440p.
@KnightofAges a month ago
@shagohodds You're coping hard for the fact you spent tons of cash on GPUs every gen. Except for ray and path tracing, the 1080 Ti runs pretty much everything at 50-60fps at 1440p, and is much faster at 1080p. Even in Alan Wake 2, the one game the 1080 Ti could not run above 30fps due to the mesh shader technology, they are going to put out a patch that will allow it to run the game at around 50-60fps. Now, you're free to spend thousands on a GPU every gen for small gains, but don't try to gaslight owners of the 1080 Ti, who know very well the resolutions and settings they're gaming at, as well as the fps they get.
@Phil_529 a month ago
@shagohodds Not really. The 1080 Ti is pretty much PS5/Series X power but lacks proper DX12 support. It's still a fine 1440p card if you're using upscaling. Avatar at medium settings with FSR quality mode averages 56fps.
@kosmosyche a month ago
@Phil_529 Well, it depends on what games you play, really. And it's not mainly a VRAM-related problem; in general it's getting really, really old for modern games. I had a GTX 1080 Ti for many, many years (too many, because of the crypto craze and my refusal to buy a new card at idiotic prices) and I'll be honest, I'd take an RTX 3070 8GB over it any day of the week and twice on Sunday, just because it works substantially better with most modern DX12 games, despite the lower amount of memory.
@chriscastillo291 a month ago
I love these VRAM videos! No one talks about this, let alone tests it. Thanks for the vids! ❤
@bns6288 a month ago
UKposts is full of videos like this ^^ The only rare comparison is 4GB vs 8GB imo. idk why we still talk about 4GB while 8GB is already too low.
@zodwraith5745 a month ago
Lolwut? There are a ton of videos chasing the VRAM boogeyman.
@tyre1337 a month ago
@zodwraith5745 Tons of VRAM drama clickbait videos, but very little actual deep-dive testing like this. The only other one I can think of is Daniel Owen.
@dave4148 a month ago
No one? Really?
@ElysaraCh a month ago
"No one talks about this"? It's been talked about by all the major tech channels since the 30 series launched lmao
@BUDA20 a month ago
Also, some of those results cross the 16GB RAM limit because of the low VRAM capacity. People with 4GB cards are likely to have 16GB of RAM, so performance will tank even more.
@kaznika6584 a month ago
That's a really good point.
@DivorcedStates a month ago
How is it that 16GB of RAM instead of 8, for example, is bad for a setup with a 4GB VRAM card? I don't understand.
@Pasi123 a month ago
@DivorcedStates Some of the games used more than 16GB of system memory with the 4GB VRAM card, so if you only had 16GB of RAM you'd see an even bigger performance hit. With the 8GB card that wouldn't be a problem, because the system memory usage was below 16GB. 8GB of single-channel RAM + a 4GB VRAM card would be straight from hell.
@Apollo-Computers a month ago
I have 16GB of RAM with 24GB of VRAM.
@gctypo2838 a month ago
@DivorcedStates The point is that if you're using a 4GB VRAM card, the extra VRAM demanded gets "spilled over" into RAM, taking a game that might require 12GB of RAM to requiring 17GB of RAM. If you only have 16GB of RAM, this spills over into swap/pagefile, which will tank performance _exponentially_ more. Very few people using a 4GB VRAM card will be running 32GB of RAM, which makes that 16GB RAM threshold so significant.
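The spillover chain described in this thread can be sketched as a toy model (illustrative only, not how any real driver allocates memory; all capacities and the 12GB/9GB workload figures are hypothetical):

```python
# Toy model of where a game's working set lands: VRAM first, then system RAM,
# then swap/pagefile. Purely illustrative; real drivers are far more complex.

def spillover(vram_gb, ram_gb, game_vram_need_gb, game_ram_need_gb):
    """Return (vram_used, ram_used, pagefile_used) in GB."""
    vram_used = min(game_vram_need_gb, vram_gb)
    spilled_to_ram = game_vram_need_gb - vram_used   # assets evicted to RAM
    ram_demand = game_ram_need_gb + spilled_to_ram   # base RAM need + spill
    ram_used = min(ram_demand, ram_gb)
    pagefile_used = ram_demand - ram_used            # overflow hits the pagefile
    return vram_used, ram_used, pagefile_used

# 4GB card: a 9GB VRAM workload pushes 5GB into RAM, overflowing a 16GB system
print(spillover(vram_gb=4, ram_gb=16, game_vram_need_gb=9, game_ram_need_gb=12))
# 8GB card: the same game stays inside both the VRAM and RAM budgets
print(spillover(vram_gb=8, ram_gb=16, game_vram_need_gb=9, game_ram_need_gb=12))
```

With the hypothetical numbers above, the 4GB card ends up 1GB into the pagefile while the 8GB card never leaves RAM, which is the 16GB threshold effect the comment describes.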
@Rexter2k a month ago
Boy, we are so lucky that current-generation GPUs have more VRAM than the previous gen, right guys? Guys?...
@shadowlemon69 a month ago
Double the VRAM, double the price.
@Azhtyhx a month ago
I mean, this is not exactly something new. Let's take a look at the Nvidia side of things, going back to the 200 series in 2009, using data for the '70 and '80 model cards:
* Two generations saw a 0% increase in VRAM compared to the previous generation: the GTX 500 series as well as the RTX 2000 series.
* The '70 has had, on average, 87.5% of the VRAM of the '80 in the same generation.
* The '80 has had an average increase in VRAM of 38.5% compared to the previous generation.
* The '70 has had an average increase in VRAM of 39.3% compared to the previous generation.
@shadowlemon69 a month ago
@Azhtyhx I'm talking about the price increases that came with more VRAM though:
GTX 480 1.5GB - $500
GTX 580 1.5GB - $500 (590 3GB - $700)
GTX 680 2GB - $500 (690 4GB - $1,000)
GTX 780 3GB - $650, 780 Ti 6GB - $700 (TITAN 6GB - $1,000)
GTX 980 4GB - $550, 980 Ti 6GB - $650 (TITAN X MAXWELL 12GB - $1,000)
GTX 1080 8GB - $600, 1080 Ti 11GB - $700
RTX 2080 8GB - $700, 2080 Ti 11GB - $1,000 (TITAN RTX 24GB - $2,500)
RTX 3080 10GB - $700, RTX 3080 Ti 12GB - $1,200, 3090 24GB - $1,500
RTX 4080 16GB - $1,200, RTX 4090 24GB - $1,600
Remember that the 10 series gave a massive jump in performance, while the 20 series increased the price with insignificant gains. The price hike that started in 2018 wasn't really justifiable, and it has kept increasing over the last 2 generations. So it is something new lol.
@upon1772 a month ago
@shadowlemon69 The more you spend, the more you save!
@OGPatriot03 a month ago
I've had an 8GB GPU since 2014... 8GB is absolutely out of modern spec these days. It would be like running a 2GB GPU back in 2014, which would've been low end for sure.
@gamingoptimized a month ago
Well, I know I made a mistake when I didn't spend 50 dollars more for a GTX 1060 6GB... (I have the 3GB one...)
@Radek494 a month ago
Ouch
@kingplunger6033 a month ago
Yeah, that is like one of the worst GPUs ever.
@GloriousCoffeeBun a month ago
I bought a 3070 over a 6800 just for the CUDA cores. Guess I am next 🫠
@baophantom6469 a month ago
Bruh ..
@PandaMoniumHUN a month ago
@kingplunger6033 Hard disagree, it was a great GPU for its time in 2016. Sure, even back then it made more sense to buy the 6GB model, but I was using the 3GB variant of that card between 2016-2019 without any issues. Every game having 4K and 8K textures only started a few years ago; that's when having way more VRAM started to become necessary.
@DJackson747 a month ago
I can't imagine getting anything less than 12GB these days. Granted, most of what I play are older games or low-spec games, but it still lets me play newer software at med/high settings comfortably. A 4GB card should be like $99 USD at this point, as it's gonna age like milk.
@erikbritz8095 a month ago
Should be $75 max for a 4GB GPU in 2024.
@ricky4673 a month ago
For me, I cannot do without at least 16. I prefer 24 to 32. Next year, I want 64.
@lorsch. a month ago
@ricky4673 32GB would be quite nice, really. A little more future proof than 24GB.
@mitsuhh a month ago
@ricky4673 You want 64GB of VRAM next year?
@sebastian38488 a month ago
Only 128GB of VRAM.
@ojassarup258 a month ago
Would be interested in seeing 8 vs 12 vs 16GB VRAM comparisons too!
@vmafarah9473 a month ago
As the 30 series sucks for VRAM, developers are forced to reduce game quality to accommodate the handicapped 30 series; 12GB is going to be enough for 1440p. Fc Ngreedia. The same company released 6GB 3060 laptops and 8GB 4070 laptops.
@OutOfNameIdeas2 a month ago
It's possible to use 18GB in modern games at 4K. The game will use what it can. If you have less than ideal, it will just stutter more and look worse, but it won't run out, because safety features keep you from running out. My friend had a 3080 with usage at 8.5GB, then got a 7900 XTX and saw those same games actually take advantage of more like 16-20GB. More than double the fps.
@c.m.7692 a month ago
... and vs 20GB!
@user-mz1if8oe9k a month ago
There are no cards that offer 8/12/16GB variants at the same time. But you can re-watch RTX 3070 vs RX 6800, or look for videos like "RTX 3070 8GB vs RTX 3070 16GB (modified)".
@zfacersane272 a month ago
@vmafarah9473 What are you even talking about? Not true... but I agree with the laptop part.
@CharcharoExplorer a month ago
The R9 390 was not pointless. Modders had a blast with texture mods and LOD/model mods on it.
@pumcia718 a month ago
Oh, I had one of those from Sapphire. The thing didn't care what game I threw at it at the time. Eventually it started thermal throttling and developed some other issue that I couldn't fix. In early 2019 I got the 16GB Radeon VII to replace it; that dude's still going strong.
@infinitycovuk699 a month ago
Had the Sapphire 390 Nitro, it was a beast of a card for the price.
@crazylocha2515 a month ago
Surprised me how well Steve put things, and it kept getting more interesting at every level. Great piece 👍 (Thx Steve)
@Ale-ch7xx a month ago
@24:29 Correction: the first Radeon card that had 8GB was the Sapphire Vapor-X R9 290X 8GB, released on Nov 7th, 2014.
@Hardwareunboxed a month ago
We're talking about the first official AMD spec.
@guille92h a month ago
Sapphire knows how to download VRAM 😂
@Ale-ch7xx a month ago
@Hardwareunboxed The way it was worded, I didn't know you were referring to the official AMD spec.
@OGPatriot03 a month ago
That was an epic GPU.
@andersjjensen a month ago
@Ale-ch7xx They always are. Cards are set to official spec clocks for core and memory if a Reference or Founders Edition is not available, etc. What individual board partners do is covered in specific model reviews.
@andrexskin a month ago
I guess we should already look at 8GB vs 12GB on "similar cards". An example would be the 3060 12GB vs the 3060 Ti 8GB, trying to spot whether there are already cases where the raster performance of the 3060 Ti isn't enough to offset the quality a 3060 12GB would achieve with higher texture quality.
@TheKims82 a month ago
HUB did test this earlier, where the 3060 actually outperformed the 3080 10GB. I believe it was in Hogwarts Legacy just when the game came out.
@andrexskin a month ago
@TheKims82 I think unoptimized RTX is kind of a niche case. It would be better to extend the tests to more cases.
@Pandemonium088 a month ago
Next up: 24GB of VRAM and the Unreal 5 engine still has traversal stutter 😅
@tenow a month ago
Recent UE5 titles lack good VRAM settings. Robocop, for example, always tries to fit into 8GB of VRAM and as a result always has texture pop-in. Saw it in some other recent titles as well.
@residentCJ a month ago
@tenow I thought Nanite was the holy grail in UE5 to eliminate texture pop-in.
@ROFLPirate_x a month ago
@residentCJ Only if you have enough VRAM to use it. Nanite doesn't stop texture pop-in; it better adjusts LOD over distance, i.e. the closer you get, the more detail it renders. If you go past the VRAM buffer, you are still gonna have pop-in, as your system is forced to use system RAM, which is much slower for the GPU to access. Also, devs have the ability to not use Nanite; it is quite resource intensive, so not every dev is gonna implement it.
@OutOfNameIdeas2 a month ago
Because not even 24GB is what you need for an 80-series card to use its power properly. I'm having trouble playing even Beat Saber on my 3080 with an Index.
@TheCgOrion a month ago
No kidding. I have NVMe, X3D, 96GB of RAM (for other work), and 24GB of VRAM, and games still manage to not take advantage of that. 😂
@SweatyFeetGirl a month ago
That's exactly for the people who say "your card will run out of power before you run out of VRAM". They don't know that textures don't give you a performance hit and are a free major visual upgrade.
@geraldmoogle8708 a month ago
Yeah, I got tired of even saying it most of the time. I suspect Nvidia is leveraging their AI bots to market this way; it's the only explanation I have for why people are so against getting a quality product at a reasonable price. They actively want to overpay for poor products, but that's the very opposite of my real-life experience, where people want to pay less for a great product and brag about getting great deals.
@therecoverer2481 a month ago
Does this stay true with lower-end GPUs like the RX 6600? I'm currently looking for one on the used market; they go for around 150 USD. Or should I go with an RX 6700 XT at like 200 USD?
@SweatyFeetGirl a month ago
@therecoverer2481 The RX 6700 XT has 12GB of VRAM compared to the 8GB on the RX 6600, and the 6700 XT is 40% faster, so yes, it's much better than the RX 6600! You spend 33% more money but you get 40-45% more fps; it's absolutely worth it.
@stangamer1151 a month ago
It depends on the certain game. But 4GB vs 8GB is still an extreme case. 4GB cards are either very old or very cheap, so no one should expect them to provide great results in modern games anyway. Talking about textures, it is not always the most noticeable visual change. Object and shadow pop-in is much more crucial IMO. When everything pops out of nowhere, it makes gaming unenjoyable. I'd rather sacrifice texture quality than draw distance. Also, low resolution shadows and low quality AA are very annoying too (constant flickering is awful). So for me it is draw distance first, shadow quality and AA quality next, and only then texture quality.
@triadwarfare a month ago
@geraldmoogle8708 They probably want their product to last longer. If tech moves too fast, your 8GB would be obsolete in a few years rather than lasting a console generation.
@TheZoenGaming a month ago
I always laugh when someone claims that more than 8GB isn't necessary on a card just because it isn't a high-end card. Not only does this show that they don't understand how VRAM is used, but mods can really crank up the amount of VRAM used by a game.
@Dragoonoar a month ago
Yeah, but let's be real here, the only game you'd mod the shit out of until it utilizes more than 8GB of VRAM is Skyrim. There are strong arguments against 8GB of VRAM, but mods ain't it.
@nimrodery a month ago
Sure, if you tweak your settings you can sometimes find a game where you run out of VRAM on one GPU while the equivalent GPU with more RAM keeps running fine, but in most cases you'll get bad performance with the higher-VRAM GPU as well. Basically, lower-end cards don't need as much VRAM because they won't see as much benefit. I wouldn't say "no" to extra RAM, but I'm not going to spend an extra $150 or $200 on it when buying a low-end GPU (not a dealbreaker). For the midrange and up, everything should be over 10/12GB.
@TheZoenGaming a month ago
@nimrodery I also laugh when people show that they failed to read the comment they are replying to, or, for that matter, that they didn't watch the video that comment is posted to.
@morgueblack a month ago
@nimrodery "Lower end cards don't need as much VRAM"... Bruh, did you even watch the video? Hardware Unboxed literally used a 6500 XT (that's a lower-end card, don't you know?) to disprove your point. What are you on about?
@nimrodery a month ago
@morgueblack No, I was talking about 8 vs 12 or 16. HUB already has a video about 8 vs 16 in the form of the 4060 Ti, which shows that for the most part there are no performance gains to be had. For instance, you can enable RT and ultra settings on both, and while you may run out of VRAM on one, the other GPU isn't putting out high FPS because of the settings and the fact it's a 4060 Ti. No GPU should have less than 8 at this point.
@Code_String a month ago
It sucks that some nice GPU chips will be held back by the VRAM buffer. The 12GB of my 6800M has been more useful than I could have imagined.
@roqeyt3566 a month ago
On mobile, that card was awesome for the generation. There wasn't a lot of VRAM to go around in affordable laptops, which the 6800M fixed.
@timalk2097 a month ago
@roqeyt3566 Bruh, you talk as if it's a decade-old GPU. It's still by far one of the best options for laptop GPUs atm (especially if you're on a budget).
@pvtcmyers87 a month ago
Thanks for creating this video and for taking a different look at the VRAM issue.
@9r33ks a month ago
Yes, textures make all the difference. Back in the day of my GTX 1080, I'd always try to pump up textures as high as I could, while lowering everything else as much as reasonably practicable. Textures make all the difference.
@viktorianas a month ago
Exactly. I put everything on low and then set high/ultra textures; next goes lighting, effects, then shadows, draw distance and other bells and whistles, with ray tracing at the very bottom of the priority list.
@9r33ks a month ago
@viktorianas Indeed. I hate when developers try "optimising" and "balancing" graphical presets. Devs always set texture resolution and detail way too low in their presets, making the game look like ass, while it could look so much better with some useless particle effects disabled or lowered and decent textures raised to a reasonable level. VRAM capacity and its importance really opened my eyes on the subject. I won't let Nvidia and AMD mislead me with "great deals" on GPUs with a reasonable price but low VRAM.
@geraldmoogle8708 a month ago
Nvidia doesn't want you to do this. They want you to crank up lighting and shadows while lowering textures. What's insane is also the motion blur modern games stuff in, and then adding the fake frames and a filter on top of that, pretending they are running at a higher resolution. It's really bad. The reason Nvidia can do this is AI marketing and their superior drivers. If AMD or Intel could deliver flawless drivers for games going back 20 years, there would be no reason to even look at Nvidia anymore.
@Rapunzel879 a month ago
That's incorrect. Resolution makes the biggest difference, followed by textures.
@defeqel6537 a month ago
@Rapunzel879 Nah, I'd often rather crank up shadows than resolution.
@user-lk5kn2tr7k a month ago
Great job with the testing, Steve. Thanks for the update on the VRAM matter.
@Jamelele a month ago
I have a 3080 10GB; great thing they released a 3080 12GB.
@Phil_529 a month ago
10GB was a terrible choice by NVIDIA. They marketed it as a "flagship", but it had less VRAM than the 11GB 1080 Ti from years earlier, for the same $700.
@mertince2310 a month ago
@Phil_529 It's a great choice for Nvidia*, bad for users. If they can get away with less VRAM and people even defend this choice, why would they put in more?
@Phil_529 a month ago
@mertince2310 Well, clearly they didn't have to. Ampere was peak crypto, and the day I got my 10GB for $700 I could've sold it for $1,500, followed by it being sold out for almost 2 years. To be fair, the 10GB at 4K was mostly OK for the first 2 years, but it fell off a cliff once "next gen" games started to come out. The Dead Space remake and RE4 were the final straw for me, and I upgraded to a 4090. Pretty sure the lack of VRAM is directly related to NVIDIA wanting to push AI users onto more expensive GPUs. At least now they offer a 16GB model for $450, so next generation shouldn't be so outrageous. These overpriced mid-range cards shouldn't be limited to 12GB while asking $600+.
@BitZapple a month ago
@Phil_529 Even with 10GB I personally never ran into actual problems, even playing games like Hogwarts Legacy at 1440p. I guess I just didn't reach that area yet, but even if I did, I could always go from Ultra textures to High (which looks the same), problem solved. There's even a mod that reduces VRAM usage.
@Phil_529 a month ago
@BitZapple Hogwarts Legacy ate VRAM like nobody's business at launch if you were using ray tracing (14.2GB at 1440p). That was another game that choked up my 10GB card. I also had problems with Need for Speed Unbound (easy fix: just go from ultra reflections to high and it's fine).
@xkxk7539 a month ago
Some notable past examples are the 980 Ti (6GB) vs Fury X (4GB) and the 780 Ti (3GB) vs 290X (4GB). VRAM helped push out more performance on those cards.
@Pixel_FX a month ago
The 390X came with 8GB and released the same month as the 980 Ti. Typical Nvidia planned obsolescence.
@XxGorillaGodxX a month ago
@Pixel_FX That was also at a good price, on a 512-bit bus. Those were better days.
@PhAyzoN a month ago
@Pixel_FX I loved my 8GB 290X (essentially identical to a 390X), but let's be real here: the 980 Ti was better across the board at everything, and for a far longer period of time. That extra 2GB didn't do a damn thing for the 390X.
@OGPatriot03 a month ago
@PhAyzoN Sure, but the 980 Ti was considerably newer than the Hawaii architecture. The fact that it was rebranded for so long was thanks to AMD's "fine wine" over the years. Could you imagine if the 290X had had that later-spec performance all the way back in 2013? It was purely a software holdup all that time.
@defeqel6537 a month ago
@PhAyzoN The GTX 980 Ti was also about 50% / $200 more expensive.
@metallurgico a month ago
My 2014 980 Ti has 6GB of RAM... 10 years later games still look like Half-Life 2 and we still have 4-8GB cards. That's real innovation.
@chillhour6155 a month ago
Yeah, these Unreal 5 games look like boring, unappealing garbage, but everyone seems to have drunk the Kool-Aid. They REALLY must've liked that Matrix demo.
@mitsuhh a month ago
@chillhour6155 Robocop is fun.
@TheBoostedDoge a month ago
Games actually look worse because devs rely more and more on upscalers instead of actually optimizing their games.
@Grandmaster-Kush a month ago
An upscaling crutch + TAA + poor optimization + high development costs + lack of interest in their work / homogenization of coding due to thousands and thousands of programmers and developers being churned out by "bideogame skewls", and you have uninspired, low-risk modern AA and AAA games as a result.
@metallurgico a month ago
@Grandmaster-Kush That's why I started studying bass guitar instead of gaming.
@MrSmitheroons a month ago
I had been meaning to do some testing like this myself but found it too daunting. You deserve massive props for doing this, and doing the subject justice. The conclusion section at the end going over the timeline of "how we got here", along with the context of the multi-preset, multi-texture-quality and visuals comparisons, it is just *chef's kiss*. So rounded and complete. The only way to give more context would be to show more "negative results" where nothing interesting happened, to contextualize which games it doesn't matter in. But I imagine in 2024 this is getting to be not that many recent games, for one thing, and would arguably bloat an already long video at half an hour. But this was just a really great video, and I really do agree it adds good context for this moment we're in, trying to see how well 8GB will do not just today (we already see it *start* to struggle in certain games), but in a few years down when 12-16+GB cards will be more normalized and devs will want to throw more "bang" into the higher presets to really give gamers something nice to look at in those presets. As you've shown it's not just about visuals, not just about performance, but often both. It either makes no difference, or you're starting to trade quality for performance despite the compute part of the card (actual GPU chip) being fast enough. The question is when it's going to be more relevant, but you make a strong case that it's "starting basically now," and that "the transition will probably be pretty complete within a year or three" to where 8GB will be felt badly as a limit. I know games will tend to be *launchable* with 8GB VRAM, but poorly optimized ports and AAA launches are still a thing, for those that jump on a title on day one (or worse, pre-order)... and you're still going to be leaving some performance *and* visuals or both on the table, that the GPU could have handled otherwise. 
I think it's high time we understood that Nvidia and AMD are cheaping out when they give us less than 12GB on a multi-hundred-dollar piece of hardware; the extra VRAM doesn't cost them nearly as much as they upsell us for 12GB+ models. I don't consider Intel to be a trend-setter in this area, but at least they have leaned into VRAM capacity a lot of the time, so I don't suppose they're too egregious, and I hope they're listening as well. Sorry for the long post, this has been a topic on my mind for some time now. Thanks much again for all the testing and well-considered conclusions. Cheers to you and the team!
@Kelekona_808 1 month ago
Highlighting the visual differences in the Vram images was very helpful for me. Usually I fail to see the differences between cards when going over visual comparisons.
@Kiyuja 1 month ago
I think many people don't realize that modern games don't just crash with insufficient VRAM, but rather dynamically lower asset quality in order to prevent crashes or stutter. This doesn't mean you don't "need" more. Especially these days, where DLSS and RT are more and more important: those techniques store temporal data in VRAM, and that scales with your resolution. I consider 12GB to be low end these days.
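To put rough numbers on the point about per-resolution temporal data, here is a back-of-envelope sketch of how a single full-resolution render target grows with resolution. The RGBA16F format and the idea that upscalers keep such accumulation buffers are assumptions for illustration, not measurements from any specific game.

```python
# Back-of-envelope size of one full-resolution render target (e.g. a
# temporal accumulation buffer). Figures are illustrative assumptions.

def buffer_mib(width: int, height: int, bytes_per_pixel: int) -> float:
    """Size of one width x height render target, in MiB."""
    return width * height * bytes_per_pixel / 2**20

# An RGBA16F target is 8 bytes per pixel:
for w, h in [(1920, 1080), (2560, 1440), (3840, 2160)]:
    print(f"{w}x{h}: {buffer_mib(w, h, 8):.1f} MiB per RGBA16F target")
```

A renderer keeps many such targets (depth, motion vectors, history buffers), so the per-resolution cost multiplies quickly.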
@menohomo7716 1 month ago
"no you don't understand you NEED to preload 8 gigs of texture in dedicated memory" -the devs probably
@Splarkszter 1 month ago
To be fair, textures do the majority of the job of making a game look good. Yes, 4K (or larger) textures are absolutely dumb. But oh well.
@sush7117 1 month ago
Well, yes. You can see in this video what happens when textures are loaded into RAM instead of VRAM.
@Splarkszter 1 month ago
​​@@noir-13 Look up the storage space difference. It may seem like a small number, but the growth is exponential. More resolution doesn't necessarily yield more detail. It's an incredibly widespread issue; the whole dev market is filled with people who don't know what a "job well done" is or even means. 2K is fine, I guess, and 1K still is. 4K or more is just unnecessary because of the storage space consumption (which equally applies to texture loading speed, world loading speed and VRAM consumption).
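The growth the commenter describes is, strictly, quadratic in the side length: each doubling of texture resolution quadruples the memory and storage footprint. A quick sketch with uncompressed RGBA figures (real games use block compression, which shrinks everything by a roughly constant factor, so the ratios still hold):

```python
# Quadratic growth of texture memory with resolution: doubling the side
# length quadruples the size. Uncompressed RGBA, purely illustrative.

def texture_mib(side: int, bytes_per_texel: int = 4, with_mips: bool = True) -> float:
    base = side * side * bytes_per_texel
    # a full mip chain adds roughly one third on top of the base level
    return base * (4 / 3 if with_mips else 1) / 2**20

for side in (1024, 2048, 4096):
    print(f"{side}x{side}: {texture_mib(side):.1f} MiB uncompressed")
```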
@DarkNia64 1 month ago
Sounds like someone doesn't care to do their own research ​@noir-13
@mikehawk6918 1 month ago
@@noir-13 Sounds like you're 12 years old. Also seems like it judging by your profile picture.
@mopanda81 1 month ago
Once again doing tests on lower spec hardware at 1080p gives us a lot of perspective that doesn't exist with new card reviews and ranked framerate lists. Thanks so much for this work since it really fleshes out the whole picture.
@axlfrhalo 1 month ago
Halfway through the vid, but I love this. It gives a very good understanding of how VRAM makes a difference, exactly when textures do and don't impact performance, and just how drastic it can get.
@teardowndan5364 1 month ago
The difference between 4GB-to-8GB and 8GB-to-12+GB is that the baseline image quality has a much higher starting point, which makes lowering details to save VRAM and VRAM bandwidth far more palatable with 8GB when shopping in the $150-250 new range today. Anything much over $250 really should have 12GB at a bare minimum.
@vulcan4d 1 month ago
Just add sockets on the back of the GPU to upgrade the memory and create a new industry standard of upgradable vram modules.
@kenshirogenjuro873 1 month ago
I would SO love to see this
@klv8037 1 month ago
Nvidia's not gonna like this one ☠️☠️🙏🙏
@kajmak64bit76 14 days ago
​@@kenshirogenjuro873 That is actually possible. I saw a video of some dude testing a modified RX 580 from China that had 16GB of VRAM (or was it 32?). And it's real: it worked, games used the extra VRAM, and everything detected it. So it's entirely possible to just add more VRAM via soldering, but it may vary from card to card.
@Starkiller12481 1 month ago
Great video! Been looking for guidance on VRAM/RAM behavior under different texture loads this was very informative. Thanks gentlemen 🙏🏾
@BaggerPRO 1 month ago
Very informative and descriptive. Thank you!
@Zayran626 1 month ago
would love to see a video like this with the 4060 Ti 8 GB / 16 GB cards
@vvhitevvizard_ 1 month ago
The 4060 Ti 8GB should not exist at all, and the 4060 Ti 16GB should be priced at $350. Well, $400 at most.
@Rachit0904 1 month ago
There already is! Look at the 4060 Ti 16GB review
@TheIndulgers 1 month ago
I don't know why people defend nvidia (trillion dollar company btw) for skimping on vram. This same company gave you 8gb for $379 EIGHT years ago with the gtx 1070. People out here coping over their $800 4070ti purchase. 50% more vram for over double the price 3 generations later doesn't sound like progress to me.
@vylrent 1 month ago
These videos are so nice to listen to. I just get a good dose of tech info so I can stay updated (I haven't been in a few months), delivered in a relaxing, not-jumpy voice, unlike other tech channels. Good job, HU team.
@Celis.C 1 month ago
Interesting video idea: statements/stances you've made in the past that you might reconsider now, as well as the why of it (new insights, new developments, etc).
@mxyellow 1 month ago
I sold my 3070 as soon as I saw HUB's 16GB vs. 8GB VRAM video. Best decision ever.
@Pamani_ 1 month ago
20:48 I think there is an error here. The perf shown for the 4GB is at high textures while the script says it's low textures
@adnank4458 1 month ago
Agreed, I noticed the same error. Don't know if it's the only one.
@justhereforgear3441 1 month ago
Thank you for the in depth breakdown. Very informative.
@Hamsandwich5805 1 month ago
I really liked this video, HU! Love the work you do. You ask good questions and seek the tough answers. Just wanted to add: in some of those games where RAM usage crept over 12GB, you'll be in a really tight spot with most budget builds.

Assuming you opted for the 6500 XT for budgetary reasons, I'd reckon you likely also only have 8 or 16GB of system RAM. That's going to make it nearly impossible to rely on system RAM as backup, even when your card's 8GB buffer is fully consumed. Windows will still take several GB of RAM, so those games will likely be totally unplayable or run very poorly when loading or switching between apps (the challenges of alt-tab, like in the Windows XP days). I'd really like to see a system that matches the budget direction - maybe a 3600/5600 CPU or i3 alternative with 16GB of RAM (or maybe even 8GB) and a 6500 XT 8GB - alongside the "ideal" 7800X3D scenario, to really compare how these will work for budget builds in years to come. People may not realize just how poorly the system will perform, forgetting you're testing on a 7800X3D with 32GB+ of RAM.

For future technologies, I'm curious how DirectStorage can help users with tighter budgets take advantage of speedy, low-cost NVMe drives instead of relying on potentially slower system RAM (what a bizarre statement, considering how slow HDDs were!). It's possible most new games will roll out DirectStorage alongside 1TB PCIe 4/5 NVMe drives that are relatively fast compared to budget GPU buffer speeds, meaning you won't need the upgrade to 16GB as quickly for 1080p gaming. Thanks! Keep up the awesome work!
@coganjr 1 month ago
I love how this community is always complaining about wanting more VRAM while game developers can freely get away with releasing unoptimized games. What an awesome community LOL
@Hardwareunboxed 1 month ago
Don't see how the two are connected, but okay. It would be very odd to expect modern games to work well on old 4GB graphics cards right?
@coganjr 1 month ago
@@Hardwareunboxed Yes, modern games will not run well on 4GB or even 8GB of VRAM. What I mean is, look at recent unoptimized games that need a lot of VRAM but don't have the graphical fidelity to match, like Alan Wake 2 and Cyberpunk 2077.
@Hardwareunboxed 1 month ago
Alan Wake 2 is pretty heavy on VRAM, especially if you use RT, while CP2077 has fairly poor texture quality.
@lencox2x296 1 month ago
@HUB: So the lesson to learn from the past: in 2024 one should recommend 16GB cards only, or at least 12GB as the bare minimum for mid-range GPUs.
@Ariane-Bouchard 1 month ago
One thing I wish you'd done more of is zoomed in/slowed down scenes for visual comparisons. Most visual differences I wasn't able to see on my phone, even though I tried zooming in manually. Making things bigger, focusing on the important details, would've probably been a relatively simple way to get around that issue.
@TheHurcano 1 month ago
I really appreciate how clear this comparison illustrated the differences. Now I am really curious about performance and quality differences in similar gaming titles when going from 8GB VRAM to 12GB (or 16GB) on the next tier up in cards, especially when comparing 1080P to 1440P along with varied quality settings at both resolutions. Seems like that comparison could end up being a little less obvious on what the "correct" breakpoints of acceptability/desirability are, but might be more relevant to buyers in the $300-$400 market.
@MrHamof 1 month ago
So what I'm getting from this is that the 6500XT should always have been 8gb and would have been much better received if it was.
@Hardwareunboxed 1 month ago
Yes
@TrueThanny 1 month ago
Yes and no. Yes, it would have been better with 8GB. No, it would not have been better received, because it was a pandemic card, created explicitly to hit the $200 price point in a time of drastic supply chain disruption. Doubling the VRAM would have notably increased its price at the time, and it would still have received negative reviews on that basis, even from HUB, which ignored the effect of the pandemic on pricing of all cards for some bizarre reason. Specifically, they compared AMD's MSRPs for cards released in the midst of the pandemic, which accounted for supply chain disruption, to nVidia's fictional MSRPs for cards released before the disruption. The 6500 XT was only ever supposed to be a stop-gap measure that allowed people to get a functional graphics card for $200, when used versions of much slower cards were selling for a lot more. AMD should have at the very least ceased production of the 4GB model after supply chains cleared up.
@defeqel6537 1 month ago
@@TrueThanny Indeed, pandemic pricing for GDDR6 was about $15 / GB (while it is around $3 /GB now), extra 4GB would have cost about $60 more + OEM/retail margins which are often based on the price of the product (so about $80 more total)
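The commenter's arithmetic can be checked directly; these are their own assumed figures, not verified market prices:

```python
# Checking the commenter's arithmetic (their figures, not verified prices).
price_per_gb = 15                          # pandemic-era GDDR6, USD per GB
extra_vram_gb = 4
bom_delta = price_per_gb * extra_vram_gb   # extra cost on the bill of materials
retail_delta = round(bom_delta * 4 / 3)    # ~33% combined OEM/retail margin on top
print(bom_delta, retail_delta)             # 60 80
```

At today's quoted ~$3/GB the same 4GB would add only about $12 of BOM cost, which is the core of the thread's pricing complaint.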
@ivaniliev2272 1 month ago
I will tell you why. Because game developers are too lazy to optimize their games. VRAM should not affect framerate much; it should affect draw distance, texture resolution and mesh LODs (levels of detail). If a game is overfilling GPU memory, it means that someone, somewhere did a really bad job, and I say this as a game developer.
@KimBoKastekniv47 1 month ago
You're the first channel I go to for day-one reviews, but these "investigation / for science" videos are the cherry on top.
@madarcturian 1 month ago
We need more videos about this. Thanks so much, guys. It's sad how little VRAM some cards have these days. I pray I live to see affordable cards with a lot of VRAM. VR and proper image upscalers really need a lot of VRAM.
@Jomenaa 1 month ago
My GTX 1070 Ti still going strong, OC'd to GTX 1080 lvls of performance and those sweet 8GB's of GDDR5 :)
@pivorsc 1 month ago
Doesn't more VRAM prevent random stuttering? When I play CoD my GPU uses around 14GB of VRAM, which I believe is preloaded data to avoid loading from the drive.
@imo098765 1 month ago
It's not so much that more VRAM reduces random stuttering as that running out of VRAM introduces stuttering. CoD just asks for everything possible; you won't see a difference versus a 12GB card, it's just an "in case we need it" allocation.
@sebastian38488 1 month ago
Definitely! On 24GB GPUs everything works smoothly.
@andersjjensen 1 month ago
Very simplistically you can say "There are two amounts of VRAM: Enough and Not Enough". Having more than enough doesn't help with anything. Having less than enough hits 1% lows the hardest and the AVG progressively harder the bigger deficit you have. Until the whole Direct Storage thing is rolled out the system RAM is still being used as a staging area before dispatch to VRAM, rather than the VRAM being used as a buffer for things that are still (far) out of view. This means that game stutters can also occur if you're low on system RAM, despite having enough VRAM.
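The "Enough vs Not Enough" point above can be sketched as a deliberately pessimistic toy model: anything that spills out of VRAM must cross the PCIe bus, so frame times degrade progressively with the size of the deficit. All numbers here (baseline frame time, effective PCIe bandwidth, re-fetching the whole spill every frame) are illustrative assumptions, not measurements.

```python
# Toy model: assets that don't fit in VRAM are fetched over PCIe each frame.
# Numbers are illustrative assumptions only.

def frame_ms(working_set_gb: float, vram_gb: float,
             base_ms: float = 8.0, pcie_gb_per_s: float = 12.0) -> float:
    spill_gb = max(0.0, working_set_gb - vram_gb)      # data that didn't fit
    return base_ms + spill_gb / pcie_gb_per_s * 1000   # added transfer time per frame

print(frame_ms(6, 8))   # fits in VRAM: baseline frame time, extra capacity buys nothing
print(frame_ms(9, 8))   # 1 GB short: frame time balloons
```

Real engines amortize streaming across frames, so the effect shows up first as 1% low spikes rather than a uniformly slower average, which matches the benchmark behavior in the video.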
@andersjjensen 1 month ago
@@sebastian38488Not if you chuck them in a system with 8GB system RAM.
@nicktempletonable 1 month ago
Loved this video Steve!
@Petch85 1 month ago
Always test whether you can run the game with the highest texture setting. Sometimes you get a lot of extra quality for little or no performance loss.
@toad7395 1 month ago
It's so scummy how NVIDIA refuses to give their cards more VRAM (and when they do, it's absurdly overpriced).
@hasnihossainsami8375 1 month ago
I think the biggest takeaway here is that using higher quality textures almost never hurts performance, and so GPUs with insufficient VRAM should be avoided like the goddamn plague. 4GB cards are absolutely dead at this point, 6GB doesn't have much longer left, and 8GB only falls short in some newer games and in edge cases in others. Buying an older 8GB card at discount/second-hand prices still makes sense, imo. But newer GPUs? Considering the vast majority don't upgrade for at least 3 years, they aren't worth it. 10GB is the minimum for a new GPU.
@itsyaboitrav5348 1 month ago
You should compare running GPUs on PCIe gen 3 vs gen 4 next, looking at whether people on gen 3 boards are being held back on their gen 4 cards.
@mashedindaed 1 month ago
Great video, I didn't realise VRAM had such an impact on performance once it had effectively been saturated. One slight critique of the charts, especially when talking about percentage differences: always compare in the same direction, otherwise the numbers can be misleading. For example, 15 is 50% more than 10, but 10 is 33.3% less than 15, so the direction of comparison matters a lot; the gap between 33.3% and 50% is potentially massive. Alternatively, an arrow on the charts indicating which way you're comparing could help de-obfuscate the numbers.
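The directionality the commenter describes is easy to state precisely: a relative difference is always a percentage *of the baseline*, so swapping the baseline changes the number. A minimal sketch:

```python
# Relative differences are directional: always state the baseline.
def pct_vs(value: float, baseline: float) -> float:
    """Signed percentage difference of `value` relative to `baseline`."""
    return (value - baseline) / baseline * 100

print(pct_vs(15, 10))  # 50.0   -> "15 is 50% more than 10"
print(pct_vs(10, 15))  # -33.3… -> "10 is ~33% less than 15"
```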
@ravanlike 1 month ago
Speaking of VRAM, yesterday a Polish tech YouTuber compared a 3070 8GB vs a 3070 16GB (modded, with all memory dies replaced). The findings were very similar to yours from when you compared the 3070 8GB with a professional GPU (in reality a 3070 chip with 16GB): 1% low fps was higher with more VRAM, especially when running games with RT enabled. ukposts.info/have/v-deo/cZ2IfphpbKaGmXU.htmlsi=er0AK11pJyAEeMNC&t=172
@adink6486 1 month ago
How is it so hard for the fanboys to understand? Obviously you want to max out most of the graphics settings when you spend $600+ on a GPU. That's the whole point. We should never need to worry about running out of VRAM if the company gives us enough in the first place. Why would I want to lower my graphics settings just so I have enough VRAM to run the game? We want NVIDIA to treat us better. That's it.
@innie721 1 month ago
I want AMD to treat us better too. Imagine buying their highest end offering like the 7900 XTX only to get 25fps at 1440p with RT on medium in Alan Wake 2, a AAA title and one of the best titles of 2023. Scammed.
@defeqel6537 1 month ago
@@innie721 AMD is irrelevant in the market, and memory subsystems are a "solved" problem: it would be trivial for nVidia to increase VRAM amount. Heck, with the 3080 they had to put in work to cut the VRAM amount from 12 to 10 GB, by disabling parts of the chip.
@MarcABrown-tt1fp 1 month ago
@@innie721 Again, it isn't ray tracing, it's ray-traced lighting and shadows. Nvidia may as well call it RTLS. Were it true ray tracing, the game would be unplayable. It's bad enough that "RT" has as big a performance hit as it does on Nvidia, let alone AMD.
@alpha007org 1 month ago
Thank you for not being upset, Steve. Mental stability is equally important as frametime stability.
@RetrOrigin 1 month ago
This also helps demonstrate that texture quality/resolution doesn't really affect framerates much, if at all, as long as you have enough VRAM. I often see people turn texture quality down even when their video card has more than enough VRAM, thinking it will help performance, when it usually doesn't.
@gregorsamsa555 1 month ago
I guess 4GB RX 580 performs weaker than 8GB RX 570 in latest modern games...
@patricktho6546 1 month ago
It was really worth it, to go for the 8 GB version of the R9 290 X, instead of the 4 GB version.
@hrayz 1 month ago
When my R9 290X-4GB card died I was able to get the RX 580-8GB card. That 8GB has allowed the card to survive to this day (good for 1080p gaming.) My friends and roommate are still using theirs, although I moved on to the RX 6900XT-16GB for 4k use.
@andersjjensen 1 month ago
I just ordered a Ryzen 7 7840U based laptop, which has 780M integrated graphics. The 780M and 6500XT are very evenly matched in terms of performance (+/- 10% for either depending on title). I opted to cough up the premium for 64GB (shared) RAM, as I tend to hold on to my laptops for a very long time (the one I'm typing this from is 12 years old). But that could very well come in handy, as shown here. HD texture packs for old games are a thing, and with more and more games targeting a Steam Deck preset I guess I'll actually be playing on this thing when I'm too lazy to get off my ass. What I find insane is that the 780M is a 15W solution while the 6500XT is a 107W solution. TSMC DUV N6 vs TSMC EUV N4 really makes one hell of a difference.
@ziokalco 1 month ago
If I'm not mistaken, games such as Hogwarts Legacy dynamically adjust the real texture quality when VRAM is saturated. The results may be missing some data.
@OutOfNameIdeas2 1 month ago
Nvidia was the dumbest buying decision I made in years. I bought the 10GB 3080. It lasted me one month before it got limited by its VRAM. 4K max settings are undoable. It's not even good enough to run Beat Saber with an Index at 120fps.
@lorsch. 1 month ago
Same, replaced it by a 3090. For VR 24GB is definitely needed.
@Azureskies01 1 month ago
everyone with 3070s and 3080s are now on suicide watch
@Nurse_Xochitl 1 month ago
I still use a card with 4GB of VRAM. I'm pissed, not so much at NVIDIA for not including more VRAM (although I'm still pissed at them for different reasons: I use Linux and they have no open source drivers for the GTX series of cards), but at the gaming industry as a whole. The only thing the gaming industry optimizes is monetization, not the games themselves.

A prime example of this is "EOMM", a way to rig matchmaking to increase "engagement", which basically refers to how much people play and pay. Modern games do NOT use SBMM! SBMM is actually a good thing. Game companies hate it because they can't milk players as well, and content creators/streamers (along with toxic no-life "veteran" players) do NOT tell the truth about it, because they hate that they can't "pubstomp" new players when SBMM is properly implemented. Content creators/streamers are often paid shills, so when EOMM is brought up they act skeptical at best, or tell lies about it (falsely calling it SBMM), likely to cover the game companies' asses, because otherwise they could lose their partnerships, get hit with lawsuits and DMCAs, etc. Combine that with grinding/progression/unlocks (and of course "FOMO" limited-time events), and every player who doesn't spend a bunch of time grinding or money on "Pay To Progress" (which is Pay To Win) crap like boosters will always fall behind, be at a disadvantage, and generally have a worse time gaming. Engagement Optimized Match Making: web.cs.ucla.edu/~yzsun/papers/WWW17Chen_EOMM

On top of that, I'd imagine there are probably some sketchy deals going on behind closed doors between hardware manufacturers and game companies. Perhaps game companies get new high-end hardware at a huge discount and/or are incentivized to "optimize" their games to run "well" only on the latest, highest-end hardware, while not giving older/weaker hardware any real optimization. ("Optimize" and "well" are in quotes because they don't actually make the game run well, just barely playable on the newest, most powerful hardware, so manufacturers can sell more upgrades.)

I would not be surprised if there was this much corruption in the gaming industry, as I have seen a lot of it personally as a gamer, and I follow the game industry somewhat closely. (BTW, as a nurse I can say the healthcare industry is also very corrupt, with big government and big pharma lobbying.) Heck, even the Australian government is defending microtransactions via state-owned media: "In Defense of Microtransactions | THE LOOT DROP", ABC Gamer, ukposts.info/have/v-deo/qoWfg2OloayHu3U.htmlfeature=shared

TBH, we really don't need more than 4GB of VRAM if game companies would just optimize their stuff. People could just say to buy a better card, but then it's only a matter of time before that card also becomes useless, which generates more and more e-waste. Not everyone can afford to do that anyway. There has to be a point where people put their foot down and crack down on bad optimization. People need to stop buying new hardware, especially higher-end hardware, and use their stuff longer. They also need to stop supporting bad games (online-only/DRM'd games, games full of grinding/P2W/gambling, games without self-hosted server support, games without modding support, etc.). Only then will we see a change in the industry.
@hmst5420 1 month ago
Great video quality btw. Looks like something's changed in your equipment
@LlywellynOBrien 18 days ago
Were the screen caps for the texture quality in Warhammer backwards? It just looked like the ultra one with the additional effects was on the right. Unless this is a Cities: Skylines 2 situation again.
@happybuggy1582 1 month ago
1060 3GB owners are crying
@alternaterealityfilms 1 month ago
7800GTX 256MB owner here. Literally all you need
@kajmak64bit76 14 days ago
The 1060 3GB was a mistake and should never have been made. Or if it had to be made, it should at least have had 4GB of VRAM, since the GTX 1050 Ti has 4GB. Like, wtf?
@dianaalyssa8726 1 month ago
Great video. Am a bit curious how the old 3gb (thinking about 1060 3gb) vs 4gb vs 8gb vs 10gb would compare. I consider 12 and up for new, main rigs, if the budget is there.
@ShaunRoselt 1 month ago
I'd love to see this, but at 4K. It's really interesting to see the RAM usage.
@neurofiber2406 1 month ago
Another great video Steve. Since I'm sorely in need of a GPU upgrade...
@FreeWillMind 1 month ago
Texture streaming is another instance where a lack of VRAM is painful, since texture pop-in is such an obvious graphical glitch.
@carlkidd752 1 month ago
Console/PC gaming is my hobby, versus cars or camping or fishing. So I budget for the best I can afford at any particular time. My last and longest-used GPU was an EVGA 1080 Ti FTW3, which my grandkids now have. Through various monitors, including my current Hisense 55U8G, a 4K 120Hz TV, it performed outstandingly.

When, briefly, the 7900 XTX went on sale for $900, I upgraded. Very pleased with my Hellhound, and what I play doesn't support RT. As the 1080 Ti also had no RT capability, I don't miss not having it. From your reviews and others, reasonably priced GPUs (under $600) have a tough time at 4K, let alone with RT enabled. For me, no RT and spending $500-550 for decent 1440p/4K gaming makes a compelling buy-AMD argument. Heck, imagine EVGA making new 1080 Tis for $400.

Want versus need. I didn't need a new GPU, but I wanted all the non-RT eye candy at 4K. If your desire for "better" starts to overwhelm what you really need... skip the Friday night trip to the bar, skip Starbucks (my talk and text is $20 a month), keep ONE streaming service, not 4 or 5 or 6, and shoot, in a couple of months you could buy any GPU you want.

I'm retired, so I already do those "skip" things, which is why I could buy a $900 GPU. I could have bought a $1600 4090, but its RT functions would have been wasted. The $700 premium for a few more frames would equate to lighting cigars with $50 bills. I don't need or want a Rolls-Royce for trips to the grocery store.
@rxpacman1893 1 month ago
Would've been good to hear how memory bus width and memory type (i.e. GDDR6X) play a part in all this. Good vid though 👍
@milanbajic242pcchannel 1 month ago
For me, in a game's settings, global texture quality is always the most important setting, and only then the others. I can play with High+ textures and everything else on Medium at 60 FPS without any problems; that is my MINIMUM, below which I would not go. My PC: 1080p, R5 5600X, (2x8) 16GB 3200MHz CL16, RTX 2070 8GB. For now everything works as I wish (playing games with everything on High+ settings). Thanks for the amazing video, have an awesome day, Hardware Unboxed. 👍❤💯🔔
@timduck8506 1 month ago
I'm so glad I bought an RTX 3080 16GB laptop version for my travels 3 years ago. Total specs are 32GB of RAM and a 5900HX CPU with 8TB of storage. 😃
@johnk.7836 1 month ago
Appreciate you sharing the benchmarks and the knowledge - this is how we learn and become better shoppers and consumers.
@mleise8292 1 month ago
24:59 While you are right that the Vega56 wasn't an affordable 8 GB card originally, it was still the latest AMD offering when it came down in price to 260€ (VAT _not_ included) in the first half of 2019. I still use mine. 😅
@kicsiduda 1 month ago
Very well made video, thank you
@dot_boi 1 month ago
I'm glad you are bringing this issue to the table. For Linux folk using Proton, VRAM becomes even more of an issue. While the software can improve the compiled shaders to be more optimized, I think due to how the translation technology works it will by nature always use more VRAM. One specific example is Phasmophobia. It's honestly a very easy game to run on Windows: I get 180+ FPS constantly. On Linux it's a different story: the VRAM demand is around twice as much (over 8GB!!!!), forcing me to use lower texture quality even though the GPU core is fully capable of running the game at high frame rates.
@EmblemParade 1 month ago
To be honest, this increasing appetite for VRAM happened faster than most of us expected. Some devs were signaling that this would happen for a while, but we assumed that the baseline represented by PS5 and Xbox Series X would limit requirements for PC, too. The takeaway is that we can't assume that anymore. Devs are targeting a higher resource profile for PC gaming than for consoles, period.
@ace100hyper3 1 month ago
Great job as always.
@rangersmith4652 1 month ago
My first 100% home-assembled PC sported an FX-6300 and an R9 270X 4GB. Yes, even way back in 2014, going AMD meant we could opt for extra VRAM.
@kumbaj1612 1 month ago
When did you add Immortals of Aveum? First time I saw it was here, and it looks amazing... might play it.
@KITOMERO 1 month ago
*Cries in buyer's remorse after buying a brand-new "RTX" gaming laptop with 6 GB of VRAM*
@mitchellgould7405 1 month ago
Whilst I understand using your usual test platform of the 7800X3D with DDR5, as it mitigates CPU/platform bottlenecks, dialing it back to a 5800X3D on DDR4 would have been a fairer pairing with an entry-level GPU. I am sure this would have shown a greater divide between the 4GB and 8GB results, as the memory would have been less capable of taking the texture load. Keep up the good content, guys.
@aaron_333 1 month ago
Very nice! I wonder if you can do one which compares 10GB, 12GB and 16GB?
@tech6294 1 month ago
Great video as usual! ;)
@rightwingsafetysquad9872 1 month ago
Meanwhile in laptops, a 4070 costs $250-$300 MORE than a 4060 and both come with 8GB of memory. I remember buying an RX 480 with 8GB for just $250 over 7 years ago. Even back then we were saying that 4GB wasn't enough for 1440p.
@jimmyjiang3413 1 month ago
It reminds me why professional (Quadro) RTX and Radeon Pro cards utilize double the VRAM buffer. I am not sure whether something like the RTX 4000 SFF (Ada) is worth it for a given SFF build for the VRAM buffer alone, or simply for performance per watt. 🤷🏻‍♀️
@SPPACATR 1 month ago
Dude sounds sooooo chill in this video lol.
@ipotato95 1 month ago
This would be a great video to revisit with 8GB vs 12GB when the next generation of graphics cards is announced.
@paulbrooks4395 1 month ago
It's important to gauge the value of memory amounts by the years between upgrade cycles. For people who wait 4-6 years, getting more allows the product to age more gracefully. For those who replace every other generation or sooner, it's not as much of a factor, though it has unexpectedly become material in the last couple of years. The problem for me is that any card that can't retain the data in VRAM on high settings now will "push" people to upgrade, or leave them dissatisfied, before the card is ready to be recycled. This will lead to less reuse and more consumption. Technology that lasts longer for simple reasons and can be sold on the second-hand market for another, happy life is a net win for everyone and the environment.
@Nurse_Xochitl 1 month ago
I wish people would quit shilling high end/more powerful hardware. What they're really doing is hurting people who can't afford said hardware by normalizing people having more powerful hardware... which gives devs less of a reason to optimize. Less optimization = more hardware upgrades needed = more e-waste. It would do wonders if hardware just stopped getting faster altogether.
@CameraObscure 1 month ago
In 2016, I bought the RX 480 8GB (£220) new when it released, because I knew about upcoming VRAM issues: games in the pipeline were going to use higher-resolution textures. That paid off for me. Just glad I am technically knowledgeable enough to follow technical developments and their implications once those technologies are implemented. The following year I bought an R7 1700 when that released. That combo served me well until I upgraded earlier this year.
@_Jayonics 1 month ago
I know the purpose of this video is to show what is needed and that graphics cards need more memory. But for cases where buying a new graphics card is not an option (like if it's soldered, in the case of a laptop) or just not something you want to do, what other options do you have to improve performance in VRAM-constrained environments?

On the settings and software side, what settings should you change to keep VRAM usage low? Just textures? Is it worth using DLSS or FSR to render at a lower resolution while putting more processing overhead onto the GPU?

When the VRAM buffer is exceeded, system RAM is used. How much benefit would you get from upgrading your DRAM to something of higher density, higher frequency, lower latency, etc.?

I think how system RAM affects graphics performance is going to become increasingly important as the AMD Strix Point and Strix Point Halo APUs take market share from low-end GPUs. And I suspect Intel will follow suit with Battlemage APUs, considering Iris Xe APUs exist. In that scenario, DRAM is the only source of VRAM, so it has even more impact.