bm23
Active Member
It seems that everyone is clamoring for at least 4GB of VRAM on GPUs, but in my experience VRAM hasn't been that crucial (my stats below). I understand that many people game at resolutions higher than 1080p, but even at 5760x1080, GTX 770 2GB SLI is still doing a damn fine job (LINK). So my question is: are gamers too caught up in VRAM? How much difference does it actually make in real-world gameplay? I'm asking because this rush for more VRAM doesn't quite add up with the benchmarks. Perhaps I'm missing something. Either way, maybe this discussion can prove useful for those who feel pressured to upgrade to GPUs with more VRAM.
My current rig:
GTX 770 2GB SLI
i7-5960X
32GB 2400MHz DDR4
Games fps:
Shadow of Mordor: 30 @ 4K, ultra | 50-60 @ 3K, ultra | 85-95 @ 1080p, ultra
Witcher 3: 35-45 @ 2715x1527, ultra, NVIDIA HairWorks on Low | 45-60 @ 1080p, ultra, NVIDIA HairWorks on Low
Tomb Raider: 75-100 @ 2715x1527, ultra with TressFX
GTA 5: 60 @ 1080p (mostly maxed out, except for a few of the advanced settings; I haven't played in a while, so I don't remember exactly)
Metro Last Light: 40-45 @ 1080p, maxed out
Dragon Age Inquisition: 40-50 @ 2715x1527, maxed out | 70-90 @ 1080p, maxed out
Dying Light: 70-100 @ 2715x1527, maxed out
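For anyone wondering how much resolution alone actually costs in VRAM, here's a rough back-of-envelope sketch (my own numbers, assuming a standard 32-bit color buffer; real usage is dominated by textures, shadow maps, and render targets, which is why ultra texture packs eat VRAM far faster than a bump in resolution does):

```python
# Rough estimate: VRAM for a single 32-bit (4 bytes/pixel) color buffer
# at a few common resolutions. This is only the framebuffer cost --
# textures and other render targets are the real VRAM consumers.
def framebuffer_mib(width, height, bytes_per_pixel=4):
    """Return the size of one color buffer in MiB."""
    return width * height * bytes_per_pixel / 2**20

for name, (w, h) in {
    "1080p": (1920, 1080),
    "5760x1080": (5760, 1080),
    "4K": (3840, 2160),
}.items():
    print(f"{name}: {framebuffer_mib(w, h):.1f} MiB per buffer")
```

Even triple-buffered 4K is only on the order of ~100 MiB of framebuffer, which fits my experience that 2GB cards don't fall over just because the resolution goes up.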