Asus 8800 GT 1 GB

It's funny hearing people say it's not worth it for the extra memory. These are the people with 512MB cards that have no idea what the extra memory can actually do.

I have a 768MB card, and I can use the maximum widescreen resolution in Quake 4 with the extreme settings (the ones that warn you about the amount of memory required), crank all the details, max the AA, and it still flies. Try that on a 512MB card. The extra memory goes a long way for things like that. Don't listen to them, because they have NO idea. NO practical experience whatsoever. The extra memory does make a difference, and yes, it's worth it.
 
512MB is enough for today's games, even at high resolutions. In fact, it's the new standard; 256MB is becoming defunct. 1GB of VRAM, and the 768MB on the GTX for that matter, is overkill, even in Crysis at 1920x1080. So say all the benchmarks, and so say I.

Overkill can be good though: it allows some breathing room for future apps, effectively eliminates the chance of the memory becoming a bottleneck, etc. It just depends on whether you're willing to pay the premium for extra headroom that will, for the most part, only come in handy at a later date.
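To put some rough numbers on that "breathing room" argument, here's a back-of-envelope sketch of how much VRAM a game might chew through at 1920x1080 with 4x AA. Every figure here is my own assumption (buffer formats, texture working set), not a measurement from any real card:

```python
# Rough VRAM estimate for a game at 1920x1080 with 4x MSAA.
# All figures are illustrative assumptions, not measured values.

def framebuffer_bytes(width, height, msaa_samples, bytes_per_pixel=4):
    """Color plus depth/stencil buffers, scaled by the MSAA sample count."""
    color = width * height * bytes_per_pixel * msaa_samples
    depth = width * height * 4 * msaa_samples  # 24-bit depth + 8-bit stencil
    return color + depth

width, height, msaa = 1920, 1080, 4

# Double-buffered MSAA render targets plus one resolved back buffer.
buffers = 2 * framebuffer_bytes(width, height, msaa) + width * height * 4

textures_mb = 300  # assumed working set of high-resolution game textures

print(f"Framebuffers: {buffers / 2**20:.0f} MB")                   # ~134 MB
print(f"Estimated total: {buffers / 2**20 + textures_mb:.0f} MB")  # ~434 MB
```

On those (made-up) numbers a 512MB card is already past the two-thirds mark, so it's easy to see how bigger textures or higher AA could tip it over, and also why 1GB buys headroom rather than frame rate today.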
 
It's funny hearing people say it's not worth it for the extra memory. These are the people with 512MB cards that have no idea what the extra memory can actually do.

I have a 768MB card, and I can use the maximum widescreen resolution in Quake 4 with the extreme settings (the ones that warn you about the amount of memory required), crank all the details, max the AA, and it still flies. Try that on a 512MB card. The extra memory goes a long way for things like that. Don't listen to them, because they have NO idea. NO practical experience whatsoever. The extra memory does make a difference, and yes, it's worth it.

Yes, it makes a difference if you are using a workstation card. HOWEVER, if you actually look at the facts, you will see that in the case of the 8800GT, having 1GB over 512MB makes very little difference.

Now please tell me you aren't the type of person who thinks it's worth it to get a 512MB version of the 8400 or 7300. Those cards, as well as others like them, simply don't have the horsepower to run games at high enough settings with AA/AF and at a high resolution, so the extra memory simply never gets used.
 
What facts? The experience you don't have? Or the fact that you have this quad with two cores sitting idle 3/4 of the time and a wimpy VGA, because you spent all your cake on a CPU you hardly use just to make a bigger brag in your sig? We'll pretend not to notice that your VGA is bottlenecking your CPU and RAM, which makes it absolutely pointless to waste all that money, liquid cooling included. It would have made more sense to spend it on a better VGA so that you could actually speak from experience, rather than wasting it on stuff that won't shine for another two years, by which time the 45nm chips will be chewing up the Qs and spitting them out for a snack.

It makes enough difference to warrant the $50 in games with uber-large textures, such as Quake 4 and newer titles. That makes a lot more sense than an aging Q6600 that gets spanked by the E6700 in everything but the synthetic benchmarks you seem to favor so much... because we all know how indicative those are of real-world performance... Just like your HD Tach fiasco... not. lol :D

Nice strawman in the last paragraph by the way... And that's as far as I'll go with that.
 
What facts? The experience you don't have? Or the fact that you have this quad with two cores sitting idle 3/4 of the time and a wimpy VGA, because you spent all your cake on a CPU you hardly use just to make a bigger brag in your sig?

It makes enough difference to warrant the $50 in games with uber-large textures, such as Quake 4 and newer titles. That makes a lot more sense than an aging Q6600 that gets spanked by the E6700 in everything but the synthetic benchmarks you seem to favor so much... because we all know how indicative those are of real-world performance... Just like your HD Tach fiasco... not. lol :D

Nice strawman in the last paragraph by the way... And that's as far as I'll go with that.
First, I don't see how you consider the 2900XT a "wimpy" card. I bought it when it was first released early last summer, and even today it's still one of ATI's best cards, since it's very similar performance-wise to the 3870, and even outperforms it in a few games because of its 512-bit memory bus.
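As a side note on why the bus width matters: peak memory bandwidth scales directly with it. Here's a quick sketch of the arithmetic; the clock figures are approximate, quoted from memory, so check a spec sheet before relying on them:

```python
def bandwidth_gb_s(bus_width_bits, effective_clock_mhz):
    """Peak memory bandwidth = bus width in bytes x effective data rate."""
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

# Approximate stock memory clocks, from memory -- verify before quoting.
print(f"2900XT (512-bit @ ~1656 MHz effective): {bandwidth_gb_s(512, 1656):.0f} GB/s")
print(f"3870   (256-bit @ ~2250 MHz effective): {bandwidth_gb_s(256, 2250):.0f} GB/s")
```

Roughly 106 GB/s versus 72 GB/s on those assumed clocks, which is how the older card can still win in bandwidth-hungry games despite the 3870's newer core.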

I bought the Q6600 to upgrade from my E6400. The reason I chose the Q6600 is that it's a great overclocker; as you can see from my sig, I have it running stable at 3.85GHz (3.21GHz was the max on my E6400), which will outperform ANY processor on the desktop market at stock. Sure, I am not always using all the cores, but I do a fair amount of multitasking. Often I will be compressing large files with WinRAR (which uses all 4 cores) while browsing the internet, playing media, and possibly even using Photoshop at the same time.

And no, it does not warrant the cost difference! The 512MB version performs nearly identically, and even better in some cases! You seriously need to do some searching before bashing people here.

[Benchmark charts: 3DMark, Unreal Tournament, Crysis, and Lost Planet]


Remember, more memory doesn't always mean better performance. In this case, the 512MB version pulled ahead partly because its memory chips have lower latencies than their higher-capacity siblings.
 
Sorry, was that the 8800GT 1024MB overclocked that sits at the top of EVERY one of the 1600x1200 benchmarks in your examples? Even the useless synthetic ones? Yes, in fact it is.
 
Sorry, was that the 8800GT 1024MB overclocked that sits at the top of EVERY one of the 1600x1200 benchmarks in your examples? Even the useless synthetic ones? Yes, in fact it is.
Wow, you really don't know how to read these benchmarks.

At stock speeds, the 8800GT 512MB outperforms its 1GB counterpart in almost every test, even at the high 1600x1200 resolution. The only time the 1GB variant outperforms it is in the Lost Planet cave test. And the reason the 1GB overclocked model does better is that it runs at higher core/memory speeds. Even so, it only outperforms the 512MB variant by anywhere from 0.5 to 3 FPS!!
 
-0MEGA- said:
Wow, you really don't know how to read these benchmarks.

At stock speeds, the 8800GT 512MB outperforms its 1GB counterpart in almost every test, even at the high 1600x1200 resolution. The only time the 1GB variant outperforms it is in the Lost Planet cave test. And the reason the 1GB overclocked model does better is that it runs at higher core/memory speeds. Even so, it only outperforms the 512MB variant by anywhere from 0.5 to 3 FPS!!

I agree, the 8800GT 512MB is a better buy, but I've already argued with SirKenin enough in a previous thread about the 8800GTX VS 8800GT.
 
-0MEGA- said:
Wow, you really don't know how to read these benchmarks.

At stock speeds, the 8800GT 512MB outperforms its 1GB counterpart in almost every test, even at the high 1600x1200 resolution. The only time the 1GB variant outperforms it is in the Lost Planet cave test. And the reason the 1GB overclocked model does better is that it runs at higher core/memory speeds. Even so, it only outperforms the 512MB variant by anywhere from 0.5 to 3 FPS!!

Common sense tells me that more memory = better performance at high resolutions...

I agree, the 8800GT 512MB is a better buy, but I've already argued with SirKenin enough in a previous thread about the 8800GTX VS 8800GT.

I can't say I blame him... Nvidia did us GTX owners dirty with the GT release... :D

I would agree it's the better buy; why spend $250 more for +3FPS?
 
Last edited:
Umm. Dude... The dark green at the top is the 1GB overclocked version, the dark blue the 512MB overclocked version... and at 1600x1200 the 1GB version sits on top in every single one of your stupid benchmarks, which was my original point. At the highest resolutions and eye candy, the card with more memory comes out ahead with the large textures (for incredibly obvious reasons, unless you're trying to justify a 2900XT bottleneck).
 
Common sense tells me that more memory = better performance at high resolutions...



I can't say I blame him... Nvidia did us GTX owners dirty with the GT release... :D

I would agree it's the better buy; why spend $250 more for +3FPS?

Look here:

http://www.hardwarecanucks.com/foru...ts-512mb-g92-alpha-dog-edition-review-10.html

DX10 benchmarks, where the G92 GTS gets its butt kicked. Reality is that the GTX is still the king of the castle (save for the Ultra). And why spend the money? Because I can... that's all... and because my rig will smoke that pathetic quad with a 2900XT in real-world performance. Which is what matters, unless you spank to 3DMark and HD Tach all day. lol :D
 
Umm. Dude... The dark green at the top is the 1GB overclocked version, the dark blue the 512MB overclocked version... and at 1600x1200 the 1GB version sits on top in every single one of your stupid benchmarks, which was my original point. At the highest resolutions and eye candy, the card with more memory comes out ahead with the large textures (for incredibly obvious reasons, unless you're trying to justify a 2900XT bottleneck).

Agreed....

2900XT isn't that bad of a card...

http://www.gpureview.com/show_cards.php?card1=518&card2=474#
 
Umm. Dude... The dark green at the top is the 1GB overclocked version, the dark blue the 512MB overclocked version... and at 1600x1200 the 1GB version sits on top in every single one of your stupid benchmarks, which was my original point. At the highest resolutions and eye candy, the card with more memory comes out ahead with the large textures (for incredibly obvious reasons, unless you're trying to justify a 2900XT bottleneck).
I know that; I posted the benchmark images here.

My point is that you have gone from trying to persuade us that the 1GB model is worth $50 more to trying to prove that it's better in the higher-resolution tests. Let's not forget, however, that the 512MB version performed better than the 1GB version in every test done at the more common 1280x1024 resolution, and when comparing the cards at their stock speeds, the performance difference at 1600x1200 is anywhere from only 0.5 to 3 FPS.

Now you tell me: is it worth $50 more to gain an extra 0.5-3 FPS in high-resolution games and lose performance in lower-resolution games?

I am not trying to convince you that the 512MB version performs better (since overall the 1GB version is better); however, as I've said before, when you weigh the pros and cons of each card, you will realize that having 1GB of memory on a card such as the 8800GT just isn't worth the extra cost.
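To make that concrete, the cost-per-frame arithmetic is simple (the $50 premium and the 0.5-3 FPS deltas are the figures already quoted in this thread):

```python
# Cost per extra frame for the 1GB premium, using the numbers from this thread.
premium_usd = 50.0                      # price gap between the 1GB and 512MB cards
fps_gain_low, fps_gain_high = 0.5, 3.0  # reported deltas at 1600x1200

print(f"Best case:  ${premium_usd / fps_gain_high:.2f} per extra FPS")  # $16.67
print(f"Worst case: ${premium_usd / fps_gain_low:.2f} per extra FPS")   # $100.00
```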
 
Look here:

http://www.hardwarecanucks.com/foru...ts-512mb-g92-alpha-dog-edition-review-10.html

DX10 benchmarks, where the G92 GTS gets its butt kicked. Reality is that the GTX is still the king of the castle (save for the Ultra). And why spend the money? Because I can... that's all... and because my rig will smoke that pathetic quad with a 2900XT in real-world performance. Which is what matters, unless you spank to 3DMark and HD Tach all day. lol :D

Just on a side note, what do you think of the HD3870?
 