Leaked 480 benchmark.

Status
Not open for further replies.
You should read this article.
http://www.anandtech.com/video/showdoc.aspx?i=3783&p=1

Basically, the point is that no current game will take advantage of the extra tessellation.

Here is a quote:
"But with that said, NVIDIA having a more powerful tessellator doesn’t mean much on its own. Tessellation is wholly dependent on game developers to make use of it and to empower users to adjust the tessellation levels. Currently every DX11 game using tessellation uses a fixed amount of it, so NVIDIA’s extra tessellation abilities are going unused. This doesn’t mean that tessellation will always be used like this, but it means things have to change, and counting on change is a risky thing."

Reyeon, your quote "It just.. it saddens me that people dont see whats on the inside, beyond. Even with 5970's higher fps and with 480's high temps. I'll say Nvidia won this round and deserve more than ever before for making a revolution in gaming and possibilities for future gaming as well."

FPS and high temps are valid concerns. Consumers are winning this round, and it only happened because AMD pushed the envelope so far with the 5000 cards. I will buy a 470 or 480, but I do not see this as a game changer; I just like options, and both look to have strong points.
 

When summer comes, I can promise there will be more games using the extra tessellation. I have no doubt.

I just want to let everybody know that I'm taking a minor break for a while. All this thinking, replying, comparing, etc. is ruining my mind. I'm currently on vacation and I'll take advantage of that. I felt great yesterday; now I'm 100% totally exhausted. Have a nice day.
 

Yeah, I agree about the games staying without it: they would cut off a big chunk of the market, because those with ATi cards wouldn't be able to fully utilise the game (apparently :rolleyes:), and because doing so would, as you say, mean change, and change in any industry is a huge risk.


finger cramp :P
 
:eek:..... damn, I hate to see the electric bill of the guy who has 480s in SLI :D

Thankfully, according to Tom's Hardware, my 4890 only draws 178W idle and 285W at full load.
What??? It should only draw ~50 watts idle. Or is that whole system?

So, in summary, the GF100 is a massive failure for the average consumer: ridiculous power consumption and heat, high pricing, and no real advantage in games over the Radeon 5000 series. We waited six months... for that?
 

TH's figures sound like the whole system; [H]OCP's sound like they're for the cards only.
 
Yes that is total system. 5870's in CF uses 598 watts and the GTX480 in SLI uses 851 watts. That is a pretty big difference.
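A quick sanity check on that gap (a minimal sketch; the wattages are the total-system figures quoted above, measured at the wall, not card-only draw):

```python
# Total-system power draw figures quoted above (whole system at the wall).
cf_5870_watts = 598   # two HD 5870s in CrossFire
sli_480_watts = 851   # two GTX 480s in SLI

delta_watts = sli_480_watts - cf_5870_watts
percent_higher = delta_watts / cf_5870_watts * 100

print(f"GTX 480 SLI draws {delta_watts} W more, about {percent_higher:.0f}% higher")
# -> GTX 480 SLI draws 253 W more, about 42% higher
```

So the SLI rig pulls roughly 42% more from the wall than the CrossFire rig for the same workload.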
 

Yup, but you should know by now Nvidia doesn't know how to make power-efficient designs :P

The GTX 295 and 5970 are both dual-GPU cards, and both draw less power than a single GTX 480. Performance per watt per dollar on this card is horrible.
 
GTX 480 receives OCC-Gold award for the best Single GPU

Price is not listed under cons because this card is in fact underpriced if you consider all the extra features.

Source:

http://www.overclockersclub.com/reviews/nvidia_gtx480/16.htm

Conclusion:

When you get right down to it, the GTX 480 offers up better performance than the HD 5870. That's the expectation the world had for this card. In 44 out of 48 tests run, the GTX 480 delivered a higher level of performance, a pretty stout showing. In the four tests that it did not outright win, two showed performance equal to the HD 5870, and the two it lost were not by a large margin. With those kinds of results, I have to say that NVIDIA delivered a card that did what it was meant to do: deliver a higher level of performance. This was most evident in the newer games and the DirectX 11 game and benchmark results, where the GTX 480 cleaned house.

The scoring in the Unigine 2.0 benchmark shows the strengths of the Fermi architecture, with scores from the GTX 480 finishing almost 100% higher than the results of the comparison HD 5870 when the extreme tessellation preset is chosen. Metro 2033 testing showed that the performance in the Unigine testing was no fluke. The tessellation performance is a result of the all-new PolyMorph tessellation engines that reside in each GPU cluster.

Much of the early talk about Fermi's third-generation Streaming Multiprocessor architecture was geared toward GPU computing, but make no mistake, this is a video card built for gaming, as shown by the results. However, there is so much more that this card can be used for besides gaming; there is an abundance of GPU-accelerated applications to make your life easier, such as Badaboom, vReveal, WinZip, Photoshop and more. For those into the distributed computing scene, there is a client that takes advantage of the massively parallel architecture to really push your contributions higher, to hopefully help find a cure for some really heinous diseases.

NVIDIA's stereoscopic 3D Vision system is not new to the market, but supporting it over three monitors is a whole new way to enjoy this technology. When running with three monitors you have what is called 3D Vision Surround. If you don't want to use NVIDIA's 3D Vision system, you can still enjoy a surround experience with GT 200 and higher based video cards. The downside is that to run the surround setup you need two cards in SLI. If you are going this route you still have the monitor purchase, but you just need two cards to really have the horsepower to drive the 746 million pixels per second in a 3D Surround setup. That does add to the cost, but really, if you are going that way you have some cash to get there. Pricing is expected to be in the $499 range, or about 50 to 80 dollars more than ATI's HD 5870. Steep, but the price point is going to be expected, and it puts NVIDIA at a point where ATI may not cut prices, making this a bad situation for consumers. Time will tell, though.

When it came time to overclock the GTX 480, I was able to get a decent clock speed increase out of the card that showed nice increases in gaming scores across the board. There weren't any utilities out yet, but EVGA will have its Precision overclocking tool available, which gives you the ability to push the clock speeds on the GTX 480. The clock speeds I reached amount to a 15% increase in the Core/Shader speed, from 1401MHz to 1608MHz, and an 11.5% increase on the memory clock speeds, from 1848MHz to 2115MHz. However, to reach this level of performance you need to make sure you have at least a 600 watt power supply with native 6-pin and 8-pin PCI-E power connectors. Max power consumption for the board is rated at 250 watts. I only saw close to that number while overclocked, with a total system consumption of 451 watts. At idle, the system consumes 206 watts. At stock speeds, the power consumption was about 25 watts lower, at 424 watts.

The cooling solution used on the GTX 480 looks pretty stout, but with fan speeds left on auto the card heats up fast. I saw temperatures over 100 degrees Celsius using FurMark with the fan speeds on auto. Bump the fan speed to 100% and you get temperatures in the mid-60C range; however, you do have a noise penalty when doing this. At a fan speed of 70% I found a good solid balance between noise and temperatures: 80 Celsius is where the temperature peaked in my well-ventilated Stacker 810 case. This put me a good 25 Celsius away from the maximum safe temperature. Just make sure your case is well ventilated, or you may see the temperatures of the other components in the system increase and cause you other heat-related concerns. Cooling those three billion transistors and 480 cores is gonna take some work.

ATI has filled its product stack from top to bottom, so NVIDIA has its work cut out for it, filling up its own stack to compete with ATI at all price points. To achieve this, NVIDIA built a scalable architecture that uses GPU clusters, so it can drop clusters (there are four on the GTX 480) to reach a performance and price point. It will be interesting to see how NVIDIA fills out its DirectX 11 portfolio. All things considered, NVIDIA stepped up to the plate (albeit rather late) and delivered gaming performance with visual quality. While the cards do not hit stores until the week of April 12, NVIDIA has assured us of an ample supply of cards available at launch.


Pros:

* Performance
* Overclocking
* DX 11 performance
* 3D Vision and Surround supported
* Direct Compute
* 32x CSAA
* Power consumption
* Cooling solution
* Productivity increase with CUDA apps
* Ray tracing
* PhysX
* Competitive price point


Cons:

* 3D Vision Surround needs two video cards
* Hot running
* Fan noise at full speed

 

And? The 5870 received OCC gold award as well...

Also, don't be double-posting this stuff. If you're gonna post it in here, don't go making a thread on it as well.
 

Yeah. But the HD 5870 lost 44 out of 48 tests versus a "driverless", not-optimized-for-any-games-yet, probably-filled-with-a-ton-of-fps-problems GTX 480.

Drivers can improve fps by 5-45% in a single addressed game.

Here are some driver notes from Nvidia proving my statement:

Patch 197.13 for the GTX 2xx series

o Up to 13% performance increase in Crysis: Warhead with a single GPU
o Up to 30% performance increase in Crysis: Warhead with SLI
o Up to 13% performance increase in H.A.W.X with single GPU
o Up to 15% performance increase in H.A.W.X with SLI technology
o Up to 30% performance increase in Left 4 Dead with single GPU
o Up to 28% performance increase in Left 4 Dead with SLI technology

Now, I've seen patches with higher fps increases than those. Yes, ATi has already patched those games (old games, I know, but...) so you should get my point already.

Take a GTX 480 benched at 5% over the 5870 in a released game and add 13-30% more (check the performance increase in Left 4 Dead with a single GPU).
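For what it's worth, the arithmetic behind that claim can be sketched like this (a minimal sketch; the 5% starting lead and the 13-30% uplift range are assumptions taken from the posts above, not measured results):

```python
def lead_after_uplift(start_lead_pct, uplift_pct):
    """Relative lead over the 5870 if only the 480 gets a driver uplift.

    Both arguments are percentages; the result is the new lead in percent.
    """
    return ((1 + start_lead_pct / 100) * (1 + uplift_pct / 100) - 1) * 100

# GTX 480 starting 5% ahead, then applying the 197.13-style "up to" uplifts
for uplift in (13, 30):
    print(f"+{uplift}% driver uplift -> {lead_after_uplift(5, uplift):.1f}% lead")
```

A 5% lead plus a 30% uplift works out to a 36.5% lead, but only if the "up to" figure applies across the board, which, as others in the thread point out, it rarely does.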
 
The key words in those driver notes are "up to". Real-world performance gains have never been more than a percentage point or two with drivers, excepting SLI optimizations.
 

Wrong. I've noticed in almost every patch a significant fps increase, from 5-25 fps in non-SLI and 25+ fps in SLI. An fps increase with SLI is not a huge surprise, but I'll say 5-15 fps with a decent patch and 20+ with a good one.

If the 480 is 5% over the 5870 in a game, then with a driver patch a 10% lead can suddenly become 15%, and 20% can become 25%.

You should already know this. I'm surprised that you reject it, especially considering that it's true; I wouldn't bother posting this if it wasn't. And I have TWO GTX 275s, so I know what I'm talking about.

Personal experience varies from written numbers. My personal experience is much closer to what's written than to the numbers you're suggesting.
 
Just like this was true, right?
http://www.computerforum.com/172500...ease-performance-50-via-voltage-tweaking.html

I have run SLI setups in the past, I have set up SLI systems, and I have done benchmarks between driver versions. In fact, switching from 186.18 to 196.34, my Vantage, 3DMark06, and 3DMark05 scores all went down.
 