Gigabyte! Damn!

bomberboysk

Active Member
I was talking about the big flat apparatus with the 9 on it...

That's just to cool the NF200 chips as well as the southbridge; it's more or less a large aluminum heatsink with heatpipes running to the northbridge, which you can then use with a waterblock or a large air cooler. If you look at the UD7 you can see it better:

http://www.xtremesystems.org/forums/showthread.php?t=240504

Really glad they put two EPS 8-pin connectors on the board; having only one was one of the UD7's biggest flaws for benching.
 

funkysnair

VIP Member
Enough with the looks, for god's sake. The UD9 is basically a benching board; looks are not the most important thing to benchers...

So why colour it baby blue if the looks don't matter?

They've gone to the effort of colouring it, so why not choose a decent colour?

Basically, some child-minded person decided to colour the mobo like crap. End of. Good board, but it looks like a$$.

I wouldn't stick that in my case; I'd rather go EVGA.
 

CrayonMuncher

Active Member
Why does CPU-Z in the link show the CPU clocked at 5760 MHz while Cinebench and 3DMark show it at 4500 MHz? Is there something I'm missing?

I personally don't care that much about the colour and would very happily use it in my system. It could be pink for all I care; when it performs like that, it doesn't really matter to me.
 

linkin

VIP Member
Anyway, if I have this right: if that board theoretically ran 4-way SLI with four GTX 480s, that would be a peak power draw of 1800 watts from the cards alone. I don't even see a PSU on Newegg above 1500 watts.

You do realise that 450 watts is for the entire system and not just the card, right? Benchmarks are deceiving ;)

They should do power tests with no GPU (or an integrated one) to determine base system power usage, then add the card and see what the idle/load draw rises to; from that you can calculate the draw of the actual card. For example, let's take this graph:

[Image: Power Consumption graph]


Take 450 (load consumption of the entire system with the 480) and subtract 280 (load consumption of the same system with the 5850) and you get a basic idea of the card's draw; in this case it's 170 watts for the card alone (strictly, that's the 480's draw relative to the 5850, so it's a ballpark). Then 450 + (3 x 170) = 960 watts load draw with 4x GTX 480s.

The same calculation for idle usage: 146 - 114 = 32, so 146 + (3 x 32) = 242 watts idle draw for 4x GTX 480s.
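Here's a rough Python sketch of that arithmetic, just to show the method. The numbers are read off the graph above, the names are my own, and it assumes the extra cards scale linearly, which a real 4-way setup won't do exactly:

```python
# Rough GPU power estimate: subtract a baseline system reading from the
# same system measured with the card under test, then extrapolate.

def card_draw(system_with_card, baseline_system):
    """Approximate draw of the card alone, in watts (really the delta
    versus the baseline card's system, as the post notes)."""
    return system_with_card - baseline_system

def multi_gpu_total(system_with_one_card, per_card, extra_cards):
    """System draw with extra identical cards added (assumes linear scaling)."""
    return system_with_one_card + extra_cards * per_card

load_480, load_5850 = 450, 280   # whole-system load readings (W)
idle_480, idle_5850 = 146, 114   # whole-system idle readings (W)

per_card_load = card_draw(load_480, load_5850)   # 450 - 280 = 170 W
per_card_idle = card_draw(idle_480, idle_5850)   # 146 - 114 = 32 W

print(multi_gpu_total(load_480, per_card_load, 3))  # 960 W load, 4x GTX 480
print(multi_gpu_total(idle_480, per_card_idle, 3))  # 242 W idle, 4x GTX 480
```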

If the GTX 4xx series drew over 300 watts they wouldn't be allowed to sell them; I believe it's the PCI-E spec that caps a single card at 300 watts. Not sure. :confused:
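For reference, the limit being half-remembered is the PCI Express power budget for one card; as I understand it (treat these allocations as an assumption, not gospel), it breaks down like this:

```python
# PCI-E power budget for a single card, per the commonly cited
# connector allocations (assumed values, not verified against the spec).
PCIE_CARD_BUDGET_W = {
    "x16 slot": 75,            # power delivered through the slot itself
    "6-pin aux connector": 75, # one 6-pin PCI-E power connector
    "8-pin aux connector": 150 # one 8-pin PCI-E power connector
}
print(sum(PCIE_CARD_BUDGET_W.values()))  # 300 W, the ceiling in question
```

A 6-pin plus 8-pin card like the GTX 480 tops out at that 300 W ceiling, which is presumably why a card spec'd above it would be out of spec.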

Just thought I'd get that off my chest lol. And the calculations aren't exact, because we don't know the draw of the system without a GPU (or with an integrated one, if any).
 

Intel_man

VIP Member
You do realise that 450 watts is for the entire system and not just the card, right? [...]

The speculated GTX 490 is expected to have a 350W TDP.
 

linkin

VIP Member
Maybe it will be PCI-E 3.0 then? I believe 3.0 is meant to raise that power limit or whatever.
 