GTX 480 receives OCC-Gold award for the best Single GPU

Status
Not open for further replies.

bomberboysk

Active Member
As measured by AnandTech:
[Chart: 22207.png, total system power consumption under FurMark load]
 

CrayonMuncher

Active Member
Not sure if it is worth the gold. I read a review which showed the temperature at 98 degrees, which is crazy, but they said that future releases will improve on this.
I agree with the statement:

took too long, and not good enough.

I know it has massive parallel processing potential, but it would have been nice to see a couple of games or applications at launch that could have utilised this potential.
 

tlarkin

VIP Member
I just hate ATI's drivers, though; they are so annoying, and I hate the Catalyst Control Center. I had so many issues with it over the years that I switched back to Nvidia.

I guess since ATI started publishing better open source drivers, I may give them another chance.
 

joh06937

New Member
I just hate ATI's drivers, though; they are so annoying, and I hate the Catalyst Control Center. I had so many issues with it over the years that I switched back to Nvidia.

I guess since ATI started publishing better open source drivers, I may give them another chance.

The latest 10.3 drivers (at least the beta ones) are pretty good. Some games get a nice boost in FPS. I haven't had any problems with stability or anything like that. Bezel management is cool. I never use CCC unless going from two desktops to Eyefinity or vice versa.
Edit: I see the official 10.3 drivers are released now.
 

tlarkin

VIP Member
The latest 10.3 drivers (at least the beta ones) are pretty good. Some games get a nice boost in FPS. I haven't had any problems with stability or anything like that. Bezel management is cool. I never use CCC unless going from two desktops to Eyefinity or vice versa.
Edit: I see the official 10.3 drivers are released now.

I do multiple displays and TV out, so it's probably a different story.
 

Ryeong

New Member
As measured by AnandTech:
[Chart: 22207.png, total system power consumption under FurMark load]

Can't be right. I have two 275s, and it shows the GTX 260 using over 300 W alone and the GTX 275 around 400 W! How can that be possible when my two 275s use around 550-600 W together under full load?

Edit: I just noticed that it said "Total system consumption". So now I'm not sure... I don't know my total system consumption. Is there any way to measure it without opening my case?
 

bomberboysk

Active Member
Can't be right. I have two 275s, and it shows the GTX 260 using over 300 W alone and the GTX 275 around 400 W! How can that be possible when my two 275s use around 550-600 W together under full load?

Edit: I just noticed that it said "Total system consumption". So now I'm not sure... I don't know my total system consumption. Is there any way to measure it without opening my case?

You gotta use a Kill A Watt meter.

By the way, noob, that is power consumption under FurMark, which is basically as high a load as you can get on a GPU.
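
Since the chart shows whole-system wall power, one way to ballpark what a single card adds is to difference an idle reading and a loaded reading from the meter. A minimal sketch of that arithmetic in Python, with purely illustrative numbers (the idle/load figures and the PSU efficiency below are assumptions, not values from the chart):

```python
# Estimate a GPU's extra draw from two wall-power readings (illustrative numbers only).
idle_wall_watts = 180.0   # whole system idle, read off the meter (assumed)
load_wall_watts = 420.0   # whole system under FurMark, read off the meter (assumed)
psu_efficiency = 0.85     # assumed PSU efficiency; AC watts at the wall exceed DC watts delivered

# The delta at the wall, scaled by efficiency, approximates the extra DC power the
# loaded card is pulling (plus whatever extra the CPU burns feeding it).
gpu_delta_dc = (load_wall_watts - idle_wall_watts) * psu_efficiency
print(f"Estimated extra DC draw under FurMark: {gpu_delta_dc:.0f} W")
```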
 

memory

Member
You can get one on eBay for 20 to 30 bucks. I have one myself, and it comes in pretty handy.

I checked my system with it, and it uses 300 watts while running Prime95 and FurMark. While gaming, it uses 240 watts.
 

Ryeong

New Member
You can get one on eBay for 20 to 30 bucks. I have one myself, and it comes in pretty handy.

I checked my system with it, and it uses 300 watts while running Prime95 and FurMark. While gaming, it uses 240 watts.

Thanks, I should get one of those. It's not easy to calculate PSU usage from random assumptions on other websites.
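
Without a meter, a very rough upper bound can also be pieced together by summing rated component power draws and adding some margin. A sketch with made-up ratings (every figure below is a placeholder, not a measurement of this system):

```python
# Rough PSU-load estimate from rated component power draws (all values hypothetical).
components_watts = {
    "CPU": 95,                # rated TDP (assumed)
    "GTX 275 #1": 219,        # rated board power (assumed)
    "GTX 275 #2": 219,
    "Motherboard and RAM": 50,
    "Drives and fans": 40,
}

total = sum(components_watts.values())
suggested_psu = total * 1.2  # ~20% headroom so the PSU isn't run at its limit
print(f"Worst-case load estimate: {total} W; suggested PSU capacity: {suggested_psu:.0f} W")
```

Real draw while gaming is usually well below the summed ratings, which is why measuring at the wall gives a much better number.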
 

StrangleHold

Moderator
Staff member
All those facts have been known for some time: the shader count and the crap yields on the wafers. They could either release what they had, or nothing for many more months.
 

mx344

New Member
SemiAccurate is the worst website on the internet in terms of legit information.

Sorry, I couldn't read it. Here's what your link consists of:

[Image: milk.gif]


Edit: They probably deleted it... no wonder. This is "semi" accurate, after all...

I can read it too. The date in your link is the 28th, and the one in Strangle's is the 29th; not sure if that has anything to do with it, but it's readable by me, lol...
 