How does ATI numbering work?

I understand Nvidia: first number = generation, number after that = how good (higher = better), and letters after, e.g. GT, = better than the average model. So an 8800GT is better than an 8800, which is better than an 8600, but none of them can be compared with a 9800 as they are different generations.

ATI has always confused me though. I don't even know where to start, so could someone help me out here please?
 
They are generations too... but I'm only going to start at the 1xxx series and work up.

Here you go:

1xxx Series
X1050 Pro < X1550 Pro < X1650 Pro < X1650 XT < X1950 GT < X1950 Pro < X1950 XT


2xxx Series
HD 2400 Pro < HD 2400 XT < HD 2600 Pro < HD 2600 XT < HD 2900 GT < HD 2900 Pro < HD 2900 XT

3xxx Series
HD 3450 < HD 3650 < HD 3850 < HD 3870 < HD 3870 X2

4xxx Series
HD 4550 < HD 4650 < HD 4670 < HD 4830 < HD 4850 < HD 4870 < HD 4850 X2 < HD 4870 X2

For example, the first card in each series is roughly comparable to the first card in the others. But with the later cards, that's where comparability across series gets all messed up.

For example: an X1950 XT is NOT comparable to an HD 4870.
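If it helps to see it laid out, here's a rough Python sketch of those same rankings. The lists and their order are just what's written above, nothing official, and it only makes sense for comparing cards within the same series:

```python
# Rough sketch: each series ranked worst-to-best, straight from the lists above.
# Only useful for comparing cards WITHIN a series, not across them.
RADEON_SERIES = {
    "1xxx": ["X1050 Pro", "X1550 Pro", "X1650 Pro", "X1650 XT",
             "X1950 GT", "X1950 Pro", "X1950 XT"],
    "2xxx": ["HD 2400 Pro", "HD 2400 XT", "HD 2600 Pro", "HD 2600 XT",
             "HD 2900 GT", "HD 2900 Pro", "HD 2900 XT"],
    "3xxx": ["HD 3450", "HD 3650", "HD 3850", "HD 3870", "HD 3870 X2"],
    "4xxx": ["HD 4550", "HD 4650", "HD 4670", "HD 4830",
             "HD 4850", "HD 4870", "HD 4850 X2", "HD 4870 X2"],
}

def faster_within_series(series: str, a: str, b: str) -> str:
    """Return whichever of a or b ranks higher in the given series."""
    ranking = RADEON_SERIES[series]
    return a if ranking.index(a) > ranking.index(b) else b

# e.g. faster_within_series("4xxx", "HD 4670", "HD 4830") -> "HD 4830"
```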
 
I understand Nvidia: first number = generation, number after that = how good (higher = better), and letters after, e.g. GT, = better than the average model. So an 8800GT is better than an 8800, which is better than an 8600, but none of them can be compared with a 9800 as they are different generations.
Well, actually... the 8800GT and 9800GT are the same chip; the card just has a newer BIOS revision and support for 3-way SLI. Performance-wise, they're essentially the same; it's a rebranded card.

EDIT: The naming scheme was changed when jumping from the second to the third generation; in current cards, 90 is the equivalent of the XTX seen in previous generations, 70 is XT, 50 is Pro and 30 is presumably the "no suffix" or SE/LE/GT/GTO card. So while you really can't compare different generations, you could say the 2600 XT is an "HD 2670".
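If you wanted to write that rule down, it's tiny. Here's a sketch in Python, where the 90/70/50/30 numbers are the rough equivalents described above rather than anything ATI published:

```python
# Sketch of the suffix -> number rule described above (an approximation,
# not an official ATI table): XTX ~ 90, XT ~ 70, Pro ~ 50, SE/LE/GT ~ 30.
OLD_SUFFIX_TO_NEW_TIER = {"XTX": 90, "XT": 70, "Pro": 50, "SE": 30, "LE": 30, "GT": 30}

def modernise(model: int, suffix: str) -> str:
    """Turn an old-style name like (2600, 'XT') into the newer numeric style."""
    tier = OLD_SUFFIX_TO_NEW_TIER[suffix]
    return f"HD {model - model % 100 + tier}"

# e.g. modernise(2600, "XT") -> "HD 2670", as in the post above
```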
 
What is HD all about? And X2?

Also, which of these is the best, and which is the worst?

HD I think is mostly for people who have no idea what they are doing, basically a marketing scheme (don't quote me though). It might also just let consumers know that the card is HD compatible. X2 means two GPUs put onto one PCB, or Printed Circuit Board. It allows for nearly double the graphical performance of a single card (e.g. 4870 = 100%, 4870 X2 = roughly 180%). The best card in that list is the HD 4870 X2, and the worst is the X1050.
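As a back-of-the-envelope sketch of that X2 scaling (the ~1.8x factor is just the rough estimate above, not a benchmark number):

```python
# Rough scaling estimate from the post above: an X2 card gives roughly
# 1.8x a single card, not a full 2x (drivers/CrossFire overhead eat some).
X2_SCALING = 1.8  # assumed rough factor, not a measured number

def estimate_x2_fps(single_card_fps: float) -> float:
    """Estimate the X2 version's frame rate from the single-GPU frame rate."""
    return single_card_fps * X2_SCALING

# e.g. if an HD 4870 manages 50 fps, the 4870 X2 lands around 90 fps
```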

EDIT: There was also supposed to be an HD 2900 XTX, which was meant to be the holy grail of video cards, but they never released it. Some people speculate that ATi is going to release an HD 49xx series to compete with the GTX 285 and 295.
 
Well, actually... the 8800GT and 9800GT are the same chip; the card just has a newer BIOS revision and support for 3-way SLI. Performance-wise, they're essentially the same; it's a rebranded card.
True to a point. Most 9800GTs are just a BIOS-flashed 8800GT. But a (true) 9800GT has a 55nm GPU and dual SLI bridges. Most 9800GTs out now still have the 65nm core and only one SLI bridge. I think Asus and (maybe) some others are using the 55nm core with dual SLI bridges.
 
True to a point. Most 9800GTs are just a BIOS-flashed 8800GT. But a (true) 9800GT has a 55nm GPU and dual SLI bridges. Most 9800GTs out now still have the 65nm core and only one SLI bridge. I think Asus and (maybe) some others are using the 55nm core with dual SLI bridges.
I thought the 9800GTX was the 55nm shrink... oh, well.

So... is the 9800GTX the 55nm shrink of the 8800GTX? If so, what's the 9800GTX+... is it just a factory-OCed 9800GTX?
 
As far as I remember, the 8800GTX has the G80 core with a 384-bit memory bus. The 9800GTX has the G92 core with a 256-bit bus, and the 9800GTX+ has the G94/G92b (don't remember) core.
 
I get the basic numberings. I mean, it's pretty easy to tell that one card is at least somewhat better from the numbering scheme.

Whenever I need to choose a graphics card though, I usually get on Tom's Hardware and look at the benchmarks. Am I flawed in doing this? Should I instead start looking at '384-bit vs 256-bit', and the other technical specs?
 
I get the basic numberings. I mean, it's pretty easy to tell that one card is at least somewhat better from the numbering scheme.

Whenever I need to choose a graphics card though, I usually get on Tom's Hardware and look at the benchmarks. Am I flawed in doing this? Should I instead start looking at '384-bit vs 256-bit', and the other technical specs?

If a card with a 256-bit bus outperforms one with a 384-bit bus, or vice versa, then I don't think the underlying technology really matters: whichever performs better is the better card, even if it's the older technology, which is probably unlikely anyway. The only time it could be better to get the not-quite-so-good one is if the better one is a lot less economical and uses crap loads more power.
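To put rough numbers on the 384-bit vs 256-bit question: bus width alone doesn't tell you much, because memory bandwidth is bus width times effective memory clock. A quick Python sketch, with the clock figures being the reference specs as I remember them, so treat them as approximate:

```python
# Memory bandwidth (GB/s) = (bus width in bits / 8) * effective memory clock in GT/s.
# Clock figures below are the reference specs as I remember them -- approximate.
def bandwidth_gbs(bus_width_bits: int, effective_clock_gtps: float) -> float:
    return bus_width_bits / 8 * effective_clock_gtps

print(bandwidth_gbs(384, 1.8))  # 8800GTX: 384-bit @ ~1.8 GT/s -> ~86.4 GB/s
print(bandwidth_gbs(256, 2.2))  # 9800GTX: 256-bit @ ~2.2 GT/s -> ~70.4 GB/s
```

And even bandwidth isn't the whole picture, which is why checking benchmarks like you already do is the sane approach.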
 
It's all crap to confuse people. Seriously, models should be called A-Z and the ranking of a specific model should be 1-10, so the current best model would be M-10 or something. None of this XFXGTXHD9800XT4850X2FX. All these X's make me think there's a buried treasure inside the heatsink.
 