Well, we can't really tell you without a little more info. Are you gaming? Are you using distributed computing like SETI, FAH, or the Gutenberg Project? Do you need GPU-based physics processing? A yes to the first doesn't matter much; if the answer to either of the other two is yes, then you need a 560.
And sorry, but he was bashing AMD/ATI for issues they blatantly did not cause.
No, sorry, but I had problems with the x1650 pro from the day I installed it. It would run games fine, but whenever a video started or ended the monitor would flicker green. I ignored it since I don't watch many videos anyway. A few months later the card died, and after I replaced it with my old card (another Radeon), the old card died too, along with the monitor. There's a difference between bashing a product I've never even used, console-war style, and talking about actual experiences I've had with it. Posters here even admit Nvidia has better quality video cards anyway; AMD's are cheaper, but you get what you pay for.

And a video card CAN kill a monitor. The quickest way would be to force a refresh rate higher than the monitor is rated for, though most monitors have safety features for that and will just go black and say "out of range" or "out of frequency." So maybe something was glitchy with the refresh rate not staying constant and that killed my monitor, who knows. I did have problems with one game not giving me the correct refresh rate even when I tried to force it. All of my uncle's Radeon cards have burned out as well, except for one he gave to his brother, and even that eventually died.
Radeon x1650 pro
If you set a refresh rate higher than the monitor can handle, then that is your own fault, not the video card's. It's also the monitor's fault for being almost worthless in the first place if that's all it took to fry it. I've run higher resolutions and refresh rates than my old 1993 ViewSonic could handle and it never hurt it at all; it would just be impossible to see anything.
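A side note on the refresh-rate point: on Windows a program can ask the driver which modes it thinks the monitor supports and test a mode before actually switching to it, instead of forcing a rate blind. Here is a minimal sketch using the standard Win32 display calls (purely an illustration of the idea, not something anyone in this thread posted; the 1024x768 @ 85 Hz mode is just an example value):

    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        DEVMODE dm;
        ZeroMemory(&dm, sizeof(dm));
        dm.dmSize = sizeof(dm);

        /* List every mode the driver reports for the primary display. */
        for (DWORD i = 0; EnumDisplaySettings(NULL, i, &dm); i++)
            printf("%lux%lu @ %lu Hz\n",
                   dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);

        /* Example: test an 85 Hz mode without actually switching to it.
           CDS_TEST asks the driver whether the mode would be accepted. */
        ZeroMemory(&dm, sizeof(dm));
        dm.dmSize = sizeof(dm);
        dm.dmPelsWidth = 1024;
        dm.dmPelsHeight = 768;
        dm.dmDisplayFrequency = 85;
        dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

        if (ChangeDisplaySettings(&dm, CDS_TEST) == DISP_CHANGE_SUCCESSFUL)
            printf("1024x768 @ 85 Hz would be accepted.\n");
        else
            printf("1024x768 @ 85 Hz rejected by the driver.\n");

        return 0;
    }

Whether the monitor itself copes with a mode the driver accepts is a separate question, which is exactly where the "out of range" safety message comes in.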
I too had an X1650 pro 512MB AGP card. In fact I still have it. It never ran hot and never caused any problems. The only flaw the card had was that Ubuntu didn't have drivers supporting it. Other than that it outperformed my 6800 Ultra by quite a lot. The X1650 pro 512MB is a great card for AGP.
I don't think anyone here has said nVidia makes higher quality cards, because that is not true. Neither nVidia nor AMD/ATi makes video cards at all. They design the GPU and the PCB layout. It's up to the third-party board makers to either follow their reference design or make their own. Either way, neither nVidia nor AMD makes the card.
It sounds to me like all your experiences have been user error.
I have one. My friend had it since it came out, had it overclocked a lot, and ran it at 50C idle and up to 85C gaming for years. Then I bought it and put it in my bro's computer with a Dell OEM 250W PSU, and it has been working great for 3 months. It's a Sapphire, so you probably got a cheapo card or it was operator error. And that card is ATI, NOT AMD.
And get a 6850 or a GTX 560, your choice.
I don't know where I said that I forced the wrong refresh rate. They sell ATI/AMD and Nvidia branded cards that are strictly the reference design, and they handle all the RMAs and everything for them, which is usually what I buy, not the 3rd-party ones that try to overclock the cards and whatnot. Contracting out the manufacturing is no different; they're still responsible for the product, so trying to dismiss my experience with some little technical game doesn't change the fact that I've had a terrible experience with that brand of cards. It doesn't matter whether it's an AMD factory or some other company producing it for them; they designed it. I don't know how it could be user error when nothing blew up while I was running Nvidia. It's not that hard to pop a card out, put another one in, and plug in the power connectors.
85C is ridiculous; mine only reached 79C and STILL burned out.
It is definitely user error. It would not physically be able to burn out at 79C. Graphics cards are good north of 100C. Mine hits 95C with the fan turned down, on stock settings, and it still runs fine.
And just for the record, before this gets locked for going severely off topic: AMD is just as good quality-wise as Nvidia. Nvidia can do more things than AMD, but that doesn't make it better quality.
That's stupidly hot. My Radeon 9600xt never went past 39C under load. I would like you to prove to me it's user error, as I never overclocked, underclocked, or used unofficial drivers for the Radeon cards.
User error can come from a hell of a lot more than just drivers and overclocking, even at stock. Current drivers for the ATI HD 4870 and the Nvidia GTX 480 alike set the fan speed at 100% load to somewhere between 5 and 15%. At that level, with no user adjustment, either by force-editing the driver or by using a utility such as Afterburner to raise the fan speed, it will kill the card over time.
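For what it's worth, on the Nvidia side you don't have to guess what the automatic fan profile is doing: temperature and fan duty cycle can be read through NVML, the library behind nvidia-smi. A rough sketch, assuming a modern NVIDIA card with the NVML headers and library installed (link with -lnvidia-ml); it obviously doesn't apply to the ATI cards being argued about here:

    #include <stdio.h>
    #include <nvml.h>   /* NVIDIA Management Library */

    int main(void)
    {
        nvmlReturn_t rc = nvmlInit();
        if (rc != NVML_SUCCESS) {
            fprintf(stderr, "nvmlInit failed: %s\n", nvmlErrorString(rc));
            return 1;
        }

        nvmlDevice_t dev;
        unsigned int temp = 0, fan = 0;

        /* First GPU in the system. */
        if (nvmlDeviceGetHandleByIndex(0, &dev) == NVML_SUCCESS) {
            /* Core temperature in degrees C. */
            if (nvmlDeviceGetTemperature(dev, NVML_TEMPERATURE_GPU, &temp) == NVML_SUCCESS)
                printf("GPU temperature: %u C\n", temp);

            /* Fan duty cycle as a percentage of the card's maximum. */
            if (nvmlDeviceGetFanSpeed(dev, &fan) == NVML_SUCCESS)
                printf("Fan speed: %u%%\n", fan);
        }

        nvmlShutdown();
        return 0;
    }

On the ATI side of that era, the rough equivalent was the Overdrive panel in Catalyst or a third-party tool like Afterburner, as mentioned above.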
Since it's already off the rails... I want bets on how quickly this gets locked... seems more worth my time.
But the fact is, neither nVidia nor ATi/AMD actually makes its own consumer cards. As it says, they design a reference card that they give to their 3rd-party board makers, who then do whatever they want with it. The reason I stay with EVGA is that they have the only reference-model cards I've seen.