Which GPU to get next?

Well, we can't really tell you without a little more info. Are you gaming? Are you running distributed computing like SETI, FAH, or the Gutenberg Project? Do you need graphics-based physics processing? A yes to the first question isn't that important; if the answer to either of the other two is yes, then you need a 560.

And sorry, but he was bashing AMD/ATI for issues that they blatantly did not cause.
 
No, sorry, but I had problems with the X1650 Pro from the moment I installed it. It would run games fine, but when a video would begin or end the monitor would flicker green. I ignored it since I don't watch many videos anyway. A few months later the card died, and after I replaced it with my old card (another Radeon), the old card died along with the monitor.

There's a difference between bashing a product I've never even used, like in a console war, and talking about actual experiences I've had with a product. Posters here even admit Nvidia has better quality video cards anyway. AMD's are cheaper, but you get what you pay for.

And a video card CAN kill a monitor. The quickest way to find out would have been to force a refresh rate higher than the monitor is rated for, though most monitors have safety features for that and will just go black and say "out of range" or "out of frequency". So maybe something was glitchy with the refresh rate not being constant and that killed my monitor, who knows. I did have problems with one game not giving me the correct refresh rate even when trying to force it. All of my uncle's Radeon cards have burned out as well, except for one which he gave to his brother, but that one eventually died anyway.
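(Side note: if you want to check this instead of guessing, you can ask the monitor what refresh rates it actually advertises before forcing one. The snippet below is just a rough sketch of that idea, not what anyone in this thread actually ran; it assumes a Linux box with the stock xrandr tool available, and the helper names supported_rates / safe_to_force are made up for illustration.)

Code:
import re
import subprocess

def supported_rates(resolution="1920x1080"):
    """Parse xrandr output and return the refresh rates listed for one resolution."""
    out = subprocess.run(["xrandr"], capture_output=True, text=True, check=True).stdout
    rates = []
    for line in out.splitlines():
        line = line.strip()
        if line.startswith(resolution + " "):
            # Mode lines look like: "1920x1080   60.00*+  59.94   50.00"
            for tok in line.split()[1:]:
                if re.fullmatch(r"\d+(\.\d+)?[*+]*", tok):
                    rates.append(float(tok.rstrip("*+")))
    return rates

def safe_to_force(rate, resolution="1920x1080"):
    """Only call a rate safe if the monitor itself advertises it for that mode."""
    return any(abs(rate - r) < 0.5 for r in supported_rates(resolution))

if __name__ == "__main__":
    print(supported_rates())      # e.g. [60.0, 59.94, 50.0]
    print(safe_to_force(75.0))    # False on a panel that only lists 60 Hz modes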
 
If you set a refresh rate higher than the monitor can handle, then that is your own fault, not the video card's. It's also the monitor's fault for being almost worthless in the first place if that's all it took to fry it. I've run higher resolutions and refresh rates than my old 1993 ViewSonic could handle and it never hurt it at all. It would just be impossible to see anything.

I too had an X1650 Pro 512MB AGP card. In fact I still have it. It never ran hot, and it never caused any problems. The only flaw the card had was that Ubuntu didn't have drivers supporting it. Other than that it outperformed my 6800 Ultra by quite a lot. The X1650 Pro 512MB is a great card for AGP.

I don't think anyone here has said nVidia makes higher quality cards, because that is not true. Neither nVidia nor AMD/ATi makes video cards at all. They design the GPU and the PCB layout. It's up to the third-party board makers to either follow their reference design or make their own. Either way, neither nVidia nor AMD makes the card.

It sounds to me like all your experiences have come down to user error.
 
Radeon X1650 Pro

I have one. My friend had it since it came out, had it overclocked a lot, and had it running 50C idle and up to 85C gaming for years. Then I bought it and put it in my bro's computer with a Dell OEM 250W PSU, and it has been working great for 3 months. It's a Sapphire, so you probably got a cheapo card or it was operator error. And that card is ATI, NOT AMD.


And get a 6850 or GTX 560, your choice.
 
I've been using ATI/AMD and Nvidia cards since the early 90s. Well, mid 90s with Nvidia. As far as burning out goes, one company is no better than the other. Each company has had its duds and failures. ATI has actually been around a bit longer than Nvidia; Nvidia really didn't get popular till after they bought out 3dfx. But one is no more likely to burn out than the other.
 
I don't know where I said that I forced the wrong refresh rate. They sell ATI/AMD and Nvidia branded cards that are strictly the reference design, and they handle all the RMAs and everything for them; that is usually what I buy, not the third-party ones that try to overclock the cards and whatnot. Contracting out the manufacturing is no different: they're still responsible for the product, so trying to defeat my experience with some little technical game doesn't change the fact that I've had a terrible experience with that brand of cards. It doesn't matter if it's an AMD factory or some other company producing it for them; they designed it. I don't know how it could be user error when nothing blew up while I was running with Nvidia. It's not that hard to pop a card out, put another one in, and plug in the pins for power.

85C is ridiculous; mine only reached 79C and STILL burned out.
 
85C shouldn't hurt it. Graphics cards can handle 100C+ before they fry.

And if you could, kindly point me in the direction of these ATi/AMD/Nvidia-made graphics cards, because they do not make them. They make the GPUs; that's it. Third-party companies make the cards.
 
It is definitely user error. It would not physically be able to burn out at 79C. Graphics cards are good north of 100C. Mine hits 95C with the fan left at stock speed and it still runs fine.
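If you want to see where a card really sits instead of arguing numbers, it's easy to log the temperature yourself. Here is a minimal sketch using the real nvidia-smi query tool, so it only covers NVIDIA cards; the 90C warning threshold is an arbitrary pick for the example, not a vendor spec.

Code:
import subprocess
import time

WARN_AT_C = 90  # arbitrary threshold picked for this example, not an official limit

def gpu_temp_c():
    """Read the current GPU core temperature in Celsius via nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=temperature.gpu", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    ).stdout
    return int(out.strip().splitlines()[0])

if __name__ == "__main__":
    # Log the temperature once a minute while gaming and see where it really sits.
    while True:
        temp = gpu_temp_c()
        flag = "  <-- getting hot" if temp >= WARN_AT_C else ""
        print("GPU temp: %dC%s" % (temp, flag))
        time.sleep(60)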

And just for the record, before this gets locked for going severely off topic: AMD is just as good quality as Nvidia. Nvidia can do more things than AMD, but that does not make quality.
 
That's stupidly hot. My Radeon 9600 XT never went past 39C under load. I would like you to prove to me it's user error, as I never overclocked, underclocked, or used unofficial drivers with the Radeon cards.
 
You're talking about a graphics card that is about 10 years old....

You realize they ran a lot cooler, right? The GeForce 2 in 2000 didn't even have a heatsink. The AMD 386 processor didn't have a heatsink. As we add more transistors, the heat increases.

We can't prove to you it's user error, because the user has a malfunction that won't allow us to input information.
 
Seriously... stop this. It's ridiculous to argue because neither side will back down. Sure, you don't like ATI/AMD cards, and that's fine and dandy. If they want to use that brand of card, go for it; it's less expensive and is equivalent to the Nvidia card. I honestly prefer Nvidia for OpenGL support, but that's my $0.02. Anyway, if they're playing games that don't require OpenGL, I'd grab the ATI/AMD; if they want some OpenGL performance, grab the Nvidia variant.
 
User error can come from a hell of a lot more than just drivers and overclocking, even at stock. Current drivers for the ATI HD 4870 and Nvidia GTX 480 alike set the fan speed at 100% load to somewhere between 5 and 15%. At that level, with no user adjustment, either from force-editing the driver or using a utility such as Afterburner to adjust the fan speed, it will kill the card over time.
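You can also check whether the driver really is leaving the fan that low under load before editing anything. A rough sketch below, again via the real nvidia-smi tool (NVIDIA only; ATI/AMD cards need a different utility); the 30% fan / 80% load cutoffs are illustrative guesses, not values from this thread.

Code:
import subprocess

def fan_and_load():
    """Return (fan speed %, GPU utilization %) as reported by nvidia-smi."""
    out = subprocess.run(
        ["nvidia-smi",
         "--query-gpu=fan.speed,utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    fan, load = out.strip().splitlines()[0].split(", ")
    return int(fan), int(load)

if __name__ == "__main__":
    fan, load = fan_and_load()
    print("fan %d%%  load %d%%" % (fan, load))
    if load > 80 and fan < 30:
        # Cutoffs are illustrative; pick your own or set a custom curve in Afterburner.
        print("Fan looks low for this much load; consider a custom fan curve.")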

Without a complete list of everything the card was used for, the specific mode of failure, the OS version, and the driver version, we will never be able to prove where the user error occurred.

I agree with the person above that neither side is going to give in, but I am also saying that with you it is impossible to find the error, as the error is between the monitor and the chair. Calm down and listen. AMD is no worse than Nvidia. You have no proof other than some bad luck that they are. They are great cards for gaming. Beyond gaming, Nvidia is the only choice. But whatever; the thread is going to get locked soon most likely.
 
But the fact is, neither nVidia nor ATi/AMD actually makes their own cards for consumers. As was said above, they design a reference card that they give to their third-party board makers, who then do whatever they want with it. The reason I stay with EVGA is that they have the only reference-model cards I've seen.

ATI did at one time have their own name-branded video cards. I'm not saying ATI made the card itself, but it was an ATI-branded card. But that's just like most card manufacturers: most don't make the cards. Foxconn makes probably 75% of all video cards.
 
ATI and Nvidia are fabless; they build nothing. They don't even make the GPU itself, much less anything else. I guess they could get the parts from other suppliers and put it all together, but that would be a profit-killing process.
 
There are TPVs (third-party vendors) for both of those. PNY makes some cards for Nvidia, and I think PowerColor makes some of the ATI ones, but more than that I don't know.
 