Best card for $550

Xijt

New Member
Just bought an MSI Radeon 270X from Newegg and it's having fan issues. Called them and I'm getting a full refund, so now my price point for a new card will be around $550. I have been looking at the EVGA GTX 780 for $530, and that's assuming it's better than the card mentioned above. I'm still new to this PC thing, so excuse anything that's kind of a "duh, it's better" type question.

I'm not prejudiced toward any brand, just as long as the reviews are good and it's an upgrade from the previous card.
 

xxmorpheus

Member
The 290X is nice. I hate AMD drivers though; they're atrocious. Go with the 780 Ti if you can afford it, and it's compatible with G-Sync. Nvidia's new tech is going to push the industry forward.
 

Xijt

New Member
The 290X and 780 Ti are both slightly above your budget, so yep, the 780 is the way to go. There is this 290X that is $20 more than the budget you set:

http://www.amazon.com/Sapphire-PCI-...d=1388120239&sr=1-2-catcorr&keywords=AMD+290X

Been looking around and I think I'm going to go with the 290X, or at least see how it does against the 270X. It HAS to be an upgrade, but then again I was amazed with the 270X; it was my first gaming video card.

The 290X is nice. I hate AMD drivers though; they're atrocious. Go with the 780 Ti if you can afford it, and it's compatible with G-Sync. Nvidia's new tech is going to push the industry forward.
Idk what Nvidia G-Sync is; I'm a newbie with this whole PC thing. So far I'm hooked though -_-

In that price range. Either the R9 290 or the GTX 780.

Getting the 290X :good::good:
 

m3incorp

New Member
The 290X is nice. I hate AMD drivers though; they're atrocious. Go with the 780 Ti if you can afford it, and it's compatible with G-Sync. Nvidia's new tech is going to push the industry forward.

AMD has made big strides with its drivers, and there are not nearly as many complaints as there were a couple of years ago.
 

Xijt

New Member
G-Sync basically gets rid of tearing while you're playing games, although you'll need a G-Sync-capable monitor to use it, which I don't believe is available at the moment.

For an excellent demonstration and explanation of G-Sync, watch this: http://www.youtube.com/watch?v=3PJjhBUSuHk

Idk, seems a bit much, and tearing doesn't really bother me. I've seen it a few times playing BF4, but not much with the 270X. For my first card, it seems to have been a great purchase.
AMD has made big strides with its drivers, and there are not nearly as many complaints as there were a couple of years ago.
All the drivers I installed seemed fine. The only problem I had was with FRAPS and recording sound. It worked fine with BF4 but not Black Ops 2. I checked all my settings, and a few people said I needed a sound mixer enabled or something, but my PC doesn't have a sound mixer.
I also have no problems with their drivers.
Thus far I'm in love with the PC world. It's a bit different, but it's a nice change from Xbox.
 

xxmorpheus

Member
G-Sync basically gets rid of tearing while you're playing games, although you'll need a G-Sync-capable monitor to use it, which I don't believe is available at the moment.

For an excellent demonstration and explanation of G-Sync, watch this: http://www.youtube.com/watch?v=3PJjhBUSuHk


It's available; I just bought a monitor with a G-Sync module. It does a lot more than eliminate tearing: it provides a smooth gameplay experience without necessitating a high-end GPU keeping FPS pegged at 60+ on ultra settings. It also drastically reduces input lag.
 

Okedokey

Well-Known Member
It's available; I just bought a monitor with a G-Sync module. It does a lot more than eliminate tearing: it provides a smooth gameplay experience without necessitating a high-end GPU keeping FPS pegged at 60+ on ultra settings. It also drastically reduces input lag.

That's not actually how it works. It simply allows the monitor's refresh rate to be dynamic and follow the output of the GPU, thereby removing tearing and the need to enable V-Sync, which otherwise drops FPS to divisions of the refresh rate.

As far as bang for the buck goes, it's quite hard to beat the R9 290 or R9 290X in that price range.

Especially the 290s that are unlockable to 290X ;)
 

G80FTW

Active Member
That's not actually how it works. It simply allows the monitor's refresh rate to be dynamic and follow the output of the GPU, thereby removing tearing and the need to enable V-Sync, which otherwise drops FPS to divisions of the refresh rate.

How would this work? The GPU controlling the refresh rate of the monitor? Not sure that's possible. But if so, then I should get one so my $200 Vizio can display at 120Hz.

I still experience tearing occasionally in any kind of V-Sync mode. I use adaptive V-Sync on my 680, though.
 

Okedokey

Well-Known Member
How would this work? The GPU controlling the refresh rate of the monitor? Not sure that's possible. But if so, then I should get one so my $200 Vizio can display at 120Hz.

I still experience tearing occasionally in any kind of V-Sync mode. I use adaptive V-Sync on my 680, though.

I didn't say the GPU controls the monitor; I said the monitor follows the GPU. Very different... read, man!

G-Sync works (as the name suggests) exactly as I said: it follows the GPU output dynamically, removing the need for V-Sync and its associated disadvantages. Secondly, this technology is limited by the upper threshold of the monitor's refresh rate, so no, it won't run at 120Hz unless the monitor could display 120Hz already...

I don't know the exact technology (haven't looked into it), but if you had a module (e.g. G-Sync) in the monitor that could detect (via software or hardware) the GPU's output FPS and match the monitor's refresh rate in Hz (which is essentially FPS), that would remove tearing and make V-Sync obsolete... that is what Nvidia has done. I'm actually surprised someone hasn't thought of this before!

I'll put it simply for you.

When the GPU outputs 20 FPS, the monitor refreshes at 20Hz; when the GPU outputs 80 FPS, the monitor refreshes at 80Hz, and so on for everything in between, up to the maximum the monitor can handle. What's the benefit? Well, V-Sync would drop output to divisions of the refresh rate to ensure tearing didn't happen when the refresh rate and GPU output (FPS) were out of phase... This meant you could see drastic drops in FPS... no longer...

Metaphorically, it's similar to the machine-gun synchronisation used for forward-mounted WW1 aircraft guns... lol. Except in this metaphor, screen tearing = propeller destruction and FPS drops = the propeller stopping... not ideal.
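If you want to see the numbers, here's a toy sketch of the "divisions of the refresh rate" effect. This is a simplified model of classic double-buffered V-Sync, not actual driver behaviour, but it shows why a GPU dipping just below 60 FPS suddenly feels like 30:

```python
import math

def vsync_fps(gpu_fps, refresh_hz):
    """Effective frame rate with classic double-buffered V-Sync: the
    display repeats frames, so output snaps down to an integer
    division of the refresh rate (60 -> 30 -> 20 -> 15 ...)."""
    if gpu_fps >= refresh_hz:
        return float(refresh_hz)
    return refresh_hz / math.ceil(refresh_hz / gpu_fps)

def gsync_fps(gpu_fps, refresh_hz):
    """With a variable-refresh display, the panel refreshes when a
    frame arrives, so the displayed rate simply tracks the GPU up to
    the panel's maximum."""
    return float(min(gpu_fps, refresh_hz))

# A GPU dipping to 55 FPS on a 60 Hz panel: V-Sync snaps to 30 Hz,
# while a variable-refresh display just shows 55 Hz.
print(vsync_fps(55, 60))   # 30.0
print(gsync_fps(55, 60))   # 55.0
print(vsync_fps(20, 60))   # 20.0
```

So the 55 FPS case is exactly the out-of-phase scenario above: the refresh rate and GPU output don't divide evenly, and V-Sync pays for it.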

It's available; I just bought a monitor with a G-Sync module. It does a lot more than eliminate tearing: it provides a smooth gameplay experience without necessitating a high-end GPU keeping FPS pegged at 60+ on ultra settings. It also drastically reduces input lag.

You could've just turned on V-Sync with your system... Quad-SLI 680s (2x GTX 690) can easily run anything at 1080p without any issues resulting in lower FPS... In fact, you should be getting much better than 60 FPS at all times no matter what you play.

Also, mate, how is the scaling between one 690 and two with the limited PCIe lanes on the 2700K? And that PSU may need upgrading with essentially four 680s installed... just warning you. ;)
 

xxmorpheus

Member
Two 690s don't come anywhere close to using the 1250W of the PSU; maxed-out power consumption is a paltry 682 watts. http://www.tomshardware.com/reviews/radeon-hd-7990-devil13-7970-x2,3329-12.html

I can pretty much throw anything at 1080p and max it out, but input lag and micro-stuttering are horrible with V-Sync on. Where most people won't notice, I do. I have a 1440p monitor and I'm essentially downgrading because I hate micro-stutter and input lag so much; they're especially terrible in FPS games. In terms of scaling, the cards are always locked to 60 FPS anyway, so the scaling is only going to follow the refresh rate unless I'm running some crazy graphically intensive application like Crysis 3. I just keep it that way to control temps and not have everything running at nine gazillion FPS and baking me alive in my room, lol...

Okedokey, would you happen to know if I can use another GTX 680 as a dedicated PhysX card even with two 690s?
 

Okedokey

Well-Known Member
Two 690s don't come anywhere close to using the 1250W of the PSU; maxed-out power consumption is a paltry 682 watts. http://www.tomshardware.com/reviews/radeon-hd-7990-devil13-7970-x2,3329-12.html


Okedokey, would you happen to know if I can use another GTX 680 as a dedicated PhysX card even with two 690s?

Firstly, the 682W is only the cards; it doesn't include the CPU (add 200W with an overclock) or the rest of your system. Trust me, you're borderline, and if you don't believe me, measure it.

Yes, you can use a dedicated 680 alongside the quad SLI; however, your PSU is nowhere near powerful enough.
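The power-budget argument is just a sum; here's a rough sketch with the 682W figure quoted above. The 200W CPU number comes from the post, but the "rest of system" figure is an assumption for illustration, not a measurement:

```python
# Ballpark sanity check of the PSU headroom argument.
GPUS_W      = 682   # two GTX 690s at full load (Tom's Hardware figure quoted above)
CPU_OC_W    = 200   # overclocked CPU, per the estimate in the post
REST_W      = 120   # assumed: motherboard, RAM, drives, fans
PSU_RATED_W = 1250

total = GPUS_W + CPU_OC_W + REST_W
headroom = (PSU_RATED_W - total) / PSU_RATED_W
print(f"estimated draw: {total} W, headroom: {headroom:.0%}")
```

Around 1000W of estimated draw against a 1250W rating is roughly 20% headroom, and adding another ~190W card for PhysX eats most of that, which is the "borderline" point.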
 

G80FTW

Active Member
I didn't say the GPU controls the monitor; I said the monitor follows the GPU. Very different... read, man!

G-Sync works (as the name suggests) exactly as I said: it follows the GPU output dynamically, removing the need for V-Sync and its associated disadvantages. Secondly, this technology is limited by the upper threshold of the monitor's refresh rate, so no, it won't run at 120Hz unless the monitor could display 120Hz already...


So what you're saying is the GPU dictates, or "controls", the refresh rate of the monitor? How is that very different? If the monitor is "following" the graphics card, then the graphics card is "leading". Therefore, is the graphics card not controlling the refresh rate of the monitor in some respect?
 

Okedokey

Well-Known Member
So what you're saying is the GPU dictates, or "controls", the refresh rate of the monitor? How is that very different? If the monitor is "following" the graphics card, then the graphics card is "leading". Therefore, is the graphics card not controlling the refresh rate of the monitor in some respect?


You're wrong, as I indicated, because the G-Sync hardware is the control circuitry. The GPU has no idea that G-Sync is enabled beyond the driver.
 