2x SLI NVIDIA 670MX 3GB vs NVIDIA 680M 4GB

Nice copy and paste from ExtremeTech. But it's not really what causes microstutter.

The reason behind it is that the GPUs render alternate frames, but some frames take longer to draw than others (explosions etc). This means that one of the GPUs has to wait before it can send its data in order to maintain synchronization. Microstuttering occurs when the length of time this takes falls outside of the tolerances required for smooth gameplay. In a perfect world at 60 fps, a frame has to be rendered every 16.66ms, but realistically this never happens. Frame delivery time often ranges from 8ms to 25ms. This is "tolerable" just so long as the larger frame times don't occur too often, which is when you get microstutter.
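
To make those numbers concrete, here is a minimal Python sketch. The frame times in it are invented example data and the 16.66ms/25ms figures are simply the ones quoted above; it just shows how a log can look perfectly healthy as an average FPS number while a noticeable share of frames still falls outside tolerance.

```python
# Minimal sketch (my own example, not from the thread): given per-frame
# delivery times in milliseconds, report the headline average FPS alongside
# how often frames blow past a chosen tolerance. The 16.66 ms target and
# 25 ms tolerance are the figures quoted above; the sample data is invented.

def frame_time_report(frame_times_ms, target_ms=16.66, tolerance_ms=25.0):
    n = len(frame_times_ms)
    avg = sum(frame_times_ms) / n
    return {
        "average_fps": round(1000.0 / avg, 1),    # what an FPS counter shows
        "worst_frame_ms": max(frame_times_ms),    # a single spike felt as a hitch
        "over_target_pct": round(100.0 * sum(t > target_ms for t in frame_times_ms) / n, 1),
        "over_tolerance_pct": round(100.0 * sum(t > tolerance_ms for t in frame_times_ms) / n, 1),
    }

# Mostly 13-18 ms frames with occasional 27-28 ms spikes ("explosions etc.")
sample = [14, 15, 13, 18, 27, 14, 15, 20, 13, 14, 15, 28, 14, 13, 15, 16]
print(frame_time_report(sample))
```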

That said, I've been running SLI on top-end cards all my life and never experienced any problems. NVIDIA's drivers have always been superb. Before I had my two 680s, I was running tri-SLI 580s, and before that tri-SLI 480s. These setups have always given me stellar performance at 2560x1600.



2560x1440, 2560x1600, 4K screens just around the corner.

Yes, 780s and 880s are also just around the corner; your point?
You have just explained what microstutter is, not what causes it. Yes, we all know microstutter is caused by frames being rendered at unevenly spaced intervals, but what causes that? I researched it and found that nobody seems to be 100% sure of the cause, but a lot of people come to the conclusion in my paste (I try to avoid typing if possible :P). By the way, it has already been shown that ALL single and multi-GPU setups suffer from microstutter to a degree; it's just worse on multi-GPU setups. From what I've read on a lot of different forums, people usually either don't see it at all or see it all the time, like me. As I have said before, some people seem to be more sensitive to microstutter than others. It's possible that if I were to play on your setup I could see the microstutter and you couldn't, unless it got bad enough for you to see. There have also been one or two games where I couldn't see microstutter on my SLI setups; the microstutter was still there, just not severe enough for me to detect it.
 

My point is that your statement about 1920x1080 being the highest resolution monitors can do is wrong, and has been wrong for the best part of a decade.

I don't think you've read my post properly, because I clearly did explain what causes microstutter. It's obvious from your initial post that you just entered "microstutter" into Google and paraphrased a section from the first result it returned.

The PCI bus speed only comes into play when you have to run one (or more) of the cards in an x4 slot. However, as stuttering also occurs on systems whose motherboards can run all cards at x16 or x8, this clearly does not explain the cause of the problem. As I have stated before, it comes down to varying frame draw times and the driver's ability to maintain synchronization between cards during lapses. If one of the GPUs has to wait for one or more of the other cards to draw a frame before it can send its own data, this will obviously create a discrepancy if it falls outside of the tolerable 25ms limit. Since some frames contain more data to render than others, this variation will always exist.

SLI is still the answer for buttery-smooth gameplay; you just need two whopping cards rather than two mediocre ones. This creates enough framerate headroom for the stuttering to become a non-issue, since the number of frames being drawn per second is directly related to draw time.
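
To illustrate both points, here is a deliberately simplified toy model in Python. It is not how any real driver schedules or meters frames, and every number in it is invented: two GPUs alternate frames, each starts its next frame as soon as it is free, and frames are displayed in order. Uneven draw times turn into uneven on-screen gaps, and halving the draw times (i.e. faster cards) pulls most of those gaps back under a 60Hz display's 16.66ms budget.

```python
# Toy alternate-frame-rendering (AFR) model, written only to illustrate the
# argument above. GPU g renders frames g, g+2, ...; each GPU starts its next
# frame as soon as it is free; frames are displayed in order. The gaps
# between displayed frames are what the player actually perceives.
import random

def simulate_afr(draw_times_ms, num_gpus=2):
    gpu_free_at = [0.0] * num_gpus
    last_present = 0.0
    presents = []
    for i, draw in enumerate(draw_times_ms):
        gpu = i % num_gpus
        finish = gpu_free_at[gpu] + draw        # this GPU's frame completes here
        gpu_free_at[gpu] = finish
        present = max(finish, last_present)     # frames cannot appear out of order
        presents.append(present)
        last_present = present
    return [b - a for a, b in zip(presents, presents[1:])]

random.seed(1)
# Invented workload: ~20 ms frames with occasional 35 ms spikes (explosions etc.)
mediocre = [random.choice([18, 20, 22, 35]) for _ in range(200)]
# "Whopping" cards rendering the same scenes roughly twice as fast
whopping = [t / 2 for t in mediocre]

for label, draws in (("mediocre SLI", mediocre), ("high-end SLI", whopping)):
    gaps = simulate_afr(draws)
    over = 100 * sum(g > 16.66 for g in gaps) / len(gaps)
    print("%-12s worst gap %.1f ms, %4.1f%% of gaps over 16.66 ms" % (label, max(gaps), over))
```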

I'm perplexed as to why you feel you would be more discerning with regard to visual performance than any other enthusiast on a forum such as this.
 

What I said was (sorry if this paste offends you): 'my 680 certainly can, it can also handle 1920x1080, which is the highest more or less all monitors can do at the moment'. MORE OR LESS DOES NOT MEAN ALL; there are probably monitors out there that can do more than 2560x1600, I don't know, and yet you have a go at me for not reading Oo. Yes, I put microstutter into Google, as I have done many times before, since it's the best resource I have to research with; I could try writing microstutter into the soil in my garden, but at a guess that wouldn't be so fruitful. Whether or not you agree about the cause of microstutter doesn't matter; it still exists. As for me supposedly being more discerning with regard to visual performance than any other enthusiast on a forum, I find it hard to believe you have spoken to everybody on this forum about this, and for anyone reading, I ask you to google around a bit and you will find a whole load of people who agree with me. Why so defensive? I'm just trying to get to the truth. I've tried to meet you half way: I believe you can't see it, and I'm just suggesting that some people can see microstutter and some can't, so maybe I'm in the minority, but it's proven to be there and I and others I know in real life CAN DAMN WELL SEE IT, so don't suggest I can't.
 
What I said was (sorry if this paste offends you): 'my 680 certainly can, it can also handle 1920x1080, which is the highest more or less all monitors can do at the moment'. MORE OR LESS DOES NOT MEAN ALL; there are probably monitors out there that can do more than 2560x1600, I don't know, and yet you have a go at me for not reading Oo.

Don't worry, I'm not easily offended, but it seems we're going to have to agree to disagree that there are (and have been for many years) vast numbers of people using resolutions above 1080p.

Yes, I put microstutter into Google, as I have done many times before, since it's the best resource I have to research with; I could try writing microstutter into the soil in my garden, but at a guess that wouldn't be so fruitful.

I have no issue with people doing research via Google. What I take issue with is that you just took the very first result that came back when you typed "microstutter" and then proceeded to paraphrase the article without really understanding the topic.

As for me supposedly being more discerning with regard to visual performance than any other enthusiast on a forum, I find it hard to believe you have spoken to everybody on this forum about this, and for anyone reading, I ask you to google around a bit and you will find a whole load of people who agree with me. Why so defensive? I'm just trying to get to the truth. I've tried to meet you half way: I believe you can't see it, and I'm just suggesting that some people can see microstutter and some can't, so maybe I'm in the minority, but it's proven to be there and I and others I know in real life CAN DAMN WELL SEE IT, so don't suggest I can't.

My point was that you are assuming your brain is more capable of detecting microstutter than other people's, which is an odd assumption to make on a computer enthusiasts' forum.

I'm not defensive; I've just attempted to cover the issue adequately for the benefit of other forum members. It is you who has thrown your toys out of the pram with poorly punctuated, irrational posts.

As I explained before, of course microstutter exists, but it can be mitigated by using multiple high-end cards, as the added FPS acts as a buffer, or headroom. I would question the amount of experience you really have with high-end SLI configs, based on the nature of your previous posts.
 
I think you three or four fell off the train a few miles back; you might want to get back on. The question is which card would serve the OP better, not a rant about technology.
 
Back on track (sorry for the pun, Fury), I would recommend the 680m. Not because I have an issue with SLI or anything; it's just that, as a laptop user, I prefer battery life to the few extra frames you may get from the SLI setup. On top of that, the 680m also has higher bandwidth and is only 576 cores behind the SLI setup.
 
I completely understand the effects of microstutter; it is a very simple concept to grasp. Microstutter is the name given to the symptoms, not the cause. The causes of microstutter are debatable, and as far as I've seen nobody is 100% sure what causes it. All I did was paste in the general consensus instead of putting it into my own words; it was just faster than typing it out.

'It is you who has thrown your toys out of the pram with poorly punctuated, irrational posts': this is completely unnecessary and only serves to strike out at me. I'm sorry for the poor punctuation; I wasn't really overly concerned about it, and to be honest it has never been my strong point. I was always bad at English but great at math :P. As for irrational, as far as I can see I'm making complete sense, though an irrational person would say that.

'I would question the amount of experience you really have with high-end SLI configs, based on the nature of your previous posts': here you are just insinuating that I am a liar, which I am not. Of course I can't prove I'm not lying, just like you can't prove that I am. I ask anyone who is considering a multi-GPU setup to do their homework and, if possible, play on a computer with multiple GPUs, so that they know whether or not microstutter could be an issue for them. This last sentence, at least, I think is good advice for the OP.
 
Since this has gone wildly off topic, I will keep my final response as brief as possible.


I completely understand the effects of microstutter; it is a very simple concept to grasp. Microstutter is the name given to the symptoms, not the cause. The causes of microstutter are debatable, and as far as I've seen nobody is 100% sure what causes it.

No, we have a good understanding of what causes microstutter, as I have outlined in my previous posts.

Taken from Wikipedia:

"As of May 2012, with the latest release of hardware and drivers from nVidia and AMD, AMD's Radeon HD 7000 series is severely more affected by micro stuttering than nVidia's GeForce 600 Series. In tests performed in Battlefield 3, a configuration with two GeForce GTX 680 in SLi-mode showed a 7% variation in frame delays, compared to 5% for a single GTX 680, indicating virtually no micro stuttering at all."


'It is you who has thrown your toys out of the pram with poorly punctuated, irrational posts': this is completely unnecessary and only serves to strike out at me. I'm sorry for the poor punctuation; I wasn't really overly concerned about it, and to be honest it has never been my strong point. I was always bad at English but great at math :P. As for irrational, as far as I can see I'm making complete sense, though an irrational person would say that.

Ok then.

'I would question the amount of experience you really have with high-end SLI configs, based on the nature of your previous posts': here you are just insinuating that I am a liar, which I am not. Of course I can't prove I'm not lying, just like you can't prove that I am. I ask anyone who is considering a multi-GPU setup to do their homework and, if possible, play on a computer with multiple GPUs, so that they know whether or not microstutter could be an issue for them. This last sentence, at least, I think is good advice for the OP.

Ok then.

And to the OP, go for the 680m.
 
'I would question the amount of experience you really have with high-end SLI configs, based on the nature of your previous posts'

His current and previous rig that I know of.

None of the setups you have listed as having owned are anywhere near high end.
 

8800GT, 5870, GTX 275 not considered high end when they first came out Oo. These are just some of the ones I owned.
 
Certainly not. The 8800GT was mid-range, as were the others. Spesh's experience is closer to tri-SLI 8800GTXs for that time, as is mine. Never had issues with microstutter.

But I think someone nailed it before: if, as it appears, there are very few posts on Computer Forum about microstutter, it may not be a real issue.
 

Seriously, not high-end cards? :confused: By your reasoning, anything below a GTX Titan now would be a mid-range card. So what if there were one or two cards out there that were a little better; it doesn't mean it was suddenly a mid-range card. These two cards are very close performance-wise, and the 8800GT sometimes even comes out on top.
Just 1 example: http://www.anandtech.com/show/2365/9

I think your statement is so ridiculous that there's no need for me to continue on to the other cards. I think the truth is clear for all to see with a bit of googling and comparing against the other cards out at the time.
 
Dude, I love the 8800GT. I had several. The 8800GT was the best bang for buck at the time, but it wasn't high end.

(attached image: photo.jpg)
 
I would have believed you'd had an 8800GT if you had just told me. They were pretty damn expensive when they first came out, though of course nowhere near as expensive as an 8800GTX. I still think it was a high-end card, just like the 275, which was second only to the 285 at the time.
 
The 275 was certainly mid-range, as was the 5870. If you were running SLI/CF on those cards, then I'm not surprised you had issues.

Also remember that multi-GPU was a relatively immature technology back then. It has come a long way since.
 
I take it my 680 is mid-range too then, and the Titan is high end. So my 680, being mid-range, would suck in SLI then.
 

No, your 680 is obviously a high-end card. The Titan is an anomaly, in that there has not really been anything like it in the GTX range before. It is extremely specialist and costs the same as two 680s. I'm talking about resolutions above 2560x1600 or multi-monitor setups, hence the massive 6GB VRAM buffer.

The 275 only had the 260 below it. It was trumped by the 280, 285 and 295.
 

This is incorrect. The 275 was basically as powerful as the 280; don't believe me, check it out. When I bought the 275, the 295 didn't even exist to be able to trump it, though it did come soon after. The only card really a tiny bit better at the time was the 285, which usually beat it by 2 or 3 frames.
http://www.xbitlabs.com/articles/graphics/display/evga-gf-gtx275-1792mb_6.html#sect1
Here is the 275, classed as a high-end card by NVIDIA:
http://en.wikipedia.org/wiki/GeForce_200_Series
 
In fact, I bought these two 275s straight from the distributor. Looking at the serial numbers, which were identical except for the last digit, it turned out that only one card had been made between these two and they were basically brothers :P. This, I think, would be the best situation for SLI, cards which are as identical as possible, but they microstuttered like crazy :(
 
This is incorrect. The 275 was basically as powerful as the 280; don't believe me, check it out. When I bought the 275, the 295 didn't even exist to be able to trump it, though it did come soon after. The only card really a tiny bit better at the time was the 285, which usually beat it by 2 or 3 frames.

The 280 had more memory, a larger memory interface, and more ROP units. Hence it was better at higher resolutions with the eye candy turned up.

In fact, I bought these two 275s straight from the distributor. Looking at the serial numbers, which were identical except for the last digit, it turned out that only one card had been made between these two and they were basically brothers :P. This, I think, would be the best situation for SLI, cards which are as identical as possible, but they microstuttered like crazy :(

Having cards from the same batch would only affect overclocking potential. It would have no impact on SLI performance.

I had a 295, which also stuttered (as I said, it was relatively early days for SLI). Stuttering is all but non-existent on high-end 600 series cards.
 