Shlouski
VIP Member
Nice copy and paste from Extremetech. But it's not really what causes microstutter.
The reason behind it is that in SLI the GPUs render alternate frames, but some frames take longer to draw than others (explosions, etc.). This means one of the GPUs has to wait before it can send its data in order to stay in sync. Microstutter occurs when the length of that wait falls outside the tolerances required for smooth gameplay. In a perfect world at 60 fps, a frame has to be delivered every 16.66 ms, but realistically that never happens: frame delivery time often ranges from 8 ms to 25 ms. This is tolerable so long as the larger frame times don't occur too often; when they do, you get microstutter.
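To put numbers on that, here's a minimal sketch of the idea. The 1.5x tolerance factor and the sample frame times are made up for illustration, not taken from any driver or benchmark tool:

```python
# Given a list of per-frame delivery times in milliseconds, flag frames whose
# delivery time is far enough above the average to register as a stutter.
# The tolerance factor (1.5x the average) is a hypothetical threshold.

def find_stutters(frame_times_ms, tolerance=1.5):
    """Return indices of frames whose delivery time exceeds
    `tolerance` times the average frame time."""
    avg = sum(frame_times_ms) / len(frame_times_ms)
    return [i for i, t in enumerate(frame_times_ms) if t > tolerance * avg]

# 60 fps means a ~16.66 ms budget per frame; real delivery tends to
# bounce around it (here, roughly the 8-25 ms range, plus one big spike).
times = [16, 15, 17, 16, 25, 16, 8, 16, 33, 16]
print(find_stutters(times))  # -> [8], the 33 ms frame reads as a stutter
```

A run of mostly on-budget frames tolerates the occasional long one; it's when those long frames cluster that the output stops feeling smooth.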
That said, I've been running SLI on top end cards all my life and never experienced any problems. Nvidia's drivers have always been superb. Before I had my two 680's, I was running tri SLI 580's and before that tri SLI 480's. These setups have always given me stellar performance @ 2560x1600.
2560x1440, 2560x1600, 4K screens just around the corner.
Yes, 780's and 880's are also just around the corner, your point?
You have just explained what microstutter is, not what causes it. Yes, we all know microstutter is caused by frames being rendered at unevenly spaced intervals, but what causes that? I researched it and found that nobody seems to be 100% sure, but a lot come to the conclusion in my paste (I try to avoid typing if possible).