Bottlenecks, or how I learned to stop worrying and just buy the GPU.

Ihatethedukes

New Member
WARNING: THERE CAN AND WILL BE EXCEPTIONS TO THIS RULE. ALL RULES HAVE EXCEPTIONS IN ONE PLACE OR ANOTHER. PLEASE DO NOT POST ANOTHER 'HEY, IN MY SMALL, NICHE, AND HIGHLY EXCEPTIONAL CASE, YOUR GENERAL RULE IS WRONG!' YOU HAVE ALL BEEN RESPECTFUL SO FAR, AND I THANK YOU. THIS IS ONLY A PRECAUTION.


I hear the cry of "NO NOES BOTTLENECKZ!" all too often here and over on OCN, and have decided to take a more graphical approach to explaining why you should NOT buy a new, faster CPU and should instead buy a bigger monitor.

What IS a bottleneck? A condition in a system where one component is slower than the rest at its given task, causing the other components to sit around and 'wait' for it before they can continue their work.

A perfect example of this is a steel mill. The raw materials, iron and whatnot, are brought in by truckers. The workers then melt it down, pour it, press it into long flat sheets, roll it up, and ship it out on trucks. A bottleneck in this system would come from truckers being unable to ship in raw materials as fast as the workers use them to make steel. The reason is obvious: the workers cannot make rolled steel if they have no steel to roll. So they are stuck waiting around for a truck to come in before they can get back to their task.

Your computer playing a video game is the same way. If your CPU is markedly slower than your GPU, it will bottleneck the GPU down to the maximum speed at which the CPU can process AI/physics/etc. It doesn't matter if your GPU can output 1000 FPS; if your CPU can only execute 100 FPS worth of instructions, your GPU will be waiting around for your CPU to finish its job (the equivalent of a truck coming in), then quickly process the graphical data and wait around for another batch (truck) to arrive.

The same goes in the opposite direction: a GPU only able to put out 100 FPS paired with a CPU that can output 1000 FPS will still only output 100 FPS, for the same reasons.

Something most people don't take into account is that MOST LCD MONITORS ONLY PUT OUT 60 Hz, or 60 FPS. Therefore, even if your CPU AND GPU can output 1000 FPS, your monitor will only show 60 FPS. That is the biggest bottleneck in MOST high-end machines with less than enormous monitors.

To FURTHER muck things up, the eye as an organ cannot tell the difference past 75 FPS, and cannot distinguish between 14 individual images per second and real motion. As far as rendered images go, the lowest playable frame rate is ABOUT 30-40 FPS, for various reasons I don't feel the need to go over; simply knowing that 30-40 FPS is what we need for fluid gameplay is enough. So, if you get 10000 frames of a video flashed at you in a second, you will see the same thing as if only 40 frames of that same video were flashed at you in that second. I will mercifully IGNORE this problem and focus only on the bottlenecks at the hardware level.

As stated before, the general rule FOR ALL LCD MONITORS is that at its max resolution, a monitor will only be able to output 60 FPS. For the sake of this argument, let's assume that 60 Hz is the max FPS a monitor can put out across its entire resolution range, since we're assuming that the monitor you own has the max resolution you're going to use. This is a HUGE and completely immovable bottleneck in the system. 60 FPS.
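To put the whole chain in one place, here's a minimal sketch of the point so far (with made-up numbers, purely for illustration): the frame rate you actually see is the minimum of what the CPU, the GPU, and the monitor can each deliver.

```python
# Minimal sketch of the argument so far, with illustrative numbers only:
# what you actually see is limited by the slowest link in the chain.

def effective_fps(cpu_fps, gpu_fps, refresh_hz=60):
    """Frames per second that actually reach the screen."""
    return min(cpu_fps, gpu_fps, refresh_hz)

print(effective_fps(cpu_fps=100, gpu_fps=1000))  # 60 -> the monitor is the bottleneck
print(effective_fps(cpu_fps=1000, gpu_fps=100))  # 60 -> still the monitor
print(effective_fps(cpu_fps=175, gpu_fps=45))    # 45 -> only now does the GPU limit you
```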

What I am about to post is a series of charts and explanations, produced by me, that illustrate how bottlenecks SHOULD be handled, rather than how the parroted response around this forum says to handle them, despite my best efforts to the contrary.

[Attached chart: CPU, GPU, and 60 Hz monitor FPS plotted against resolution]


Please note that ONLY THE FPS THAT BOTH THE CPU AND THE GPU CAN OUTPUT COUNTS TOWARD THE FINAL RENDERED IMAGE. The curve generated for the GPU is based on the raw pixel load caused by higher resolutions, relative to 800x600 resolution. I tuned it to fairly closely approximate the 8800GTX numbers found on the FEAR bench in the review found HERE. The CPU load does not change from resolution to resolution, as it only deals with AI/physics/plain calculation for the game, so its FPS is effectively a FLAT LINE at whatever speed it's capable of; from the data in the FEAR test, that's clearly about 175 FPS. The load on the GPU gets progressively greater as the resolution goes up, and the FPS at 12x10 and above clearly and closely follow the curve I derived solely from pixel load.

So, the point at which the GPU curve and the CPU line intercept (which is very close to 12x10) is where NEITHER THE CPU NOR THE GPU IS BOTTLENECKING THE OTHER. This point will be referred to as the CPU-GPU bottleneck point (CGBP) from now on. The intercept of either the CPU or the GPU with the monitor refresh rate is another bottleneck point. What you will also notice is that THE 60 Hz MONITOR REFRESH RATE IS THE BOTTLENECK AT 60 FPS ALL THE WAY UP TO THE INSANE RESOLUTION OF 2560x1600!!! This effectively renders the CPU COMPLETELY IRRELEVANT, as it's ALWAYS capable of an FPS higher than the monitor's 60 FPS. This monitor/GPU intercept will be referred to as the true bottleneck point (TBP) from now on. We can now completely ignore the CPU in all instances where the CPU can only bottleneck a GPU ABOVE 60 FPS, which is pretty much ALL cases!
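For the curious, here's a rough Python reconstruction of the model behind that chart. The numbers are my own approximations, not the actual FEAR data: the GPU curve is assumed to fall off with raw pixel load relative to 800x600, and the 480 FPS baseline is simply picked so the curve crosses the flat ~175 FPS CPU line near 1280x1024 and the 60 Hz line near 2560x1600, as described above.

```python
# Toy reconstruction of the chart above (my own approximation, not benchmark data):
# GPU FPS scales inversely with pixel load relative to 800x600, the CPU line is
# flat at ~175 FPS, and the monitor refreshes at 60 Hz.

CPU_FPS = 175.0        # flat line: AI/physics/game logic, independent of resolution
REFRESH_HZ = 60.0      # typical LCD refresh rate
GPU_BASELINE = 480.0   # assumed GPU FPS at 800x600 (illustrative)

RESOLUTIONS = [(800, 600), (1024, 768), (1280, 1024),
               (1600, 1200), (1920, 1200), (2560, 1600)]

def gpu_fps(width, height):
    """GPU FPS estimated purely from pixel load relative to 800x600."""
    pixel_load = (width * height) / (800 * 600)
    return GPU_BASELINE / pixel_load

for w, h in RESOLUTIONS:
    g = gpu_fps(w, h)
    # The slowest link wins: this is the FPS that actually reaches the screen.
    limiter = min((CPU_FPS, "CPU"), (g, "GPU"), (REFRESH_HZ, "monitor"))
    print(f"{w}x{h}: GPU ~{g:.0f} FPS -> you see {limiter[0]:.0f} FPS "
          f"(limited by the {limiter[1]})")
```

Running it, the monitor is the limiter at every resolution until 2560x1600, where the GPU finally drops below 60 FPS — which is the whole point of the chart.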

Proof? http://www.benchzone.com/page.php?al=8800GTX_cpu_scaling&pg=1

[Attached charts: 8800GTX CPU-scaling results from the benchmark linked above]


The above two images illustrate that, essentially, no matter how slow your CPU is, it will never be the bottleneck. A slower CPU or GPU moves the CGBP, but it will not change the TBP unless it drops below 60 FPS. Logic tells us that unless your CPU is CURRENTLY BOTTLENECKING your games at lower than 60 FPS, it will NEVER EVER be the true bottleneck in your games.

That's a huge step, is it not? It just told nearly EVERY CPU owner that upgrading their CPU is COMPLETELY VOLUNTARY and absolutely not necessary, because their monitor bottlenecks at a point so much lower than the CPU that it's pointless to even think about!

What you can do to eliminate the monitor bottleneck is increase your resolution to 2560x1600 or the notch below; in other words, buy a huge-ass monitor. You'll not only lose the bottleneck but gain a GIGANTIC and VERY BEAUTIFUL game image!

[Attached chart: the same plot with the GPU curve lowered to simulate a slower GPU]


As you can see here, we simulate a slower GPU (or an increased GPU load, but that comes later) by dropping the GPU curve, and now the CPU/GPU bottleneck point shifts dramatically back to 800x600 resolution! You'll also notice that the monitor/GPU bottleneck point shifts WAY back too, all the way to ~1600x1200!
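You can reproduce that shift with the same toy model from the earlier sketch just by dropping the GPU's assumed 800x600 baseline to around the CPU's ~175 FPS (again, an illustrative number, not a benchmark result):

```python
# Same toy model as before, but with the GPU baseline dropped to ~175 FPS at
# 800x600 to stand in for a slower card. The CPU-GPU crossover now sits right at
# 800x600, and the GPU falls under 60 Hz a bit before 1600x1200.
CPU_FPS = 175.0
REFRESH_HZ = 60.0
SLOW_GPU_BASELINE = 175.0  # illustrative FPS at 800x600 for a slower GPU

for w, h in [(800, 600), (1024, 768), (1280, 1024), (1600, 1200), (2560, 1600)]:
    gpu = SLOW_GPU_BASELINE / ((w * h) / (800 * 600))
    limiter = min((CPU_FPS, "CPU"), (gpu, "GPU"), (REFRESH_HZ, "monitor"))
    print(f"{w}x{h}: GPU ~{gpu:.0f} FPS -> limited by the {limiter[1]}")
```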

So, we just relearned that your monitor bottlenecks your GPU. We've also established that a slower GPU drops the TBP significantly. (This next part is also for those of you saying to yourselves, "Hey, but I don't have enough $$$ for a 2560x1600 monitor to reach my TBP.") Now, we must look at matching the output of the GPU to the refresh rate of your monitor. How do we do that? One of three ways.

1) Slow down your GPU. This is stupid. Why would you spend $$$ on a kickass fast GPU just to slow it down for a bottleneck? F%^& that noise.

2) Raise your monitor's refresh rate. Essentially all LCD monitors are 60 Hz at their max res, so this is impractical.

3) Lower the curve of your GPU until its output is nearly the same as your monitor's.

[Attached chart: the GPU curve lowered further by increasing the load per frame]


Revisiting the last chart, you'll notice that the lower the curve, the lower your TBP. Lowering the curve can be done NOT ONLY by slowing down your GPU but also by INCREASING THE LOAD on it. You need to effectively force your GPU to do more work at the same resolution. This amounts to, "TURN UP THE EYE CANDY TO MAX, BABY! YEAH!" So, we've established that buying a bigger, prettier monitor and upping the eye candy (HDR/AA/aniso/motion blur) eliminates your monitor/GPU bottleneck. Wow, isn't that awesome? The cure for a bottleneck is to make everything PRETTIER? That's the greatest idea ever. Buying a new, faster CPU will only exacerbate the difference between what your CPU/GPU can output and what your monitor can display. To hell with buying a new CPU that will only allow my GPU to render images THAT NO ONE WILL EVER SEE! That's right, your GPU would be rendering frames just to throw them away, because your monitor cannot display as many as the GPU puts out! Why would you even do that? WHY?! It's utterly pointless.
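As a rough illustration of option 3, the same assumed pixel-load model can tell you how much extra per-frame work (AA, aniso, HDR, and so on) a panel can absorb before the GPU, rather than the monitor, becomes the limit. The "settings_load" multiplier, the 480 FPS baseline, and the 1680x1050 panel are all made up for the example:

```python
# Toy estimate of how much eye candy you can pile on before the GPU drops below
# the monitor's refresh rate. "settings_load" is a made-up multiplier for the
# extra per-frame work the settings add; 480 FPS at 800x600 is an assumed baseline.

GPU_BASELINE = 480.0
REFRESH_HZ = 60.0

def gpu_fps(width, height, settings_load=1.0):
    pixel_load = (width * height) / (800 * 600)
    return GPU_BASELINE / (pixel_load * settings_load)

native = (1680, 1050)                      # hypothetical monitor's native resolution
headroom = gpu_fps(*native) / REFRESH_HZ   # how far above 60 FPS the GPU sits
print(f"At {native[0]}x{native[1]} you can add roughly {headroom:.1f}x more load "
      f"before the GPU, not the monitor, becomes the bottleneck")
```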

So, go ahead and buy your 8800GTX... but don't get a Conroe/MoBo/DDR2 too. Just save the $500 from the proposed upgrade, buy a huge monitor, and CRANK the EYE CANDY!
 

diduknowthat

formerly liuliuboy
You seem to be forgetting that not all games run at a constant FPS. When shit goes down in games and everything is going crazy, your FPS is going to take a HUGE dip.

My computer runs Crysis on High at around 60-70 FPS if I'm just standing there looking at trees swaying. However, when there are 10 people running around trying to shoot me, my FPS drops to ~30-40.

If you buy hardware that can just barely crank out 60 FPS, chances are there'll be many times when the system can't handle the extra load and you will LAG.
 

TFT

VIP Member
Good article :good:

It's been known for some time that if the FPS is higher than the display refresh rate (normally 60 Hz), then those frames are lost and "tearing" can happen.

Buy one of these with a true 120 Hz refresh rate: the Samsung SyncMaster 2233RZ.
 

Ihatethedukes

New Member
This is actually a really old article that I wrote. The CPU FPS has since been shown not to be a straight line so much as a downward curve with MUCH less slope than the GPU's. AA has also been shown to take a toll on the CPU-limited curve too.

Tearing ALWAYS happens when vsync isn't on and you're not getting exactly 60 FPS, period (assuming your LCD has a 60 Hz refresh). It's all a matter of where it tears (top, bottom, middle) and how often. I'll dig up another bottleneck article I did that is a bit more sophisticated.

Also, the advent of the 5-series GPUs and 120 Hz LCDs really changes things.

The comment about FPS being dynamic is a good one. However, the CPU bottleneck people talk about is not often what causes those dips. (If you have evidence otherwise, please show me.) Perhaps another run looking at minimum FPS would be as informative or more so. (The bench I used doesn't show minimums.)
 

linkin

VIP Member
Thanks for the article. However, every monitor is different. E.g., my old 22" CRT could run 1600x1200 with a 95 Hz refresh rate; my current 17" LCD runs 1280x1024 at 75 Hz.

:good:
 