What determines the FPS?

Vizy

New Member
I was just wondering what determines the FPS. Is it the graphics card, or is it the game? Let me show you a screenshot real quick. See, what happens is that my FPS never goes higher than 30! I'm using integrated graphics, an ATI Radeon 1200, and I never get above 30 FPS. To me the gameplay looks perfectly smooth, so it's not a real issue; I was just curious.
 

Langers2k7

New Member
Your framerate is a result of the performance of your whole computer - every part contributes towards it. However, the biggest bottleneck is caused by your graphics card or your CPU.
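To put that in concrete terms, here's a toy model (my own sketch, nothing official, and it ignores that the CPU and GPU overlap work in a real pipeline): each frame can only finish when the slower of the two is done, so the slowest part sets your framerate.

```python
# Toy bottleneck model: the slower of the CPU and GPU work per frame
# determines the frame time, and therefore the FPS.

def fps(cpu_ms_per_frame, gpu_ms_per_frame):
    frame_time_ms = max(cpu_ms_per_frame, gpu_ms_per_frame)  # slowest part wins
    return 1000.0 / frame_time_ms

print(fps(cpu_ms_per_frame=5, gpu_ms_per_frame=33))   # ~30 FPS, GPU-bound
print(fps(cpu_ms_per_frame=20, gpu_ms_per_frame=8))   # 50 FPS, CPU-bound
```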
 

Archangel

VIP Member
I second that :) It is determined by your whole computer (the graphics card is usually the part holding you back from higher FPS, though ;))
 

evil-xxx

New Member
I don't know why some games limit the max FPS, such as the 30 FPS cap in Resident Evil 4.
I feel faint and want to puke after playing for a short while...
 

hermeslyre

VIP Member
RE4 was designed for consoles; 30 FPS was probably the most the GameCube could spit out. When it was ported to the PC, that FPS restriction must not have been removed.

I played RE4 on the PC, and 30 FPS was fine. Maybe it was just the over-the-shoulder view..? Or the muddy graphics, lol. I modded RE4 to the limits, mostly graphical tweaks: high-resolution GC textures, high-res meshes, etc.
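For what it's worth, a hard cap like that is simple to implement. Here's a minimal sketch of the usual approach (my own illustration, not Capcom's actual code): the game loop just sleeps away whatever is left of each frame's time budget.

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS  # ~33.3 ms per frame

def run_capped(update_and_render, frames=300):
    """Run the given per-frame callback, never exceeding TARGET_FPS."""
    while frames > 0:
        start = time.perf_counter()
        update_and_render()
        # Sleep off the rest of the frame budget so we never run faster than 30 FPS.
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)
        frames -= 1

if __name__ == "__main__":
    run_capped(lambda: None, frames=90)  # ~3 seconds of an (empty) 30 FPS loop
```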
 

evil-xxx

New Member
Yep, thanks to its view; I couldn't stand it if it were first-person. The game's textures suck, but the models are OK. Still, RE4 bloody kicks ass.
 

Geoff

VIP Member
Nothing wrong with 30. Last I checked, films played at 24 FPS. If it feels playable, don't worry about it.
People are always getting these sorts of numbers mixed up. Film does run at 24 FPS (NTSC video runs at 29.97), but each film frame is blurred into the next; take a screenshot of a movie and you'll see the motion blur. Games, on the other hand, are displayed like photographs: if you are playing at 30 FPS, only 30 discrete, sharp images are being shown per second.

While games may be playable at a constant 30 FPS, when the framerate fluctuates you will notice a lot of lag, especially when it dips below 30 FPS.
 

evil-xxx

New Member
[-0MEGA-];906623 said:
People are always getting these sorts of numbers mixed up. Film does run at 24 FPS (NTSC video runs at 29.97), but each film frame is blurred into the next; take a screenshot of a movie and you'll see the motion blur. Games, on the other hand, are displayed like photographs: if you are playing at 30 FPS, only 30 discrete, sharp images are being shown per second.

While games may be playable at a constant 30 FPS, when the framerate fluctuates you will notice a lot of lag, especially when it dips below 30 FPS.

OK, looks like you're pretty pro at video tech. Could you answer a question that's been puzzling me for ages? Thanks. :p
 

evil-xxx

New Member
As we all know, the usual refresh rate for an LCD is 60 Hz, which means 60 images per second. Of course you can also switch your CRT to 60 Hz, which means the same thing as the LCD.
So the question is: even though the monitor can only show us 60 images per second, why do games running at more than 60 FPS still feel noticeably smoother?
 

hermeslyre

VIP Member
evil-xxx said:
As we all know, the usual refresh rate for an LCD is 60 Hz, which means 60 images per second. Of course you can also switch your CRT to 60 Hz, which means the same thing as the LCD.
So the question is: even though the monitor can only show us 60 images per second, why do games running at more than 60 FPS still feel noticeably smoother?

May I butt in here? Your first assessment is correct: the vertical refresh rate caps how many distinct images the monitor can show, so it effectively caps the visible FPS. With that in mind, how can a game show an obvious difference on an LCD between 60 FPS and, say, 100? It can't. Could be that people who see the difference are just looking too hard, if you get my meaning.

An average of 30 FPS in a game is fine. Dips are obviously going to happen; unless you're extremely anal, you can get through them with a minimum of self-inflicted damage. I haven't got a top-of-the-line card, so I play games at 20-25 FPS as much as I can, pushing the settings up until it hits that range. Once you stop letting the disappointment of low framerates get to you, you tend to stop noticing the little bumps along the way. Just my opinion.
 

evil-xxx

New Member
Welcome, everyone, join the discussion lol.
I can't quite agree with you... have you ever played Quake 3 or Unreal Tournament? Even my sister can tell the difference between 70+ FPS and 200 FPS. In my opinion it's obvious, especially in fast-paced games.
 

hermeslyre

VIP Member
I've especially heard people citing a very fast game like UT. I agree that UT is frickin' fast, and 30 FPS will not cut it for smooth gameplay. But this is how I understand the rest: the video card produces a frame and stores it in the card's frame buffer, a region of memory capable of holding more than one frame at a time. The monitor fetches a new image on each refresh signal; with a vertical refresh rate of 60 Hz, that's roughly every 16.7 ms. The image is then scanned out of the buffer and "painted" onto the screen line by line.

All this mumbo jumbo said... I can't see that there would be a noticeable difference like people claim, unless you're using a CRT with refresh rates capable of exceeding 100 Hz. Not that I'm saying you're wrong, just rambling about what I know and think. I've actually spent some time searching for a definitive answer on this and come up far short. People claim they can see the difference between 100 and 200 FPS; nobody disputes the technical side, only the limits of human perception, and I can't understand it.
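One thing that might partly explain it (my own speculation, with a toy simulation to match, not anything definitive): even when the monitor only shows 60 images a second, rendering faster means the frame grabbed at each refresh is more recent, so the game reacts to your input with less delay.

```python
# Toy model: average "age" of the most recently finished frame at each
# 60 Hz refresh tick, for different render rates (vsync off, no tearing
# modeled). Fresher frames = lower input-to-display latency.

REFRESH_HZ = 60

def avg_frame_age_ms(render_fps, seconds=1.0):
    refresh_period = 1.0 / REFRESH_HZ
    render_period = 1.0 / render_fps
    ages = []
    for i in range(int(seconds * REFRESH_HZ)):
        t = i * refresh_period
        # Time at which the last completed frame finished rendering.
        last_frame_done = (t // render_period) * render_period
        ages.append((t - last_frame_done) * 1000.0)
    return sum(ages) / len(ages)

for fps in (30, 60, 100, 200):
    print(f"{fps:>3} FPS -> displayed frame is ~{avg_frame_age_ms(fps):.1f} ms old on average")
```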
 

fortyways

banned
I just turn my settings down. Actually, the only game I play right now is BF2, and I get 70-130 FPS in that on medium/low settings at 1440x900, no AA or anything.

I also sometimes turn the settings up to high and play at 30 FPS, depending on my mood.
 

Narzinor

New Member
I'm fairly sure the reason a lot of faster games look much better at framerates above 60 is that when it says 60, it's showing you an average FPS. Areas with more detail would slow rendering down (even if only for half a second) to below 60 and make it look different. Whereas at an average of 80 FPS you have about 20 frames of leeway before it might dip below that noticeable 60-frame mark. I could be wrong, but it makes sense to me :)
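A quick back-of-the-envelope check supports this. Here's a small sketch (my own illustration, with made-up frame times) of how an average hides the dips you actually notice:

```python
# An "average" FPS figure can hide stutter: a few slow frames barely
# move the average but are exactly what you feel as a dip.

frame_times_ms = [12] * 90 + [40] * 10   # mostly fast frames, a few slow ones

avg_fps = 1000.0 / (sum(frame_times_ms) / len(frame_times_ms))
worst_fps = 1000.0 / max(frame_times_ms)

print(f"average:     {avg_fps:.0f} FPS")   # ~68 FPS, sounds perfectly smooth
print(f"worst frame: {worst_fps:.0f} FPS") # 25 FPS, the stutter you feel
```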
 

Darkserge

New Member
You won't see a difference if your FPS goes above 60.

30 FPS is good.
60 FPS is the best.
Way more than 60 FPS? Are you nuts?
 

Geoff

VIP Member
evil-xxx said:
Welcome, everyone, join the discussion lol.
I can't quite agree with you... have you ever played Quake 3 or Unreal Tournament? Even my sister can tell the difference between 70+ FPS and 200 FPS. In my opinion it's obvious, especially in fast-paced games.

The problem most likely isn't that there's a noticeable difference between 70 and 200 FPS, but rather that when you average 70 FPS, the framerate often dips into the 50s and 60s.

You should turn VSYNC on if you want to have a more stable framerate (if your computer can already handle 60+ FPS).
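If it helps, here's roughly what vsync does, as a small sketch (my own illustration of double-buffered vsync, not any particular driver's code): a finished frame waits for the next refresh tick before it's shown, so presentation snaps to the 60 Hz grid and pacing gets more regular. The flip side is that a frame that just misses a tick waits a whole extra 16.7 ms.

```python
import math

# With vsync on, a frame is displayed at the first 60 Hz refresh tick
# after it finishes rendering, so presentation times are quantized.

REFRESH_MS = 1000.0 / 60.0  # ~16.7 ms between refresh ticks

def present_time(render_done_ms):
    """Tick at which a frame finished at render_done_ms actually appears."""
    return math.ceil(render_done_ms / REFRESH_MS) * REFRESH_MS

for done in (10.0, 15.0, 20.0):
    print(f"rendered by {done:4.1f} ms -> displayed at {present_time(done):4.1f} ms")
# 10 ms and 15 ms frames both appear at ~16.7 ms; the 20 ms frame
# just misses the tick and waits until ~33.3 ms.
```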
 

Vizy

New Member
Wait, sorry, but are you guys saying that NO MATTER WHAT, Halo, the first one, will never display more than 30 FPS?
 