FPS

haha I didn't until I looked it up on that encyclopedia I put a link to. So much info on that thing it's scary! Everything you could want to know about anything! Definitely a good thing to bookmark! Another good site like that is howstuffworks.com. Amazing sites, I learn so much every time I log on to them!
 
Yes, the human eye on average can only see ~60 FPS, but I still see flicker when I run anything less than 85 Hz. TVs and computer monitors, although they use similar technology, work differently, and their refresh rates shouldn't be directly compared. The higher the FPS a given game runs at, the smoother it will look; some games look smooth at low FPS while others would be unplayable.
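To put some numbers on the distinction being made here, a quick sketch: FPS sets how often the game produces a frame, while Hz sets how often the monitor redraws, and the two intervals are independent (the monitor redraws at its own rate even when there's no new frame).

```python
# Frame time from FPS vs. refresh interval from Hz -- two separate clocks.

def frame_time_ms(fps):
    """Milliseconds between frames the game produces."""
    return 1000.0 / fps

def refresh_interval_ms(hz):
    """Milliseconds between monitor redraws, regardless of the game."""
    return 1000.0 / hz

print(frame_time_ms(60))        # ~16.7 ms per game frame at 60 FPS
print(refresh_interval_ms(85))  # ~11.8 ms per redraw at 85 Hz
```

So at 60 FPS and 85 Hz, some monitor redraws simply show the previous frame again; the flicker question is about the 85 Hz clock, not the 60 FPS one.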
 
According to what I read, the average person can't tell the difference in FPS for anything above 40. And don't say "well, my eyes must be better, because I can tell the difference between 100 and 120," because, I'm sorry, but you can't. More like maybe 5 FPS plus or minus.

The main reasons for trying to achieve higher FPS are:

1. People think it matters.
2. Future games will drop it lower, so if you can run double now, it'll be longer before you NEED to upgrade.
3. Just to say you can :cool:
 
Most people see flicker at less than 75-85 Hz, and it has nothing to do with how good your eyes are. Notice I'm not saying anything about FPS: I don't need 85 FPS in, say, Doom 3 to avoid seeing flicker, I need 85 Hz on the monitor, even if there's not a new frame to render.
 
TVs and computer monitors, although they use similar technology, work differently, and their refresh rates shouldn't be directly compared.
Yup, that's true Cromwell, but I just pasted a little bit of what that encyclopedia had to say. Above, it was talking about the refresh rates for TVs, and the 60 Hz is after conversion.

don't say well, my eyes must be better, because I can tell the difference between 100 and 120.
Yup! lol, and if you only have a 75 Hz monitor like most people, it's not going to matter how many FPS you achieve above 75, lol!
 
Also, when FPS fluctuates in FPS games, the sensitivity changes, because the sensitivity is based on pixels. You might notice your sensitivity changing when FPS drops (at least you do if you play competitively).
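One way this effect can happen (a hedged sketch, not any specific game's actual code): if a hypothetical engine reads the mouse once per frame and applies an acceleration curve to the *per-frame* delta, then at low FPS each frame sees a bigger delta, the acceleration term grows, and the same hand motion turns the view by a different amount.

```python
# Hypothetical per-frame mouse handling -- illustrates why effective
# sensitivity can depend on frame rate when acceleration is applied to
# the delta accumulated per frame. Numbers are made up for illustration.

def degrees_per_second(counts_per_sec, sens, accel, fps):
    delta = counts_per_sec / fps                     # counts seen each frame
    per_frame = delta * sens * (1 + accel * delta)   # accel on per-frame delta
    return per_frame * fps                           # total turn rate

lo = degrees_per_second(1000, 0.02, 0.01, 10)    # FPS tanked (e.g. a smoke)
hi = degrees_per_second(1000, 0.02, 0.01, 100)   # normal FPS
print(lo, hi)  # same hand motion, different turn speed
```

Without the acceleration term (`accel = 0`) the `fps` cancels out and the turn rate is frame-rate independent, which is why engines that scale input by frame time don't show this problem.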
 
True... the FPS changes in a game as you look in a different direction or move around; it depends on what the graphics card needs to "calculate".
The refresh rate of a monitor, however, stays constant. :)
 
Yes, with my Go 6800, if I play Far Cry on ultra high with low AA, it works perfectly, but if I have it set to medium settings to save power and switch to medium AA, it's playable. Still, I'd rather enjoy the gameplay with a little bit more jaggies than play it slower with everything really smooth.
 
Personally, I don't think that spending $600 on a new graphics card just to go from 50 FPS to 70 FPS in Doom 3 is even worth it. My Radeon X700 runs Doom 3 on high graphics and the FPS never drops below 30, which is absolutely fine for me.
 
Most new CRT monitors will go up to a 120+ Hz refresh rate using nForce or some other refresh-rate unlock program.

Gamers need very good video cards to stay at a constant 100 FPS. If your card can only display 100 FPS while you're standing by a wall, one smoke (in CS 1.6) will drop the FPS to 10. Real professional gamers (e.g. Team 3D or compLexity) play with hundreds of thousands of dollars on the line. If the FPS fluctuates, the sensitivity fluctuates, and their one-shot headshots are no longer viable. :)
 