How to run Crysis at Very High

Archangel

VIP Member
AA is going to reduce your FPS majorly, so don't use it; it basically softens the edges of things to make them look better.

Right, and to be precise: AA (anti-aliasing) calculates extra samples along edges to make a 3D object appear smoother, while AF (anisotropic filtering) is a means of filtering textures, to reduce the weird effect you can get when moving around.

stuff like this:
(attached image: Moire_pattern_of_bricks_small.jpg, a moiré interference pattern on brickwork)

(That effect is called aliasing; it happens when you downsample an image, which a 3D application does constantly as your viewing angle changes. :) With an increased sample rate of the original picture, the card can reconstruct the scaled-down picture much more accurately, though that takes a lot more resources. ^^)
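If you want to see what that "increased sample rate" idea looks like in practice, here is a minimal brute-force supersampling sketch (Python/NumPy; the toy edge renderer is purely an illustration, and real cards use smarter multisample tricks rather than this naive approach):

```python
import numpy as np

def supersample_aa(render, width, height, factor=4):
    """Brute-force supersampling AA: render at factor x the target
    resolution, then average each factor x factor block of samples
    down to one output pixel. More samples per pixel means smoother
    edges, but the rendering cost grows with factor**2."""
    hi = render(width * factor, height * factor)         # oversampled frame
    hi = hi.reshape(height, factor, width, factor, -1)   # group sample blocks
    return hi.mean(axis=(1, 3))                          # box-filter down

# Toy "renderer": a hard black/white diagonal edge.
def render(w, h):
    y, x = np.mgrid[0:h, 0:w]
    return (x > y).astype(float)[..., None]

# Edge pixels come out as shades between 0 and 1 instead of a hard step.
print(supersample_aa(render, 8, 8)[3].ravel())
```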
 

zaroba

Member
The computer with the best graphics setup in the world cannot play it at a respectable FPS when maxed out (1920x1080, 4xAA).
http://www.bit-tech.net/hardware/2007/12/13/nvidia_3-way_sli_on_nforce_680i_preview/5

Lol, hardly the best. With High settings, my PC can get 20-30 FPS.
I really hope you don't believe that having three 512MB Ultras will only provide a 10-20 FPS increase over a single 640MB GTS.

Most of the time, the people who do these benchmarks are brainless when it comes to comparing things and testing what's needed (look at the Vista vs. XP benchmark they did after completely ignoring the system requirements). I'll bet that PC only has 1GB of RAM in it, maybe 2GB, and that's why their FPS is so low.
 

Froboy7391_99

New Member
Lol, hardly the best. With High settings, my PC can get 20-30 FPS.
I really hope you don't believe that having three 512MB Ultras will only provide a 10-20 FPS increase over a single 640MB GTS.

Most of the time, the people who do these benchmarks are brainless when it comes to comparing things and testing what's needed (look at the Vista vs. XP benchmark they did after completely ignoring the system requirements). I'll bet that PC only has 1GB of RAM in it, maybe 2GB, and that's why their FPS is so low.

But you're not running at 1920x1280, are you? They had an OCed Q6600 and 2GB of RAM.
 

Iluvpenguins

New Member
Lol, hardly the best. With High settings, my PC can get 20-30 FPS.
I really hope you don't believe that having three 512MB Ultras will only provide a 10-20 FPS increase over a single 640MB GTS.

Most of the time, the people who do these benchmarks are brainless when it comes to comparing things and testing what's needed (look at the Vista vs. XP benchmark they did after completely ignoring the system requirements). I'll bet that PC only has 1GB of RAM in it, maybe 2GB, and that's why their FPS is so low.

Is that a joke? There is no way in hell you're running this game at the resolution, with the AA and all that good stuff, that they ran at on a 640MB 8800GTS. You might have the settings on High, but you probably don't even have AA turned on, and you're probably playing at 1024x768.
 

zaroba

Member
Is that a joke? There is no way in hell you're running this game at the resolution, with the AA and all that good stuff, that they ran at on a 640MB 8800GTS. You might have the settings on High, but you probably don't even have AA turned on, and you're probably playing at 1024x768.

The only difference is that I can only go up to 1280x1024. Personally, I've never seen an FPS difference from resolution changes on this PC. I tested it and found that Crysis at 800x600 gets the same FPS as at 1280x1024. I'd go in-game and take some screenshots as proof, but my power supply died last night, so I have to wait until the new one arrives from Newegg.



But you're not running at 1920x1280, are you? They had an OCed Q6600 and 2GB of RAM.
They only had 2GB of RAM in that PC? Lol, that is exactly what I'm talking about. They might have had double the FPS if they had maxed out the RAM in the PC.
 

Archangel

VIP Member
2GB of RAM is actually enough. When playing Crysis (with 2GB), the usage, at least for me, never went over 80% of the RAM.

Anyways, there's a slight difference in the workload for the graphics cards here ;)
1280x1024 = 1,310,720 pixels needing to be calculated.
1920x1280 = 2,457,600 pixels, so the load on the graphics cards is nearly twice as high on raw pixels alone :) (and more than that in the tests that also enable full-screen AA and AF)
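For anyone who wants to check the arithmetic, here it is as a quick Python loop (the resolutions are the ones thrown around in this thread):

```python
# Raw pixels per frame, relative to 1280x1024.
base = 1280 * 1024  # 1,310,720
for w, h in [(800, 600), (1280, 1024), (1600, 1200), (1920, 1280)]:
    px = w * h
    print(f"{w}x{h}: {px:>9,} pixels = {px / base:.2f}x the 1280x1024 load")
```

That comes out to 1.88x for 1920x1280 on raw fill alone, before AA multiplies the sample count further.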
 

Dollar

New Member
I am using the following rig, and I am playing at 1024x768 with everything on Medium; the FPS is 20 to 30. It is very, very laggy, sigh. And it also freezes sometimes in the middle of the game.
 

Iluvpenguins

New Member
If you turned up AA, or you are using your own settings via the Nvidia control panel, that's why. You need to force vsync off, and you need to make sure that AA is application-controlled or turned off.

And as Archangel said, 2GB of RAM is enough for the game; 4GB of RAM wouldn't have doubled the FPS, and would barely have changed it at all. If there were any change, it'd most likely be a 1-2 FPS difference... because the game isn't going to utilize more than 2GB anyway.
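Rather than arguing about it, you can just measure what the game actually uses while it runs. A sketch with Python's psutil (the process name "Crysis.exe" is my assumption; check the real name in Task Manager):

```python
import psutil  # third-party: pip install psutil

# Print the resident memory of the game's process.
# "Crysis.exe" is an assumed process name; verify it in Task Manager.
for p in psutil.process_iter(["name", "memory_info"]):
    if p.info["name"] == "Crysis.exe":
        rss_gb = p.info["memory_info"].rss / 2**30
        print(f"Crysis working set: {rss_gb:.2f} GB")
```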
 

zaroba

Member
Just because it won't use 100% of the RAM doesn't mean it won't make a difference.
In every instance I have seen, it actually HAS made a difference.

Hmm... Oblivion: 1GB + XP = maybe 10 FPS with maxed settings; Vista + 3GB = around 20 FPS with maxed settings; Vista + 5GB = 30+ FPS with maxed settings, and modded to remove LODs, add higher-definition models, etc. Besides the FPS, there were also drastic improvements in load/save times and map transition times.
Everything else in the PC was exactly the same.


Anyways, there's a slight difference in the workload for the graphics cards here ;)
1280x1024 = 1,310,720 pixels needing to be calculated.
1920x1280 = 2,457,600 pixels, so the load on the graphics cards is nearly twice as high on raw pixels alone :)

Yeah, just like the load between 800x600 and 1280x1024 is drastically different, yet there was no effect on FPS.
RAM really does help a graphics card work better.
 

Motoxrdude

Active Member
Just because it won't use 100% of the RAM doesn't mean it won't make a difference.
In every instance I have seen, it actually HAS made a difference.

Hmm... Oblivion: 1GB + XP = OK; Vista + 3GB = better; Vista + 5GB = better still.
Everything else in the PC was exactly the same.
Where's your proof?
 

zaroba

Member
Erm... because I was playing Oblivion on those setups.

Like I said, every time I have seen it, it actually did make a difference. If you're not even going to bother trying it, then don't sit there and act like I'm lying just because you're too stubborn to upgrade.
 

zaroba

Member
That's a shame.

What else was in your computer at the time? Maybe something else was limiting it.
When I did it, the other main parts in my PC were an IDE hard drive, a 640MB 8800GTS, and an E6300 @ 1.86GHz.
 

Motoxrdude

Active Member
Keep in mind, this was when Oblivion was first released. I had an X2 3800+, an X800 GTO, and 1GB of dual-channel RAM (then 2GB dual-channel).
 

zaroba

Member
Ahh, it could have to do with the newer CPU/GPU. Maybe they were able to push the game a lot farther.


The last gaming PC I had also displayed a RAM-related FPS increase.

Long, long, long ago (around 2003): 2.4GHz P4, Win2k, a PCI 64MB GeForce2 MX, and an Asus P4C800 Deluxe mobo.
Going from 1GB of RAM to 4GB of RAM resulted in an FPS jump of as much as 200% in some games (seriously, 20 -> 60 FPS). I then replaced the GeForce2 with an 8x AGP 256MB GeForce FX 5600, and FPS in games changed by not even 10%. I was so ungodly pissed off that the card I had just spent some $200 on barely did squat in my games. I was considering returning it, but I needed its Pixel Shader 2.0 support for a game I had recently gotten.
 

Motoxrdude

Active Member
Ahh, it could have to do with the newer CPU/GPU. Maybe they were able to push the game a lot farther.


The last gaming PC I had also displayed a RAM-related FPS increase.

Long, long, long ago (around 2003): 2.4GHz P4, Win2k, a PCI 64MB GeForce2 MX, and an Asus P4C800 Deluxe mobo.
Going from 1GB of RAM to 4GB of RAM resulted in an FPS jump of as much as 200% in some games (seriously, 20 -> 60 FPS). I then replaced the GeForce2 with an 8x AGP 256MB GeForce FX 5600, and FPS in games changed by not even 10%. I was so ungodly pissed off that the card I had just spent some $200 on barely did squat in my games. I was considering returning it, but I needed its Pixel Shader 2.0 support for a game I had recently gotten.

Sorry, but I seriously doubt that you had 4GB of RAM in 2003. For one, Windows XP 64-bit wasn't released until 2005, so a 32-bit install could only have used around 3GB of it anyway (devices get mapped into the top of the 4GB address space), and no desktop motherboard supported 4GB of RAM in 2003.
 

Archangel

VIP Member
Yeah, just like the load between 800x600 and 1280x1024 is drastically different, yet there was no effect on FPS.
RAM really does help a graphics card work better.


My guess would be that the FPS gets limited at some point. What's the point of going over 70 FPS, for example? I mean, great, the graphics cards would run at full load all the time, even on low settings. Good for heating up your room, I guess.

I know what you mean, don't get me wrong there. But when I set it to 800x600 or 1280x1024, I don't notice a difference either, whereas when I turn it up to 1600x1200, the FPS drops really low. So yes, there definitely is a difference between them.
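That pattern is exactly what you would expect if something other than the graphics card (usually the CPU) sets the frame time at lower resolutions. A toy bottleneck model makes the flip visible; the timings here are made up purely for illustration:

```python
def fps(cpu_ms_per_frame, gpu_ms_per_mpix, width, height):
    """Two-stage pipeline model: the slower of the CPU stage and the
    GPU stage determines the frame time."""
    gpu_ms = gpu_ms_per_mpix * (width * height / 1e6)
    return 1000.0 / max(cpu_ms_per_frame, gpu_ms)

# Invented costs: 30 ms of CPU work per frame, 22 ms of GPU work per
# megapixel drawn. The CPU caps the FPS until the GPU overtakes it.
for w, h in [(800, 600), (1280, 1024), (1600, 1200)]:
    print(f"{w}x{h}: {fps(30, 22, w, h):.0f} fps")  # 33, 33, then 24
```

Until the GPU stage overtakes the CPU stage, raising the resolution costs nothing; after that, every extra pixel shows up in the frame rate.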
 