Crysis 3, poor framerate on a good computer

JoeRAB

New Member
Hello

I got Crysis 3 the other day, and despite running it on the lowest settings, my FPS still drops pretty low.

MB: Gigabyte Z68X-UD3-B3
Processor: i5-2500k 3.3GHz quad core
RAM: 11GB
GPU: Nvidia GeForce GTX 660 Ti
Anything else?

All the drivers are up to date; I did a check yesterday.

Is there anyone else who has this problem? I've done some Googling but haven't found anything on it.

Any help would be good!

Thanks,
Joe
 
Wow. I have the same video card and am thinking of buying Crysis 3, so I'm interested in whether there are issues with this game. I hope it isn't another GTA IV.

There is no reason it should be running that badly.
 
Is motion blur off? I noticed motion blur would dip my frames down to 28-30 in places, so I turned it down to medium and I'm pretty steady at 50.
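For what it's worth, a drop from 50 to 30 FPS is a bigger rendering-cost jump than the raw numbers suggest, because FPS is the inverse of frame time. A quick sketch (the FPS values here are just illustrative, not measurements from the game):

```python
# Convert FPS to frame time (ms) to see what an FPS drop
# actually costs the GPU in per-frame rendering time.

def frame_time_ms(fps):
    """Milliseconds the GPU spends on each frame at a given FPS."""
    return 1000.0 / fps

# Dropping from 50 to 30 FPS means each frame takes ~13 ms longer...
cost_low = frame_time_ms(30) - frame_time_ms(50)    # ~13.3 ms
# ...while dropping from 120 to 100 FPS costs under 2 ms per frame.
cost_high = frame_time_ms(100) - frame_time_ms(120)  # ~1.7 ms

print(f"50->30 FPS: +{cost_low:.1f} ms/frame")
print(f"120->100 FPS: +{cost_high:.1f} ms/frame")
```

Same 20-FPS gap, very different workloads, which is why an effect like motion blur can look cheap on a fast machine and brutal on a struggling one.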

I don't think the 660 Ti is too far away from the 680 in terms of performance, so it should still be able to max the game with acceptable frame rates, I would think, minus AA.

What resolution are you playing at? Maxed out at 1080p, the game will use 2GB+ of VRAM, so if you have a card with 2GB or less, that could be the issue. It seems to be VRAM hungry at times.
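As a rough sanity check on where VRAM goes, you can estimate render-target sizes by hand. These are back-of-envelope assumptions (RGBA8 buffers, no compression), not measured Crysis 3 numbers:

```python
# Back-of-envelope size of a single render target in VRAM.
# Assumes a 4-byte RGBA8 pixel format; real engines use a mix of formats.

WIDTH, HEIGHT = 1920, 1080  # 1080p

def target_mib(width, height, bytes_per_pixel=4, samples=1):
    """Size of one render target in MiB; `samples` models MSAA."""
    return width * height * bytes_per_pixel * samples / (1024 ** 2)

plain = target_mib(WIDTH, HEIGHT)             # ~7.9 MiB
msaa8 = target_mib(WIDTH, HEIGHT, samples=8)  # ~63 MiB with 8x MSAA

print(f"1080p RGBA8 target: {plain:.1f} MiB")
print(f"Same target at 8x MSAA: {msaa8:.1f} MiB")
```

Even with MSAA multiplying the buffers, the targets themselves are tens of MiB; the bulk of that 2GB+ is textures, which is why the texture-quality setting is usually the first thing to drop on a 2GB card.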

It's certainly not another GTA IV; the game seems to be fine in terms of optimization.

Also, exactly what drivers are you using? I'm using 314.07, which I don't think is the beta Crysis 3 driver they recommend.
 
A single 660 at max settings will get owned. My system gets owned at 1080p maxed out.

Crysis 3 is very hard on systems. Driver optimisations will help, but this is going to be like the first game in the series: a hardware pwner.

Turn down your settings. Also, what exact PSU are you running?
 
Are you using Windows' driver updater? You have to go to NVIDIA's site and update from there; the latest are 314.xx. My 470 had issues with it, then I updated to the beta driver and it could run high settings online and medium in the campaign without AA or AF.
 
That is a very low resolution and framerate for that card. You should be getting close to 60 FPS at nearly, if not fully, maxed settings. And you did update from NVIDIA's site? Not the Windows updater crap?
 
I agree. I saw a video on YouTube of someone running a 660 Ti claiming to get 50 FPS on very high settings (which is max) at 1080p. I can't verify those were really the settings he was using, but his gameplay was smooth as ice in the video.

But in either case, the 660 Ti should not be getting bad frame rates at low settings at that resolution. Something is certainly wrong.
 
I got the driver straight from the NVIDIA site, and all the settings are off or as low as they can go. I just put a bigger SSD in for my OS, so I am reinstalling all the drivers and whatnot; hopefully I'll see a difference.
 
And? Turn 32x AA on and watch the FPS drop to the low teens on a 660. That's the point.

But it shouldn't define the maximum settings of a game, because it's not really a feature of the game engine. Depending on your monitor setup, 32x AA may not make any noticeable difference to the naked eye at 1080p, whereas turning particle effects (or something like that) on and off will. Very high is the maximum setting of the game; AA and AF are added extras delivered by the GPU, not the engine.
 
There are really only two things that need to be considered:

1. Does it improve image quality? Yes, therefore it is relevant to max settings. If you run with no AA but the highest texture and feature settings, you are not maxing the game, because it can look better with the settings provided. Only when it looks its best (i.e., with AA enabled) is it at max settings, max meaning MAXIMUM, highest, best. How can it be on max settings when it does not look the best?

2. Does enabling it decrease performance? Oh hellz yes. If you have AA turned all the way up and you are complaining about low FPS, turn it down and watch the FPS fly.

It doesn't matter how it is implemented (the engine does have to support it; you can't play Wolfenstein on a modern GPU, enable FXAA, and expect perfectly smooth edges :rolleyes:). The fact that it is supported, which not all games can claim, and makes the game look better means it should be included when considering the settings and the impact changing them will have on performance.
 
Yesterday I was messing with it, and FXAA has nearly no FPS hit but looks the same as 8x MSAA from what I saw, while 8x MSAA was unplayable. Not sure why I'm posting this, but in C3 it does a good job.
 
Yeah, it's well scaled. Pretty sure the original one was too. I haven't played 2.

But at in-game settings of max everything, it brings my machine to tears.

This game is pretty taxing.
 
It doesn't matter how it is implemented (the engine does have to support it; you can't play Wolfenstein on a modern GPU, enable FXAA, and expect perfectly smooth edges :rolleyes:). The fact that it is supported, which not all games can claim, and makes the game look better means it should be included when considering the settings and the impact changing them will have on performance.

Not entirely true. Some games which do not support AA can in fact have AA applied to them. For example, Madden 2004, which was released in 2003, does not have AA support; however, I can force 32x AA on it from the driver and make sure there is not a single jagged edge at its limited 800x600 resolution. I posted screenshots of this in the screenshot thread a while back. You can tell there isn't a single jagged edge, and at 800x600 that is quite a feat, I think.

GTA IV is another example of this. I still insist that even with the mods it's not true AA; however, it does use blur effects to make the edges appear smoother.
 