Official 3DMark06 Rank Thread

I have a 900-watt PSU and I do have overclocking utilities for my RAM and video card, as well as my processor. I just don't see it making my computer better for anything other than benchmarking. I mean, I already pull 60+ FPS in Fallout 3, which is more than enough for me.

Well, with things like rendering and Photoshop it helps, so I can see a difference there. But, like you said, if you're already pulling a good FPS, there isn't much point. I still have C1E (I think that's what it's called) enabled, so the CPU downclocks when it's not being used.
 
Well, SLI is just straight data throughput; it's parallel processing. If it were distributed processing, I could see it being way better. I think multi-core GPUs will replace SLI very soon.
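
A rough, purely conceptual sketch of that point, in Python (this is not real driver code, just an illustration of how SLI's default alternate-frame rendering deals whole frames out to the cards rather than splitting the work at a finer grain):

# Conceptual sketch only - not actual driver code.
# With alternate-frame rendering (AFR), each GPU still renders a complete
# frame on its own; the frames are simply dealt out in turn, which is
# parallel processing rather than distributed processing.
def assign_gpu(frame_number, gpu_count=2):
    return frame_number % gpu_count

for frame in range(6):
    print(f"frame {frame} -> GPU {assign_gpu(frame)}")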

Even then, your average Photoshop usage will not take advantage of your overclock. I think it's a niche market that only appeals to hobbyists who like the horsepower, like benchmarks, and maybe like pushing their systems to the limit.

Otherwise, overclocking offers no real-world benefit for the average user or average professional. I occasionally use Photoshop, Illustrator and InDesign, but none of my work is that complicated, and nothing I do would benefit from it.

Also, overclocking makes a system unstable, which is why you see workstation cards and Xeon processors in real workhorse workstations. A Quadro card is terrible for gaming but awesome for Maya or 3ds Max, because it's designed to be slower but stable and to crunch the numbers, whereas a gaming card is meant to render polygons and calculate physics on the fly. Different animals, different user-oriented goals.
 
Can't you force higher AF and AA? I mean, in my drivers I could turn things up, and it made the picture in IL-2 1946 really pretty, but I only got between 10 and 30 FPS (with nothing happening except flying, depending on which way I pointed the camera). Note: that's a game from 2006, using an engine from IL-2 Forgotten Battles (which is from 2004), and that was with two 3870s already.

So surely you can make the game look better so you wouldn't get the 60+ FPS anymore. (Of course, whether you'd want to is a different matter.)
 
Settings are maxed and I get 60 FPS. I think I am running it at 1440x900, with 8x AA, high-quality textures, motion blur enabled, and all kinds of particle effects.

It will drop to around 40 FPS at certain parts of the game, but for the most part it stays steady at 60.
 
What level of AF? Try forcing 16x AA and 16x AF in the drivers. You'll be amazed what that does for the picture quality. :) (It makes it quite heavy to run though, sadly.)
 
I am not sure I have tried that yet. I will try it when I have more time, and once I back up some data, since I don't trust Windows.
 
Well, obviously SLI won't see much of an increase in performance for 3ds Max or Photoshop, but it'll help a little. Also, the higher CPU clocks will help me in 3ds Max: something that'd take 24 hours at university would take me 10 hours at home. As I'm more into gaming than rendering, I didn't buy a Quadro or a small render farm, lol.

Like Arch said, too, you can make things look better and still get a high FPS. For example, when I play Brothers in Arms: Hell's Highway, I force 16x AF and 8x AA and still get 60-80 FPS.
 
Scores updated.

Again, if you do not show me a screenshot which displays the resolution and what hardware you are running, I will not include it in the list. I am trying to keep this as official as I can.


Hey omega, can you add me to the list?
When I click on the link I just go to the Orb homepage. Try posting a screenshot showing me the resolution and the specs.
 
I'm pretty sure everyone uses the same resolution for the test. Anyway, mine is default.

[Attached screenshot: 3dmark06.jpg]
 
Again, I need the resolution.

The reason I need it is that people with widescreen monitors often run the test at a lower resolution, since not all of them support 1280x1024, and that makes a difference of a few hundred to a few thousand points.
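
As a rough illustration (the resolutions and numbers below are just an example, not from anyone's actual run), the default 1280x1024 test pushes noticeably more pixels per frame than a common widescreen fallback such as 1280x800, which is why the scores aren't directly comparable:

# Hypothetical comparison - 1280x800 is only an example widescreen fallback.
default_px = 1280 * 1024        # 3DMark06 default resolution
widescreen_px = 1280 * 800      # example lower widescreen resolution
saving = (1 - widescreen_px / default_px) * 100
print(f"~{saving:.0f}% fewer pixels per frame at 1280x800")   # ~22%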
 