Dell vs. iMac

tlarkin

VIP Member
^ Not arguing with that, but a 4850 is never going to be able to game at such a high resolution, and lower ones will look crappy on such a large screen...

A 4850 can run games on medium settings just fine. I have an x1900 in one of my laptops and it runs COD4 and even MW2 without trouble. On medium settings with some shadows turned off, it gets over 35 FPS, which is easily playable.

I am not saying the iMac is the best computer to buy for gaming, but to say it cannot game is outright incorrect.
 

JTM

New Member
If the OP is using the PC for video/photo editing, I would go with the iMac. The LED screen will make things more enjoyable. And the 4850 can play most games out there nicely.
 

Drenlin

Active Member
A 4850 can run games on medium settings just fine. I have an x1900 in one of my laptops and it runs COD4 and even MW2 without trouble. On medium settings with some shadows turned off, it gets over 35 FPS, which is easily playable.

I am not saying the iMac is the best computer to buy for gaming, but to say it cannot game is outright incorrect.

It can play them for sure, but at what resolution? That monitor's native res is huge.
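For perspective, a quick back-of-the-envelope pixel count (a minimal sketch, assuming the 27" iMac's 2560x1440 panel, since that's the model sold with the 4850):

```python
# Rough pixel-count comparison: how many pixels the GPU has to push
# at the iMac's native res vs. more typical gaming resolutions.
resolutions = {
    "27in iMac (2560x1440)": (2560, 1440),
    "1080p     (1920x1080)": (1920, 1080),
    "22in      (1680x1050)": (1680, 1050),
}

native = 2560 * 1440
for name, (w, h) in resolutions.items():
    px = w * h
    print(f"{name}: {px / 1e6:.2f} MP ({px / native:.0%} of iMac native)")
```

That's roughly twice the pixels of a 22" 1680x1050 screen, so the same card has about half the per-frame headroom at native res.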
 

diduknowthat

formerly liuliuboy
A 4850 can run games on medium settings just fine. I have an x1900 in one of my laptops and it runs COD4 and even MW2 without trouble. On medium settings with some shadows turned off, it gets over 35 FPS, which is easily playable.

I am not saying the iMac is the best computer to buy for gaming, but to say it cannot game is outright incorrect.

It can, but it's not that good at it. The 4850 is going to have a hard time driving a monitor of such high resolution, especially if he's going to run demanding games.
 

tlarkin

VIP Member
It can play them for sure, but at what resolution? That monitor's native res is huge.

The resolution is not really the issue as much as how high the textures are set to render, the shading, the lighting effects (like shadows), and so forth. The Xbox 360 has what video card in it again? It runs 1080p, and so does the PS3, and neither has a current-generation video card in it.

The truth is, most of the video game market is marketing and hype. Only if you want to be cutting edge do you need the latest and greatest. I have a GTX 260, and when I bought it, it was a step down from the then-current high end. Two years old, it still plays all games maxed out on my 22" @ 1680 x 1050, so not quite 1080p but close. If I had a bigger monitor it would drive it with no problems.

There is really no valid reason to be getting over 100 frames per second in a video game other than bragging rights.
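To put rough numbers behind the framerate argument (plain frame-time arithmetic, nothing vendor-specific):

```python
# Frame time is what you actually feel; each extra FPS buys less.
for fps in (30, 60, 75, 100, 150):
    print(f"{fps:>3} FPS -> {1000 / fps:5.1f} ms per frame")

# Going 30 -> 60 FPS shaves 16.7 ms off every frame;
# going 100 -> 150 FPS shaves only 3.3 ms.
```

The gain per frame shrinks fast, which is the diminishing-returns point being made here.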
 

bomberboysk

Active Member
The resolution is not really the issue as much as how high the textures are set to render, the shading, the lighting effects (like shadows), and so forth. The Xbox 360 has what video card in it again? It runs 1080p, and so does the PS3, and neither has a current-generation video card in it.

The truth is, most of the video game market is marketing and hype. Only if you want to be cutting edge do you need the latest and greatest. I have a GTX 260, and when I bought it, it was a step down from the then-current high end. Two years old, it still plays all games maxed out on my 22" @ 1680 x 1050, so not quite 1080p but close. If I had a bigger monitor it would drive it with no problems.

There is really no valid reason to be getting over 100 frames per second in a video game other than bragging rights.

False, although that really does depend on the person playing. The higher the frame rate, the more often the game responds to your input, and the whole game feels more fluid and lifelike. However, it's not something most people will notice... I would love to see a double-blind test on frame rates...

As for Xbox 360/PS3 games being 1080p... let's remember that many "modern" games on the 360 and PS3 are not even 1080p. A good example is Modern Warfare 2, which renders at only 600p natively; anything above that is upscaled due to the graphical limitations of the systems. Also, consoles, unlike PCs, are one fixed set of hardware with no performance deviation between units... so the software can be much more heavily optimized for that specific hardware.
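To put a number on the upscaling point (a small sketch, taking MW2's widely reported 1024x600 render target):

```python
# Pixels the console actually renders vs. what a 1080p TV displays.
rendered  = 1024 * 600   # MW2's reported native render resolution
displayed = 1920 * 1080  # the 1080p output after upscaling

print(f"rendered:  {rendered:,} px")
print(f"displayed: {displayed:,} px")
print(f"the console draws ~{rendered / displayed:.0%} of the pixels it outputs")
```

So the console is drawing roughly 30% of the pixels a true 1080p render would require.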
 

tlarkin

VIP Member
False, although that really does depend on the person playing. The higher the frame rate, the more often the game responds to your input, and the whole game feels more fluid and lifelike. However, it's not something most people will notice... I would love to see a double-blind test on frame rates...

As for Xbox 360/PS3 games being 1080p... let's remember that many "modern" games on the 360 and PS3 are not even 1080p. A good example is Modern Warfare 2, which renders at only 600p natively; anything above that is upscaled due to the graphical limitations of the systems. Also, consoles, unlike PCs, are one fixed set of hardware with no performance deviation between units... so the software can be much more heavily optimized for that specific hardware.

Citation needed that humans can tell the difference between 70 FPS, 100 FPS, and 150 FPS.

The human eye and brain can only actually process about 25 to 30 frames per second in real life. There are exceptions, like Babe Ruth: a study found his reflexes to be more than double those of the average person, which probably made him such a great hitter in baseball. But not everyone has the reflexes of the Great Bambino.

However, the average person is not going to notice or take advantage of the difference. I also said you don't need it; are you saying you do?

I was putting it at a level of necessity, if you go back and reread what I posted.

As for your last comment about OS overhead, I could use that to argue that a Mac with lesser hardware could perform on par with a PC. However, I'm not going to go there.
 

bomberboysk

Active Member
Citation needed that humans can tell the difference between 70 FPS, 100 FPS, and 150 FPS.

The human eye and brain can only actually process about 25 to 30 frames per second in real life. There are exceptions, like Babe Ruth: a study found his reflexes to be more than double those of the average person, which probably made him such a great hitter in baseball. But not everyone has the reflexes of the Great Bambino.

However, the average person is not going to notice or take advantage of the difference. I also said you don't need it; are you saying you do?

I was putting it at a level of necessity, if you go back and reread what I posted.

As for your last comment about OS overhead, I could use that to argue that a Mac with lesser hardware could perform on par with a PC. However, I'm not going to go there.
There really is no citation needed... it's easily explained by the fact that more inputs and outputs every second result in a more lifelike representation of movement (i.e., less input lag)... I highly doubt there have been many studies comparing video game fluidity at 30 fps, 60 fps, 100 fps, and so on.

And although there may be less overhead on OS X... you miss out on many Windows features aimed at graphics, such as DirectX (there is OpenGL, arguably, but it is not as well optimized when it comes to games, IMO).
 

tlarkin

VIP Member
You can definitely tell the difference in FPS up to a point. But I am certain that if I gave you all the Pepsi challenge and sat you down in front of one machine running a game at 75 fps and another at 100 fps, you wouldn't really be able to tell the difference. Past a certain point you cannot process any more.

Also, not everyone wants the most cutting edge. The highest-end video cards are damn expensive, and developers do not code games that require that hardware.

As for OS X, it fully supports OpenGL and has a set of APIs, called Core Animation, that developers can use to access OpenGL within the OS. DirectX isn't the only game in town, but it is far more widely used than anything else when it comes to gaming.
 

bomberboysk

Active Member
You can definitely tell the difference in FPS up to a point. But I am certain that if I gave you all the Pepsi challenge and sat you down in front of one machine running a game at 75 fps and another at 100 fps, you wouldn't really be able to tell the difference. Past a certain point you cannot process any more.

Also, not everyone wants the most cutting edge. The highest-end video cards are damn expensive, and developers do not code games that require that hardware.

As for OS X, it fully supports OpenGL and has a set of APIs, called Core Animation, that developers can use to access OpenGL within the OS. DirectX isn't the only game in town, but it is far more widely used than anything else when it comes to gaming.
Crysis, Far Cry 2, Flight Sim X (for anywhere near high settings, anyway), Battlefield 2 (in its day), and Doom 3 (in its day) are all examples of games coded for platforms far more powerful than what the average person would have. I do agree you have an entirely valid point, however, as those games are only a small sector of the market.

As for DirectX, that's sort of what I was saying, but not quite. Because DirectX is more prevalent, many of OpenGL's features are not fully utilized, which hurts gaming performance on OS X; but it's pretty much a non-issue when running Windows on an Apple machine.

And again, like I said, visually... you won't notice anything above 60 fps on most displays anyway, as most TN LCD panels are 60 Hz. However, even when the display has a 60 Hz refresh rate, you will still end up with noticeably more fluid gameplay... not visually, but in the actual "feel".
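A toy model of that "feel" difference (a deliberate simplification: it assumes input is sampled once per rendered frame and ignores vsync buffering):

```python
# Even on a 60 Hz panel, a higher internal frame rate means the
# displayed frame was built from fresher input.
REFRESH_MS = 1000 / 60  # fixed 60 Hz display

for fps in (30, 60, 100, 150):
    render_ms = 1000 / fps
    # Worst case: input lands just after a sample, then waits one
    # render interval plus up to one refresh before reaching the screen.
    worst_case = render_ms + REFRESH_MS
    print(f"{fps:>3} FPS internal -> input up to ~{worst_case:.0f} ms old on screen")
```

Rendering above the refresh rate still trims input latency, just with diminishing returns.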
 

speedyink

VIP Member
The human eye and brain can only actually process about 25 to 30 frames per second in real life.

That sounds pretty low... I'm pretty sure I can easily tell the difference between a 30 fps game and a 60 fps game. In fact, it's around the late 40s to early 50s where it gets hard to tell. For me, at least.

Either way, Macs lack newer video hardware, and you'll never see a hardcore gaming Mac.
 

tlarkin

VIP Member
That sounds pretty low... I'm pretty sure I can easily tell the difference between a 30 fps game and a 60 fps game. In fact, it's around the late 40s to early 50s where it gets hard to tell. For me, at least.

Either way, Macs lack newer video hardware, and you'll never see a hardcore gaming Mac.

I am not disagreeing with your statements; I am saying that hardcore gaming is, well, mostly marketing, without a lot of facts to back it up. It's marketed to enthusiasts and the like who will drop lots of $$$ to get the so-called higher-end performance.

I don't game on my Mac, but I have Steam on it, and it runs all the games it can play just fine. My MacBook Pro is almost 3 years old.
 

speedyink

VIP Member
I am not disagreeing with your statements; I am saying that hardcore gaming is, well, mostly marketing, without a lot of facts to back it up. It's marketed to enthusiasts and the like who will drop lots of $$$ to get the so-called higher-end performance.

I don't game on my Mac, but I have Steam on it, and it runs all the games it can play just fine. My MacBook Pro is almost 3 years old.

Sure, it can be with some people, but that doesn't make the 4850 any better. I spent less than $1000 on my gaming rig, now 2 or so years old, and it's about on par with what the new iMacs can do gaming-wise.

Running Steam ain't an accomplishment... I played those games back in my Radeon X300 days :p
 