Seriously, I can get 90fps in old games running at native resolution on an iGPU, but the game looks like crap. It's the way the iGPU shades them and such.
So my question is: most modern machines can run UT '99 at 300fps no problem, but most of them only come with iGPUs. Is there a way to use the excess CPU power to pretty up the game a bit (i.e., emulate a discrete graphics card in software)?
Sure, I can play UT '99 on an old tower desktop with an ATI Radeon card and experience its full vibrant beauty, but that thing sucks up so much electricity. I can play the same game on my laptop, which draws about 1/8th the juice of the tower, but like I said earlier, it looks like crap on the laptop b/c of the iGPU...
EDIT: I've just checked, and my iGPU can't do fullscreen anti-aliasing or anisotropic filtering (AA/AF). All I'm really asking is: since my hardware doesn't support AA/AF, is it possible to do it via emulation? If so, how?
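To show what I mean, here's roughly the kind of thing I've been poking at in UnrealTournament.ini. This is just a sketch: it assumes the community-updated OpenGL renderer rather than the stock one, and I'm writing the setting names from memory, so check the renderer's readme for the exact ones.

    [Engine.Engine]
    GameRenderDevice=OpenGLDrv.OpenGLRenderDevice

    [OpenGLDrv.OpenGLRenderDevice]
    UseAA=True
    NumAASamples=4
    MaxAnisotropy=8

The catch is that settings like these still rely on the driver/iGPU actually exposing multisample AA and anisotropic filtering, which mine apparently doesn't. Hence the question about doing it with spare CPU power instead.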