Emulate old GPU?

hirobo2

As many of you are aware, discrete graphics cards do more than just increase FPS; they make games look a lot better than something like Intel integrated graphics.

So my question is: is there a way to emulate a discrete graphics card for old games such as Unreal Tournament '99 running on an iGPU?
 
I don't really understand your question. Discrete GPUs can do things like AA and AF since they (at least modern ones) are a lot faster than iGPUs. You can enable the same settings on an integrated chipset, but your FPS will crawl. For older games you might get an acceptable frame rate; just adjust the video settings in the driver control panel and set anisotropic filtering and anti-aliasing to max.
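
If the iGPU's panel doesn't expose those options at all, you can still approximate what AA does by brute force: render or capture the frame at a higher resolution and filter it back down, which is essentially supersampling. Here's a minimal Python/Pillow sketch of the idea; the 4x-size screenshot and its filename are hypothetical placeholders, not anything UT actually produces:

# Minimal supersampling sketch: take a frame rendered at 4x the target
# resolution and downsample it. Requires Pillow (pip install Pillow).
# "ut99_frame_4x.png" is a hypothetical capture, not a real game file.
from PIL import Image

SCALE = 4  # assume the frame was rendered at 4x the display resolution

frame = Image.open("ut99_frame_4x.png")
target_size = (frame.width // SCALE, frame.height // SCALE)
smoothed = frame.resize(target_size, Image.LANCZOS)  # filtered downscale smooths jagged edges
smoothed.save("ut99_frame_ssaa.png")

Lanczos is used here because a plain nearest-neighbour downscale would keep the jaggies; the averaging during the downscale is what gives the anti-aliased look. The in-game equivalent is rendering above your display resolution and scaling down, which some drivers support but many iGPU drivers don't.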
 
Seriously, I can get 90 FPS for old games running at native resolution on the iGPU, but the game looks like crap. It's the way the iGPU shades them and such.

So my question is: most modern machines can run UT '99 at 300 FPS no problem, but most modern machines come with iGPUs. Is there a way to use the excess CPU power to pretty up the game a bit (discrete graphics card emulation)?

Sure, I can play UT '99 on an old tower desktop with an ATI Radeon card and experience its full vibrant beauty, but those things suck up so much electricity. I can play the same game on my laptop, which uses about 1/8th the juice of the tower desktop. But like I said earlier, the same game looks like crap on the laptop because of the iGPU...

EDIT: I've just checked my iGPU and it can't do fullscreen AA/AF. All I'm really asking is, since my hardware doesn't support AA/AF, is it possible to do it via emulation? How?
 
Honestly, man, I'm not really sure what you're getting at, other than trying to save battery life. I've never heard of emulating an older GPU. I'd say just enjoy running the game at 300 FPS and plug the laptop in.
 
Some of us want to play games the way they were meant to be seen on the highest-spec machine from back then, without the accompanying electricity bill. It costs about $300 to run a desktop 16 hrs/day for a year vs. about $70 for a laptop hooked to an external monitor...
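
For what it's worth, those dollar figures check out roughly if you assume something like a 200 W desktop, a 45 W laptop, and $0.25/kWh; all three numbers are assumptions, so swap in your own:

# Back-of-the-envelope yearly electricity cost.
# Wattages and the $/kWh rate are assumptions, not measured values.
HOURS_PER_DAY = 16
DAYS_PER_YEAR = 365
RATE_PER_KWH = 0.25  # assumed electricity price in $/kWh

def yearly_cost(watts):
    kwh = watts * HOURS_PER_DAY * DAYS_PER_YEAR / 1000
    return kwh * RATE_PER_KWH

print(f"desktop (~200 W): ${yearly_cost(200):.0f}/yr")  # about $290
print(f"laptop  (~45 W):  ${yearly_cost(45):.0f}/yr")   # about $65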
 
The reason old games look bad is not that you're playing them on an integrated GPU; it's that the games are old. The textures packaged with the game were low quality because of the machine specs and storage space of the time, and there was simply no need for life-like 4K textures in a 1990s game.

Playing an old game like UT even with an NVIDIA Titan X will not make the game look any better.
 
It's not the size of the textures. It's the way the graphics card handles lighting and colors. Games do look more vibrant on a discrete graphics card than on an iGPU.

The way I'd describe it is: guns look like paper cutout models with the appropriate textures on an iGPU, but they look like actual weighty pieces of metal you could hold in your hand on a discrete card.

iGPUs are all about playing at 1080p (or, these days, 4K) at the lowest cost. ATI/NVIDIA et al. are all about image quality, not just resolution...
 
You have this all wrong. That has nothing to do with the video card being integrated or dedicated; it's all about the performance of the GPU. A current integrated GPU has much better graphics than a dedicated video card from several years ago.
 