Has this been thought of?

Darren

I remember recently reading about how old tube TVs worked. They would draw every other line of the picture at a time, effectively running 2 separate videos at 15fps each, alternating. Combined together and with a slight delay, they would form a seamless picture.

Could this somehow be applied to video rendering? Since many monitors have a high resolution, you could do exactly that: every other frame would render the opposite set of lines. Wouldn't this allow you to run graphics at half the normal bandwidth? It might cause some weird graphical issue I'm not thinking of, and you'd have to make sure each line stayed lit while the opposite line was being rendered.
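A minimal sketch of the idea (illustrative names, not any real API): split each full frame into an even-line "field" and an odd-line "field", so each transmission carries only half the lines, then weave them back together for display.

```python
# Hypothetical sketch of interlaced field splitting.
# A "frame" here is just a list of rows (each row a list of pixels).

def split_fields(frame):
    # Even field keeps lines 0, 2, 4, ...; odd field keeps 1, 3, 5, ...
    # Each field carries half the lines, i.e. half the bandwidth per send.
    return frame[0::2], frame[1::2]

def weave_fields(even_field, odd_field):
    # Reassemble a full frame by interleaving the two fields.
    woven = []
    for even_line, odd_line in zip(even_field, odd_field):
        woven.append(even_line)
        woven.append(odd_line)
    return woven

# Toy 6-line frame: each row is just its line number repeated.
frame = [[y] * 4 for y in range(6)]
even, odd = split_fields(frame)
assert len(even) == len(odd) == 3        # each field is half the lines
assert weave_fields(even, odd) == frame  # weaving restores the frame
```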

Am I insane, thinking of an already-butchered idea, or what?
 
Well, this would require the same data to be processed at the same rate; the only difference, as I see it, would be the output signal.
 
Sure, you may still be rendering 30 fps if you look at it that way, but each frame would be a half-size picture.

OK, let's say 15 frames per second for each of 2 separate "videos," for lack of a better word. Each video has 50 lines of pixels, for a total of 100 lines.

Scenario 1
30 frames per second across 100 lines = 3000 lines to render in one second.

Scenario 2
15 frames per second * 50 lines = 750 lines per second.
You do that twice, so the screen still refreshes the image at 30 fps overall, but each half updates at only 15 fps. Still, 750 * 2 = 1500.

1500 is half of 3000.

So you render the same amount of video at the same apparent framerate for half the work.
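The arithmetic in the two scenarios above can be checked directly (units are lines rendered per second):

```python
# Scenario 1: full 100-line image at 30 full frames per second.
# Scenario 2: two interleaved 50-line fields, each at 15 fields per second.
lines = 100
full_frames = 30 * lines            # 30 fps * 100 lines = 3000 lines/sec
one_field = 15 * (lines // 2)       # 15 fps * 50 lines  = 750 lines/sec
interlaced = 2 * one_field          # both field streams = 1500 lines/sec

assert full_frames == 3000
assert interlaced == 1500
assert interlaced == full_frames // 2  # half the line throughput
```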
 
Although, when you instantly turn around in many games your framerate drops, because the renderer is drawing something it wasn't before. With this scheme, every frame contains lines it didn't render the last time, so I don't know, it just doesn't seem like it would work well. Plus, if there are people sensitive to microstutter, I think they'd be sensitive to this too.
 
I know this thread is really old but considering I'm OP I'm gonna necro it.

I'm sure there's something wrong with this idea but I can't think of what it is.
 
I think you're describing interlacing, which has the same effects as you predicted (such as 1080i using similar bandwidth to 720p).
 
In static images it would be fine. The issue is far more visible with fast motion, where the subject may have moved between refreshes. You end up getting tearing between the odd and even lines on the display.
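A toy illustration of that tearing (often called "combing"): weave an even field captured at time t with an odd field captured at time t+1 while an object moves. Alternating lines then disagree about where the object is. Everything here is illustrative, not a real rendering API.

```python
WIDTH = 8

def frame_at(x):
    # A 4-line frame with a vertical "object" at column x.
    return [['#' if c == x else '.' for c in range(WIDTH)] for _ in range(4)]

f0 = frame_at(2)  # object at column 2 at time t
f1 = frame_at(4)  # object has moved to column 4 at time t+1

# Weave: even lines from the old field, odd lines from the new one.
woven = [f0[i] if i % 2 == 0 else f1[i] for i in range(4)]
for row in woven:
    print(''.join(row))
# Alternating rows place the object at different columns: the comb artifact.
```

For a static image the two fields agree, so weaving is invisible; only motion between field captures produces the alternating-line offset.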

 