How much of my GPU's resources does cloning use?

WeApOn

New Member
I have 4 monitors connected to my PC currently and one of them is a TV. I was thinking of using that monitor to clone my primary monitor, but was worried about performance.

Does cloning simply mirror the image and only render it once, or does it require my GPU to render the image twice for the two screens?

I doubt it is as inefficient as it may sound, but I would love to get an idea of just how many resources it takes from my GPU -- I don't want it to detrimentally affect my video card when I am doing something intensive.
 
I wouldn't worry about it. I have run dual screens on a laptop with onboard graphics of something like 64 MB and it ran super fast.
 
Any other opinions? Rendering video at 1920x1080 on two screens sounds like it would at least have some kind of effect.

I suppose otherwise I could benchmark with and without cloning and report back, assuming nobody here has tested this before.
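
If it helps, my rough plan is a quick Python script that polls nvidia-smi while a game or benchmark loops, run once with the clone enabled and once without. This is just a sketch and assumes an NVIDIA card with nvidia-smi on the PATH; the sample length and interval are guesses:

import statistics
import subprocess
import time

def gpu_utilisation_percent():
    # Ask nvidia-smi for the current GPU load as a bare number (0-100)
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.gpu",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True)
    return float(result.stdout.strip().splitlines()[0])

def sample(seconds=60, interval=1.0):
    # Poll once per interval while the benchmark runs in another window
    readings = []
    end = time.time() + seconds
    while time.time() < end:
        readings.append(gpu_utilisation_percent())
        time.sleep(interval)
    return statistics.mean(readings), max(readings)

if __name__ == "__main__":
    average, peak = sample()
    print("average %.1f%% / peak %.0f%% GPU utilisation" % (average, peak))

Comparing the average and peak numbers from the cloned and non-cloned runs should show whether there is any real difference.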
 
I wouldn't have thought it would use much of anything. A Raspberry Pi with a 700 MHz ARM processor renders 1920x1080 by default and doesn't go above 5% usage at idle, or get anywhere near its RAM limit, so I doubt it will affect a graphics card with more than 64 MB of RAM.
 
Ran some benchmarks last night: cloning my screen to another had a very minimal effect. I will be doing further testing, but it certainly seems like there is no performance hit for the most part.
 
There should be next to zero impact. It renders the image once and sends it to two outputs rather than processing it separately for each.
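
For what it's worth, that is literally how it works if you clone with xrandr on Linux/X11: the second output is just told to scan out the same framebuffer area as the first. A rough sketch of doing it from Python (the output names here are only examples, run xrandr with no arguments to see your real ones):

import subprocess

# Example output names only -- check yours with plain `xrandr` first.
PRIMARY = "DP-1"
TV = "HDMI-1"

# Clone: the TV scans out the same framebuffer area as the primary display,
# so the desktop is only rendered once and simply sent to a second output.
subprocess.run(["xrandr", "--output", TV, "--same-as", PRIMARY], check=True)

# To switch back to an extended desktop instead:
# subprocess.run(["xrandr", "--output", TV, "--right-of", PRIMARY], check=True)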
 
From what I have seen, yeah, I can back you up on that!

Aastii has the right of it. As long as the information is the same, you would be fine. The video card simply converts the data into a signal that it sends to the monitors, and they turn that into the picture on the screen (don't quote me on that). As long as it is the same resolution, then you are fine. :D
 