DAC Latency?

Eduardo96

New Member
Most video cards, especially older ones, have an integrated RAMDAC that converts the digital signal from the card into an analog one that goes through the VGA port to my monitor. That conversion isn't needed for digital monitors. But the DAC has latency: it needs some time to convert the signal from digital to analog.
Can this delay affect the performance of a computer during intense GPU usage? Does this mean that, between two otherwise identical computers that differ only in their monitor input, the one with the digital connection will show better performance?
Usually the RAMDAC frequency is 400 MHz.
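For scale, here is a rough back-of-the-envelope sketch in Python (the 1920x1080 @ 60 Hz figures are assumptions, not from the thread) comparing the per-pixel conversion time at 400 MHz with the time between frames:

    # Back-of-the-envelope: per-pixel DAC conversion time vs. frame time.
    # Assumptions: 400 MHz RAMDAC pixel clock, 60 Hz refresh rate.
    PIXEL_CLOCK_HZ = 400e6
    FRAME_RATE_HZ = 60

    pixel_time_ns = 1e9 / PIXEL_CLOCK_HZ   # time to convert one pixel
    frame_time_ms = 1e3 / FRAME_RATE_HZ    # time between displayed frames

    print(f"Per-pixel conversion time: {pixel_time_ns:.1f} ns")  # 2.5 ns
    print(f"Frame time at 60 Hz:       {frame_time_ms:.1f} ms")  # 16.7 ms

So each pixel's conversion takes nanoseconds, against a frame period measured in milliseconds.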
 

Agent Smith

Well-Known Member
If you're referring to DVI, that is in fact digital; the D in DVI literally stands for Digital. I haven't seen an analog display port since circa 2007.
 
DAC latency is darn near zero. Latency comes from the Windows audio subsystem.

Latency is not a problem when viewing videos or listening to recorded (or streamed) music.

It might be a problem with gaming. But it is a BIG problem when performing live, and 40 msec is HUGE. Even 20 msec is annoying. Mine is at around 2 msec, which goes unnoticed.

If you need low latency (on Windows), get the appropriate ASIO driver. It bypasses the Windows audio subsystem and uses its own low-latency path to the hardware.
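To make those millisecond figures concrete, here is a minimal sketch (assuming a 48 kHz sample rate; the buffer sizes are just illustrative) of how the driver's buffer size, rather than the converter itself, sets the latency:

    # Rough audio-latency estimate: the dominant term is the driver's
    # buffer, not the DAC itself. Assumption: 48 kHz sample rate.
    SAMPLE_RATE_HZ = 48_000

    for buffer_samples in (64, 128, 1024, 2048):
        latency_ms = 1e3 * buffer_samples / SAMPLE_RATE_HZ
        print(f"{buffer_samples:5d} samples -> {latency_ms:5.1f} ms")

A 128-sample buffer lands near the 2 msec figure above, while a 2048-sample buffer is in the 40 msec range.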
 

Eduardo96

New Member
I'm not referring to the audio system. I was talking about the latency generated by the digital-to-analog converter (DAC) that most video cards have. That DAC is what converts the digital data from the GPU and outputs it through the VGA port to analog monitors. All converters have a conversion time, and that's the latency I mean. It's not a problem on digital displays because the signal is never converted from digital to analog.
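A sketch of why that conversion time doesn't cost throughput (assuming a 1920x1080 frame and the 400 MHz pixel clock from above, ignoring blanking intervals): the DAC converts pixels at the same rate the card scans them out, so a whole frame is converted well within one refresh period:

    # The DAC runs at the pixel clock, so it converts pixels as fast as
    # scanout delivers them. Assumptions: 1920x1080 frame, 400 MHz clock,
    # blanking intervals ignored; figures are illustrative.
    pixels_per_frame = 1920 * 1080
    conversion_time_ms = 1e3 * pixels_per_frame / 400e6

    print(f"Time to convert one full frame: {conversion_time_ms:.2f} ms")  # ~5.18 ms

That is comfortably inside a 16.7 ms frame period at 60 Hz, so the conversion adds only a fixed offset of about one pixel clock rather than slowing the GPU down.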
 