plutoniumman
New Member
I know every CPU has a limit to how high it can be overclocked, set by things like temperature and the quality of the electrical supply... And is that it?
I can see how noise on the electrical input could mess up the chips (thanks, college & Parallax!), but why is heat a limiting factor? Obviously one reason is that it'll literally melt/burn the chip or board if there isn't sufficient cooling. But why does it cause more errors?
I noticed that when I overclock the VRAM on my GPU, it starts producing artifacts/errors. Is the reason a CPU can't be overclocked too high that the L1 & L2 caches start producing errors and can't 'keep up'? (L1 generally runs at the same frequency as the CPU's core clock, so more overclock = faster cache.)
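For what it's worth, here's a minimal sketch of the kind of check memtest-style stress tools do to catch exactly those errors: write a known pattern through memory, read it back, and count mismatched bits. On a marginal overclock, a flaky cell or timing path shows up as mismatches here. The buffer size and pattern are arbitrary choices for illustration, not from any particular tool.

```c
#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

#define BUF_WORDS (1u << 20)  /* 4 MiB of 32-bit words; arbitrary size */

int main(void) {
    uint32_t *buf = malloc(BUF_WORDS * sizeof *buf);
    if (!buf) return 1;

    /* Write phase: fill with a simple xorshift pseudo-random pattern. */
    uint32_t x = 0x12345678u;
    for (size_t i = 0; i < BUF_WORDS; i++) {
        x ^= x << 13; x ^= x >> 17; x ^= x << 5;
        buf[i] = x;
    }

    /* Read phase: regenerate the same sequence and compare.
     * Any mismatch means a bit got corrupted somewhere between
     * the core, the caches, and DRAM. */
    size_t errors = 0;
    x = 0x12345678u;
    for (size_t i = 0; i < BUF_WORDS; i++) {
        x ^= x << 13; x ^= x >> 17; x ^= x << 5;
        if (buf[i] != x) errors++;
    }

    printf("%zu mismatched words\n", errors);
    free(buf);
    return errors ? 2 : 0;
}
```

A stable system reports 0 mismatches run after run; an unstable overclock starts sprinkling in nonzero counts, which is the CPU/RAM equivalent of the artifacts you see from overclocked VRAM.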
I wish my college course had covered overclocking better... Er, maybe I should've paid better attention...