From what I hear, quantum computing will absolutely blow away today's tech, probably even more so than optical tech.
I have heard conflicting statements about this. The working photonic prototype IBM made a few months back was a hell of a lot smaller in terms of component count (I'm aware they aren't actually transistors as such, because it uses light rather than electrical signals), but they claimed that, for its size, it was thousands of times more powerful than a standard electronic CPU. That's understandable when you're working with the speed of light and no resistance, compared to electrical signals fighting the resistance of the wire.
But then I've heard similar claims made about quantum computers.
Until both become viable options and you can actually see a full, working prototype computer housing one of these, it's all talk really, because the companies funding the R&D will keep the actual results and research on a need-to-know basis. After all, the first to exploit such technologies will be a very, very rich company...
As for the original question: theoretically there is no hard maximum clock speed, only the bounds of the laws of physics. You couldn't make more electrons flow down a wire than its size physically allows, and you couldn't give them infinite amounts of energy. On top of that come the hardware limitations: the distance signals have to travel around the mobo to other components, maximum power thresholds, even impurities in the copper used, will all pin you back.
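Just to put a rough number on that signal-distance limit, here's a quick back-of-envelope sketch in Python. The 0.5 factor is my assumption (signals in copper traces travel at very roughly half the speed of light; the real figure depends on the board material), so treat the outputs as ballpark only:

    # Rough sketch: how far can a signal travel in one clock cycle?
    SPEED_OF_LIGHT = 3.0e8    # metres per second, in a vacuum
    TRACE_FACTOR = 0.5        # assumed fraction of c for a copper trace

    def distance_per_cycle(clock_hz):
        """Farthest a signal can get within a single clock tick."""
        return SPEED_OF_LIGHT * TRACE_FACTOR / clock_hz

    for ghz in (1, 5, 10, 100):
        d = distance_per_cycle(ghz * 1e9)
        print(f"{ghz:>3} GHz -> {d * 100:.2f} cm per cycle")

At 5 GHz that works out to about 3 cm per cycle, so anything further away on the board than a few centimetres physically can't answer within a single tick, clock speed aside.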
Look at some CPUs and they'll run fine at 1 V, while others sit at 1.6 V at the same clock speed because of the different architectures. In an ideal world, the architecture would allow for massive clock speeds and a huge number of cores with very little power put in, but the chips we have today are actually very inefficient. They're just the best we have so far.
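That voltage gap matters more than it looks, because switching power scales with the square of the voltage (the classic dynamic-power approximation P ≈ C × V² × f). A small sketch of that, with a made-up capacitance figure purely for illustration, not a real chip spec:

    # Why 1.0 V vs 1.6 V at the same clock is a big deal: P ~ C * V^2 * f
    CAPACITANCE = 1.0e-9   # farads, hypothetical switched capacitance
    CLOCK_HZ = 4.0e9       # same clock speed for both chips

    def dynamic_power(volts):
        """Approximate switching power at a given core voltage."""
        return CAPACITANCE * volts ** 2 * CLOCK_HZ

    p_low, p_high = dynamic_power(1.0), dynamic_power(1.6)
    print(f"1.0 V: {p_low:.1f} W   1.6 V: {p_high:.1f} W   "
          f"ratio: {p_high / p_low:.2f}x")

So the 1.6 V chip is burning roughly 2.5 times the switching power for the same clock, which is exactly the kind of architectural inefficiency I mean.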