SirKenin was right, there is no native way to cluster Windows machines; you have to use a third-party solution.
Clustering is good for servers, because it can run work in parallel as well as distribute it across machines. Apple has a server-side technology called Xgrid that farms out data to Windows and OS X clients; the clients crunch the data and send it back. There is a third-party client you install on Windows to accomplish this, and a client is built into OS X. I have looked into it, but I have way too much on my plate to even think about implementing it any time soon, plus we would need a slight infrastructure overhaul.
Here is an example of how it is applied in real-world situations:
http://www.apple.com/education/profiles/louisville/
However, I think most of you aren't grasping that your application has to support it. No end-user application (or game) supports this. Think about it: say you cluster three current mid-range desktops on gigabit ethernet. You run an application and it distributes its jobs and tasks among the three computers, but by the time the data gets sent out, crunched, and sent back, you have most likely lost performance, because the work could have been done faster locally. The front-side bus on your new computer is faster than gigabit ethernet, and unless you are running a high-end switch, you are also eating packet loss and overhead from the switch being cheap.
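To put rough numbers on that, here is a quick back-of-envelope sketch in Python. Every figure in it is a made-up assumption (a 200 MB chunk of work, a 1 Gb/s wire, one second of local crunch time), purely to show how the transfer cost can swamp the compute time:

```python
# Rough back-of-envelope: is it worth shipping a job over gigabit ethernet?
# Every figure here is an illustrative assumption, not a measurement.

GIG_E_BYTES_PER_SEC = 125e6   # 1 Gb/s is roughly 125 MB/s, before any switch overhead;
                              # a local FSB/memory path moves data many times faster than that
job_size_bytes = 200e6        # hypothetical 200 MB chunk of work
local_crunch_secs = 1.0       # hypothetical time to crunch it on the local box

# Shipping the job out and the results back costs wire time on top of the crunching.
network_secs = 2 * job_size_bytes / GIG_E_BYTES_PER_SEC
remote_total_secs = network_secs + local_crunch_secs

print(f"local:  {local_crunch_secs:.2f}s")
print(f"remote: {remote_total_secs:.2f}s ({network_secs:.2f}s of that is just the wire)")
```

With those assumed numbers the "remote" job takes over four seconds against one second locally, and that is before counting a cheap switch or the overhead of splitting and reassembling the work.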
So really, you would not benefit at all from running a cluster at home, unless you really knew what you were doing and could develop your own applications to distribute jobs over it, like a Beowulf system or something of the like.
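For anyone curious what "develop your own" roughly looks like, below is a minimal sketch of handing jobs out to other machines using only Python's standard multiprocessing.managers module. The host name, port, authkey, and the squaring "work" are all made-up placeholders, not a real Xgrid or Beowulf setup, and it skips error handling entirely:

```python
# Minimal roll-your-own job distribution sketch (Python standard library only).
# Host name, port, authkey, and the "work" are hypothetical placeholders.

import sys
import queue
from multiprocessing.managers import BaseManager

HEAD_NODE = 'head-node.local'   # hypothetical address of the machine handing out jobs
PORT = 50000
AUTHKEY = b'change-me'

class JobManager(BaseManager):
    pass

def run_head():
    # The head node owns the two queues and serves them over the network.
    tasks, results = queue.Queue(), queue.Queue()
    JobManager.register('get_tasks', callable=lambda: tasks)
    JobManager.register('get_results', callable=lambda: results)
    for n in range(100):                 # enqueue some pretend work
        tasks.put(n)
    manager = JobManager(address=('', PORT), authkey=AUTHKEY)
    manager.get_server().serve_forever()

def run_worker():
    # Each worker connects, pulls jobs, crunches them, and pushes results back.
    JobManager.register('get_tasks')
    JobManager.register('get_results')
    manager = JobManager(address=(HEAD_NODE, PORT), authkey=AUTHKEY)
    manager.connect()
    tasks, results = manager.get_tasks(), manager.get_results()
    while True:
        n = tasks.get()                  # blocks until a job is available
        results.put((n, n * n))          # the "crunching" -- swap in real work here

if __name__ == '__main__':
    run_head() if sys.argv[1:] == ['head'] else run_worker()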