I have a GTX 260; it's about a year old now and I can still play pretty much any game at max settings. I even ran Crysis near max. The funny thing is, the Crysis sequel ran way worse than the original for me.
Over the years I have switched back and forth between Nvidia and ATI, starting back in like '98 when I went from my Voodoo card to a 16MB Riva TNT AGP card. Back then that card was hot stuff!
Over the years I've found them both comparable, and I don't hold benchmarks up as gospel. Benchmarks don't really reflect real-world performance, especially in the build-your-own-PC world: everyone's custom PC is configured a little differently, and that alone can shift benchmark scores. The real problem with all the benchmark articles is that there's no control system, no "base" PC model to compare against. Maximum PC used to keep a control system, an average computer for that year, and could measure every benchmark against it. Any time you run a comparison and want to actually see progression, you need some sort of control system to base it off of.
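To make that concrete, here's a minimal sketch of what a control system buys you: normalize every score against a fixed baseline machine so results from different custom rigs become comparable. All the numbers and benchmark names here are made up, not from any real review.

# Average scores from the hypothetical control/baseline PC
control_scores = {"Crysis": 32.0, "FarCry2": 58.0, "3DMark06": 11500.0}

# Raw scores from two differently configured custom builds (also hypothetical)
rig_a = {"Crysis": 41.0, "FarCry2": 72.0, "3DMark06": 14800.0}
rig_b = {"Crysis": 38.0, "FarCry2": 80.0, "3DMark06": 13900.0}

def normalize(scores, control):
    # Express each result as a percentage of the control system,
    # so 128.1 means 28.1% faster than the baseline.
    return {name: 100.0 * scores[name] / control[name] for name in control}

for label, rig in (("Rig A", rig_a), ("Rig B", rig_b)):
    relative = normalize(rig, control_scores)
    print(label, {k: round(v, 1) for k, v in relative.items()})

Once everything is a percentage of the same baseline, "Rig A is 28% faster in Crysis" actually means something, no matter how the two rigs are configured.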
That said, in my opinion Nvidia has made the better drivers over the last 10 years. I just don't like the Catalyst all-in-one application ATI forces on you; it gets annoying on certain levels. I run TV-out on my current ATI card, and if I update just the display driver it kills all my TV-out settings unless I upgrade the whole Catalyst suite. Even when I do a full upgrade it NEVER keeps my settings.
All that said, if I were building a system today I would go with Nvidia.