How do you judge the performance of a video card?
I was looking at the graphics cards Aastii posted, and the one thing that was more appealing about the alternate card was the HDMI (not mini-HDMI) port.
Performance is measured in fps (frames per second): the rate at which the card calculates the data required for each frame. 60fps is the most you will notice on a standard monitor, because most have a 60Hz refresh rate, which means the monitor changes the picture 60 times every second.
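To put rough numbers on that (just an illustrative calculation, not from any review), each frame at a given fps gets 1000/fps milliseconds of render time, and anything above the monitor's refresh rate never makes it to the screen:

# Rough illustration: how long the card has to render each frame at a
# given frame rate, and how much of it a 60Hz monitor can actually show.
for fps in (30, 60, 120):
    frame_time_ms = 1000 / fps          # milliseconds the card has per frame
    displayed = min(fps, 60)            # a 60Hz panel only refreshes 60 times/sec
    print(f"{fps} fps -> {frame_time_ms:.1f} ms per frame, "
          f"{displayed} frames/sec actually shown on a 60Hz monitor")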
As you raise the settings, be it quality, resolution or features such as anti-aliasing and anisotropic filtering, the graphics card has to perform more calculations, so fps will drop. How well it copes at higher settings, and what settings it can handle in most games whilst still maintaining decent frame rates (at least 30 fps is acceptable for most people), is how you tell how good the card is.
There are other things to take into account that don't directly relate to performance but could affect the decision, such as power consumption, size, temperature and extra features (DX11 for instance, or PhysX for Nvidia, Eyefinity for ATi, etc.).
Within the same family, you can use the model numbers to tell which card is better. The first digit is the family (2xx, 4xx, 5xx etc. for Nvidia; 4xxx, 5xxx, 6xxx etc. for ATi), and for the digits that follow, higher is better.
Across families, though, you will have to look at benchmarks. Just because a GTS 450 has a higher number than the GTX 280 doesn't make it a better card; in reality the 450 sits between a GTS 250 and a GTX 260.
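To make the rule concrete, here's a quick sketch (the parsing and the helper names are just my own illustration, nothing official from Nvidia or ATi):

# Toy sketch of the naming rule above. Within a family the bigger model
# number wins; across families the comparison is meaningless and you
# need benchmarks instead.
def family_and_model(name):
    digits = "".join(ch for ch in name if ch.isdigit())
    return digits[0], int(digits)       # first digit = family, full number = model

def better_within_family(a, b):
    fam_a, model_a = family_and_model(a)
    fam_b, model_b = family_and_model(b)
    if fam_a != fam_b:
        return "different families - check benchmarks"
    return a if model_a > model_b else b

print(better_within_family("GTX 460", "GTS 450"))   # GTX 460
print(better_within_family("GTS 450", "GTX 280"))   # different families - check benchmarks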
Just type "<card> review" into Google; for instance, for the GTS 450: "GTS450 review", and it will come up with:
http://www.google.co.uk/search?sourceid=chrome&ie=UTF-8&q=gts450+review
The first result shows its performance compared to other cards pretty well:
http://www.bit-tech.net/hardware/graphics/2010/09/13/nvidia-geforce-gts-450-review/1