First-time poster, but I did make an effort to search for this issue on this forum and on Google in general. I only found one relevant thread, and it had no final answer.
OK, my last system went boom, so I just got a new one, but I kept the X1950XT from my older system (didn't have the need or the money for a new card). On the new machine I installed Vista x64 (mostly so I could use 4GB of RAM) and then the newest graphics drivers. I then let the installer put the latest DirectX on it. Dxdiag now reports DirectX 10. But I'm pretty sure my card doesn't support DirectX 10, so how can this be?
So far no problems, nothing has exploded. In that one unfinished thread, the guy with a similar situation did have problems. I haven't had a chance to test the machine with a next-gen game yet, and I will report my findings when I do, but I'd just like to know how something like this is even possible.