Non-DirectX 10 card reports DirectX 10 to be installed

Jedo

New Member
First-time poster, but I did make an effort searching for the issue on this forum and on Google in general. I only found one relevant thread, with no final answer.

OK, my last system went boom, so I just got a new one, but I reused the X1950XT from my old system (didn't have the need or the money for a new card). On the new machine I installed Vista x64 (mostly so I could use 4 GB of RAM) and then the newest graphics drivers. I then let the installer put the latest DirectX on it, and it now reports DirectX 10 in dxdiag. But I'm pretty sure my card doesn't support DirectX 10, so how can this be?

So far no problems, nothing exploded. In that one unfinished thread, the guy with a similar situation did have problems. I haven't had a chance to test the machine with a next-gen game yet, and I'll report my findings when I do, but I'd just like to know how something like this is even possible.
 
It just has DirectX 10 installed, but your card is running 9c. It's no different from having DirectX 9c installed with a DirectX 8 card: dxdiag still shows 9c, but the card runs in DirectX 8 mode. I have Vista installed on a computer in the other room and it shows 10, but it has an old ATI 9600 Pro card.
 
As mentioned, older cards can run with newer DirectX versions installed, but they won't support all the features. I know my old Radeon 8500 was only a DirectX 8.1 card, but I could run DirectX 9 without a problem.
 
Oh OK, so it's common then; the video card decides "how much" of DirectX it will actually use... I tested Armed Assault and the whole PC hung with a black screen two minutes into the game, but I guess that's caused by something else, then.
 
If you have Vista, you will have DirectX 10 installed no matter what video card you are using. That, however, is only the software runtime; your video card itself needs to support DirectX 10 in order to play DX10 games or take advantage of DX10 features.
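As a side note, a program can check this distinction at runtime: attempting to create a hardware Direct3D 10 device fails on a DX9-class card such as the X1950XT, even though the DX10 runtime ships with Vista. Here is a minimal sketch (assuming the Windows SDK with the Direct3D 10 headers is available; this is just an illustration, not something from the posts above):

```cpp
#include <windows.h>
#include <d3d10.h>
#include <cstdio>
#pragma comment(lib, "d3d10.lib")

int main()
{
    // Ask for a hardware-accelerated Direct3D 10 device. The runtime being
    // installed (what dxdiag reports) does not guarantee this succeeds;
    // the GPU itself must support D3D10 in hardware.
    ID3D10Device* device = nullptr;
    HRESULT hr = D3D10CreateDevice(
        nullptr,                     // default adapter
        D3D10_DRIVER_TYPE_HARDWARE,  // hardware device only, no reference rasterizer
        nullptr,                     // no software rasterizer module
        0,                           // no creation flags
        D3D10_SDK_VERSION,
        &device);

    if (SUCCEEDED(hr)) {
        std::printf("This GPU supports Direct3D 10 in hardware.\n");
        device->Release();
    } else {
        std::printf("The D3D10 runtime is installed, but this GPU only "
                    "supports an earlier feature level.\n");
    }
    return 0;
}
```

This is essentially what a DX10 game does at startup, which is why a DX9-class card falls back to the game's DX9 render path (or is rejected) even though dxdiag shows DirectX 10.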
 