PCI-Express question (x1, x16 and adjusting)

Levone

New Member
Ok, so I never quite understood how PCI Express works. I get that x1 is slower than x16, and that a slot can be physically large enough to fit an x16 card while only being wired for x8. But can an x16 card go into an x1 slot?

[Attached image: VidCardExample.jpg]


For instance, would the part the arrow is pointing at fit into just an x1 slot, with the card down-adjusting itself accordingly? The reason I ask is that I'm trying to find a secondary video card to run triple monitors (a cheaper option than buying a new current-generation card that already supports that), and it's ridiculously hard to find either a PCI or PCIe x1 Radeon card. So I was wondering if the gap between the two notches on that card's connector was there to let it fit into an x1 slot.
 
Yes, an x16 card can go into an x1 slot, given one thing: the PCIe x1 slot must be notched out (open-ended) at the back to allow clearance for the x16 card's longer connector.
 
Are you sure? I always thought it went this way:

-x16 slot can take x1, x4, x8, and x16 cards
-x8 slot can take x1, x4, and x8 cards
-x4 slot can take x1 and x4 cards
-x1 slot can take x1 cards only
 
As long as you can get the card to fit and it's the same PCIe standard, it will work; it will just run with reduced performance, since there aren't enough lanes available. PCIe is backward compatible, and the link simply negotiates down to the number of lanes the slot provides (see the sketch below).
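A quick way to picture that down-negotiation: the link simply trains at the widest width both ends support, so an x16 card in an x1 slot ends up running at x1. Here's a minimal sketch in Python (an illustrative model only, not any real API; actual link training also covers generations and signal-integrity fallback):

```python
# Illustrative model of PCIe link-width negotiation: the link trains at
# the widest width that both the card and the slot can support.
VALID_WIDTHS = (1, 4, 8, 16)

def negotiated_width(card_lanes: int, slot_lanes: int) -> int:
    """Link width an xN card would train to in an xM slot (simplified)."""
    if card_lanes not in VALID_WIDTHS or slot_lanes not in VALID_WIDTHS:
        raise ValueError("PCIe link widths are x1, x4, x8 or x16")
    return min(card_lanes, slot_lanes)

print(negotiated_width(card_lanes=16, slot_lanes=1))   # x16 card, x1 slot -> 1
print(negotiated_width(card_lanes=1, slot_lanes=16))   # x1 card, x16 slot -> 1
```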
 
Yea, it can be done. You can even literally saw open the end of an x1 connector and plug an x16 card in there; it will work provided you don't actually damage the contacts. The only catch is that a higher-powered card will be bottlenecked by the limited bandwidth of a single lane. As NC said, if it's the same standard, it will work; even if it isn't the same standard, it SHOULD work (there were forward-compatibility issues with certain motherboards or chipsets; IIRC it was something to do with VIA chipsets, but my memory is crapping out).
 
No, all you have to do is break the plastic out of the back end of the PCIe x1 slot so that it no longer blocks the PCIe x16 GPU from seating in the slot. If you're afraid of doing that for whatever reason, then that adapter is necessary.
 
Wow, if only I had known... *pictures self as supreme dictator of everything* :D
 
Another option, which helps prevent damage to the connector, the board surface, and any nearby traces, is to use the tip of a hot knife or soldering iron to literally melt away the back of the PCIe x1 slot. An X-Acto knife works too, though it takes a good amount of time.
 
Hmmmm....decisions decisions...

Well, in the end it's still more practical than finding an x1 or PCI card; those seem to have become all but extinct in the wake of x16, and the ones still around have atrocious ratings.


So will an x16 card run 16x slower in an x1 slot? Will the bottleneck even be noticeable if all the card is doing is running a third monitor used just for surfing the web and typing documents (no gaming or anything, though it will play videos, I'm sure)? I know the answer is 'it depends on the card', but speaking generally, say with a mid-range card?
 
So will an x16 card run 16x slower in an x1 slot?
No. It depends entirely on the card: with a budget card like the older 8400 GS, GT 210, or the ATI equivalents, the bottleneck will be minimal to nonexistent. But the better the card, the more bandwidth it needs, and the bigger the bottleneck. You reach the point of diminishing returns pretty quickly, where a significantly faster card may yield only a marginal performance increase simply because it doesn't have the bandwidth.

EDIT: However, if the card isn't supposed to handle any gaming, the bottleneck is really irrelevant. It'll still be fine for a multi-monitor setup.
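To put rough numbers on "16x slower": per-lane throughput depends on the PCIe generation, and an x16 card dropped into an x1 slot gets about 1/16th of its usual bandwidth. A back-of-the-envelope calculation, using the commonly quoted approximate per-lane figures for PCIe 1.x and 2.0:

```python
# Approximate usable PCIe throughput per lane, per direction, after the
# 8b/10b encoding overhead used by PCIe 1.x and 2.0.
PER_LANE_MB_S = {
    "PCIe 1.x": 250,  # 2.5 GT/s * 8/10 encoding = ~250 MB/s per lane
    "PCIe 2.0": 500,  # 5.0 GT/s * 8/10 encoding = ~500 MB/s per lane
}

def link_bandwidth_mb_s(gen: str, lanes: int) -> int:
    """Approximate one-direction bandwidth in MB/s for an xN link."""
    return PER_LANE_MB_S[gen] * lanes

for gen in PER_LANE_MB_S:
    print(f"{gen}: x16 = {link_bandwidth_mb_s(gen, 16)} MB/s, "
          f"x1 = {link_bandwidth_mb_s(gen, 1)} MB/s")
# PCIe 1.x: x16 = 4000 MB/s, x1 = 250 MB/s
# PCIe 2.0: x16 = 8000 MB/s, x1 = 500 MB/s
```

Even 250 MB/s is far more than a 2D desktop on a third monitor needs, which is why the bottleneck is irrelevant for that use case.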
 
I would have thought something like this might be more sensible?
http://cgi.ebay.com.au/ATI-Radeon-9...iewItemQQptZAU_Components?hash=item414c0e92de

PCI VGA cards have always had atrocious ratings. What do you expect? However, if you're only running a 2D desktop on a third monitor, it will work fine.

However, if it were me, I'd save the $50 and put it toward a new platform.
Two things about that auction:
1. The Radeon 9200 is SO SLOW THAT YOU WILL PULL YOUR HAIR OUT!
2. That auction is from a seller in Hong Kong. I try my best never to buy from Hong Kong, because I know from experience that they'll sell you the cheapest, and sometimes not even legit, product out there.
Anyway, if he has a PCIe x1 slot, then he should buy a PCIe card, since in almost all scenarios PCIe cards cost about half as much as PCI cards with the same GPU.
 