Black Border around Desktop when using HDMI Output

double b26

New Member
For starters, I'm on a Dell Studio 1537, which has an ATI Mobility Radeon 3450 graphics card. I did a clean install of Windows 7 a couple of weeks ago, and I've had this problem ever since.

The problem is that whenever I run HDMI out to any display (I've tried a couple of TVs and a monitor), there is a black border all the way around the desktop; the picture doesn't fill the entire screen. When I go to the resolution settings, it shows it's at the maximum (1920x1080). Back when this machine ran Vista, it worked fine.

So far I've tried a few different versions of the ATI driver and Catalyst Control Center. ATI just released a new version this past Tuesday (11/17), and it didn't help either.

So at this point, I'm pretty much out of ideas. If any of you have anything I could try, I'd like to hear it. Thanks.

Oh yeah, I've also tried hooking up to the TV with VGA, and it does run full screen.
 

The_Other_One

VIP Member
Is there an overscan option that can be selected somewhere within the output settings? It's more of a feature for standard TV out, but I know some HDMI outputs are technically "overscanned".
 

zombine210

New Member
What you're getting is called underscan/overscan. I'm not sure exactly how it happens; I think it's a combination of the HDMI cable + TV/monitor.

Basically, your computer thinks it's a TV. Or something like that.

Don't quote me, I've been drinking.
 

double b26

New Member
The_Other_One said:
Is there an overscan option that can be selected somewhere within the output settings? It's more of a feature for standard TV out, but I know some HDMI outputs are technically "overscanned".

I read about that option elsewhere in my research, and no, it's not an option in the Control Center I'm running now.

One thing I've noticed, though: the Catalyst Control Center I'm running now seems to have fewer options than the one on Vista. I can't put my finger on it exactly, but there just seems to be less depth to the options/settings available. I might try finding which version shipped on Vista and, if it's 64-bit compatible, installing that one... if I can find it.

zombine210 said:
What you're getting is called underscan/overscan. I'm not sure exactly how it happens; I think it's a combination of the HDMI cable + TV/monitor.

Basically, your computer thinks it's a TV. Or something like that.

Don't quote me, I've been drinking.

Thanks for the input, but I don't think it's a problem with the cable or TV(s). It worked fine with the same cable and TV under Vista. I connected to many TVs, monitors, and projectors while running Vista and never had a problem with any of them; this has only been an issue since the Windows 7 install. Plus, since the problem arose, I've tried it with two HDMI cables, two TVs, and one 1080p monitor... all with the same result.
 

cajuntexan00

New Member
I bet a Windows 7 driver update for your video card comes out soon that fixes this problem. You've clearly ruled out a hardware problem with all your trial and error.
 

zombine210

New Member
double b26 said:
Thanks for the input, but I don't think it's a problem with the cable or TV(s). It worked fine with the same cable and TV under Vista. I connected to many TVs, monitors, and projectors while running Vista and never had a problem with any of them; this has only been an issue since the Windows 7 install. Plus, since the problem arose, I've tried it with two HDMI cables, two TVs, and one 1080p monitor... all with the same result.

HDMI is a TV cable; 'high definition' is for 1080 resolutions. When your computer detects an HDMI cable, it will act stupid. Have you tried playing a movie? Do you still get the black border?

Do any of your monitors have VGA or DVI ports? Try using those: if you still get a black border, it's a driver/GPU issue; if you don't get a border over VGA or DVI, then Windows 7 is in 'TV' mode.
 

double b26

New Member
cajuntexan00 said:
I bet a Windows 7 driver update for your video card comes out soon that fixes this problem. You've clearly ruled out a hardware problem with all your trial and error.

Yeah, that's what I'm thinking too. I just thought this might be conquered territory already, but I guess not.

zombine210 said:
HDMI is a TV cable; 'high definition' is for 1080 resolutions. When your computer detects an HDMI cable, it will act stupid. Have you tried playing a movie? Do you still get the black border?

Do any of your monitors have VGA or DVI ports? Try using those: if you still get a black border, it's a driver/GPU issue; if you don't get a border over VGA or DVI, then Windows 7 is in 'TV' mode.

Yeah, I've tried movies in fullscreen and nothing changes.

VGA input? Sure do, and like I said in post #1, everything works properly when I connect via VGA. It just sucks because VGA doesn't carry audio, so that means another cable to carry with me and another connection to make.

Actually, I tried it on a different TV than ever before: an older set that only supports up to 720p (and 1080i too, I think). When I connected via HDMI to this set, it came up full screen (actually a little oversized). However, the resolution was 1024x768 (720p would actually be 1280x720), and when connected to a 1080p set, that resolution isn't even an option. And even though it ran full screen on the 720p set (a Sylvania, I think), the picture looked pretty bad for movie watching.

Which leads me to another thing: it seems to me that even with the border around the screen, the picture quality isn't as good as it used to be. I have nothing but my eyes to back me up on this, but it just doesn't seem quite as clear.

Anyway, thanks for all the responses. Unless someone knows how to fix it, I'm assuming it's a driver issue; I'll just have to wait it out, I guess. It just seems that if this were a widespread unsolvable problem, they would have addressed it in their last 'official' release, which was only a week or so ago.
 

scanlia

New Member
Fixed!

I've had the same problem... I fixed it by using a VGA cable instead of HDMI; it works perfectly now!
 

pencilman

New Member
OMG, I'm not alone... Sorry, I don't have a solution. I just put together a new system (including a new video card and monitor). My video card is an XFX Radeon HD 5770, and my monitor is a Samsung P2370HD (23-inch, full 1080p, 1920x1080). I'm having the same issue.

With the HDMI connection I have a black border around the entire screen and poor image quality (slight distortion). It's very easy to notice the quality difference because, in my troubleshooting, I hooked up the DVI connection as well and can easily switch back and forth between HDMI and DVI. The DVI connection, by the way, is perfect (in fact, I would say beautiful), with no distortion and no black borders. Anyway, I tried an older driver that came with the card as well as the latest one on ati.com, but still the same problem. I've been messing with the settings for four days and have tried every display mode and option in the ATI software (Catalyst). I haven't spoken with XFX yet because they don't seem to be taking calls until January 4th... nice, huh?

BTW, I also hooked up my PlayStation 3 to the monitor via HDMI, and the image seems perfect: no distortion or black borders. It definitely seems like an issue with the video card or the software for the card. Anyway, I'll keep you posted. Please, people, don't hold back if you hear anything...

On a side note, besides this one issue (which is hugely disappointing), I still think this is a pretty good video card. For the price (under $200), I can't find a better one with low power draw, 1GB of GDDR5, DirectX 11, support for up to 3 monitors, solid bench results, a lifetime warranty from XFX, and great looks. That's why I haven't exchanged it yet.

Thanks.
 

double b26

New Member
UPDATE: I got it fixed. Turns out that Dell just doesn't have their act together. They sell other laptops with the same video card and list a Windows 7 64-bit driver for them, but not for mine. Go figure. Anyway, I downloaded the driver from another model's support page, installed it, and everything works fine again.

@scanlia:
Using VGA already worked, but that's not a fix, it's a work-around. I wasn't trying to work around it; I wanted it to work like it's supposed to. VGA is okay, but not the best looking on a 40"+ TV, and it doesn't pass audio, so you have to rig something up for the sound too. A PITA, especially when HDMI should be doing its job.

@pencilman:
You might look around at manufacturers' sites (Dell, HP, etc.) to see if any companies sell rigs with your card in them. If so, you might be able to find a driver on their website that works for you. Good luck!
 

delsey

New Member
It took me a while to fix this (even though I had to fix it in Vista too), as ATI have hidden the menu so well:

  1. Switch to a lower resolution than native (e.g. 1280x720)
  2. Open Catalyst Control Centre
  3. Select Graphics (top left), then Desktops and Displays
  4. Click on the little black triangle in the top right of the SMALL (repeat, SMALL) display icon at the bottom of the window
  5. Select Configure
  6. Enable GPU Scaling
  7. Click on the 'Scaling Options' tab
  8. Slide the Under/Overscan slider to 0%
  9. Apply the settings
  10. Switch back to full resolution (1080p)
Hope that helps
 

tlarkin

VIP Member
Download the newest Catalyst Control Center for your ATI card and set the overscan correction there. Done.

I run HDMI from my HTPC (a Linux box) into my receiver, along with every other device into the receiver over HDMI, and then one HDMI out to my TV, and I have zero problems. It looks great, outputs 7.1 audio, and I only have to use one cable instead of a war zone of cables.

It is not hard to do.
 

delsey

New Member
delsey said:
It took me a while to fix this (even though I had to fix it in Vista before I upgraded to Windows 7), as ATI have hidden the menu so well:

  1. Switch to a lower resolution than native (e.g. 1280x720)
  2. Open Catalyst Control Centre
  3. Select Graphics (top left), then Desktops and Displays
  4. Click on the little black triangle in the top right of the SMALL (repeat, SMALL) display icon at the bottom of the window
  5. Select Configure
  6. Enable GPU Scaling
  7. Click on the 'Scaling Options' tab
  8. Slide the Under/Overscan slider to 0%
  9. Apply the settings
  10. Switch back to full resolution (1080p)
Hope that helps

Sorry, reading my own post I realized it was a bit lacking in background.

I just upgraded to Windows 7 64-bit on my Dell desktop. I have a Radeon 3450 card outputting over HDMI to a Sharp 42-inch LCD TV. With the resolution set to 1920x1080, I had a 1-inch black box around the screen and the text was clearly pixelated. I had previously fixed this problem in Vista, finally tracking it down to the ATI Catalyst Control Centre's default of 8% underscan for HDMI out (anyone from ATI/AMD want to comment on why this is a good idea?).

The issue I had with Windows 7 was that I uninstalled and reinstalled the ATI software (as recommended by the Windows 7 compatibility check) and could not find the HDMI/scaling options, as they are 'hidden' behind a little black triangle on an icon (on a screen with another, larger 'decoy' icon; anyone from ATI want to comment on why that seemed like a good idea?). So the instructions posted above show how to find the setting in the Catalyst control panel (the hidden black triangle) and how to un-gray the scaling settings (by reducing the resolution first). Hope that helps! :)
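For anyone curious what that 8% default actually does to the picture, here's a quick sketch of the arithmetic (a general illustration of how an underscan percentage maps to border size, assuming a simple linear scale and a centred image; this is not ATI's actual scaling code):

def underscan(width, height, percent):
    # Scale the image down by the underscan percentage and centre it,
    # leaving black bands on all four sides.
    scale = 1 - percent / 100
    visible_w = round(width * scale)
    visible_h = round(height * scale)
    border_x = (width - visible_w) // 2   # left/right band, in pixels
    border_y = (height - visible_h) // 2  # top/bottom band, in pixels
    return visible_w, visible_h, border_x, border_y

# The 8% HDMI default on a 1920x1080 desktop:
print(underscan(1920, 1080, 8))
# -> (1766, 994, 77, 43): a 1766x994 picture with 77px bands at the
#    sides and 43px top and bottom, which is in the right ballpark for
#    the roughly 1-inch box I saw on my 42-inch screen.

Slide the percentage to 0% and the bands disappear.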

Dell Inspiron Desktop
Intel Core 2 Quad Q8200
4GB
Windows 7 64-bit
ATI Radeon 3450 PCIe with Catalyst Control Centre 10.4
 

double b26

New Member
Just wanted to let everyone know that I did finally figure this one out. Someone at the Dell forums pointed me to the video driver for this card, but listed under the Studio 1555 (I think it was). Turns out that Dell has the proper Windows 7 driver listed for that model, which uses the same graphics, but doesn't have it listed for my 1537... or at least they didn't a month or so ago. It works like it should again.
 

Chaddo

New Member
delsey said:
It took me a while to fix this (even though I had to fix it in Vista too), as ATI have hidden the menu so well:

  1. Switch to a lower resolution than native (e.g. 1280x720)
  2. Open Catalyst Control Centre
  3. Select Graphics (top left), then Desktops and Displays
  4. Click on the little black triangle in the top right of the SMALL (repeat, SMALL) display icon at the bottom of the window
  5. Select Configure
  6. Enable GPU Scaling
  7. Click on the 'Scaling Options' tab
  8. Slide the Under/Overscan slider to 0%
  9. Apply the settings
  10. Switch back to full resolution (1080p)
Hope that helps
Thank you very much for your help there, delsey! Got my black border issue sorted out thanks to you!

I had to download the Catalyst Control Centre (it wasn't supplied with my XFX HD 4350 graphics card; only the drivers were), then adjust some of the settings you recommended.

The image filled the entire screen on the D-sub output, but on DVI the image wasn't filling the screen... but it's happy days now!
 