So, my old computer was hooked up to an LCD display via DVI and to a regular TV via S-video out. I had it set up just fine, with the LCD as the primary display and the TV as the secondary, in an extended-desktop configuration. Whenever I turned on the computer, everything was as it should be. When I turned on the TV and switched to the S-video input, the computer would automatically detect the TV and the extended desktop would show up.
I now have a new system with on-board video (a Gigabyte GA-G33M-S2H motherboard with the Intel G33/G31 Express chipset). I also have a new TV. Now the LCD (same one) is attached via a VGA cable and the TV is attached via an HDMI cable. Same version of Windows (XP Media Center).
Everything still works; however, when I switch to the HDMI input on the TV, the extended desktop doesn't automatically show up on the television screen. The computer does detect the TV, because I hear the default "device detected" / USB .wav sound. So to get it to work I have to manually go to Display Properties --> Settings --> click on the second display (the TV) --> uncheck the "Extend my Windows desktop onto this monitor" box --> re-check it --> press Apply... and then, voila, the image pops up onto the TV as it should have automatically. Once I do this, I'm fine: I can put the computer on standby, go back to the TV tuner and then back to the HDMI input, whatever, and the extended desktop stays on the TV screen. However, as soon as I turn the TV off and back on again, the image is gone and I have to set it all up manually again.
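(In case it's relevant: my understanding is that the uncheck/re-check step just detaches and re-attaches the second display to the desktop. Below is a rough sketch of doing that same re-attach from a tiny Win32 program, so the workaround could at least be scripted. The device index of 1, the 1280x720 fallback mode, and the "to the right of the primary" position are assumptions about my setup, not anything Windows guarantees.)

```c
/* Rough sketch: re-attach the secondary display ("extend my Windows
 * desktop") programmatically instead of toggling the checkbox.
 * Compile with:  cl reattach.c user32.lib
 */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DISPLAY_DEVICEA dd = { sizeof(dd) };   /* cb must be set before the call */
    DEVMODEA dm = { 0 };
    dm.dmSize = sizeof(dm);

    /* Assumption: the TV is the second display device Windows enumerates. */
    if (!EnumDisplayDevicesA(NULL, 1, &dd, 0)) {
        printf("No second display device found.\n");
        return 1;
    }

    /* Start from whatever mode is stored in the registry for that device. */
    if (!EnumDisplaySettingsA(dd.DeviceName, ENUM_REGISTRY_SETTINGS, &dm)) {
        /* Fall back to an assumed 720p mode if nothing is stored. */
        dm.dmPelsWidth  = 1280;
        dm.dmPelsHeight = 720;
    }

    /* Place it to the right of the primary display.  Giving the device a
     * position and resolution attaches it to the desktop, which is what
     * the "extend my Windows desktop" checkbox does.                     */
    dm.dmPosition.x = GetSystemMetrics(SM_CXSCREEN);
    dm.dmPosition.y = 0;
    dm.dmFields = DM_POSITION | DM_PELSWIDTH | DM_PELSHEIGHT;

    /* Write the change to the registry without applying it yet ... */
    ChangeDisplaySettingsExA(dd.DeviceName, &dm, NULL,
                             CDS_UPDATEREGISTRY | CDS_NORESET, NULL);
    /* ... then apply all pending changes in one pass. */
    ChangeDisplaySettingsExA(NULL, NULL, NULL, 0, NULL);

    printf("Re-attached %s\n", dd.DeviceName);
    return 0;
}
```

I'm not relying on this; it's just to show what the checkbox toggle appears to be doing under the hood.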
How can I fix this? I never had to do any of this on my old system. I've also got the most up-to-date video drivers; that didn't help.
Thanks for any help.

Answer» This is a somewhat annoying problem. I don't have a whole lot of experience with multiple monitors, but I have set them up on a few PCs, and I use a dual-monitor setup on my home PC.
The problem is that the PC doesn't recognize the TV while it is initializing the video. I've always noticed with TVs (S-Video or HDMI) that if I plugged in the TV (or turned it on, in your case) after the computer was booted, it would fail to work properly right away. I always had to make sure I rebooted the PC with everything plugged in, or fiddle with the settings. It has never been especially annoying for me because I don't have to do it often.
One customer's TV was always detected even when it was turned off, so he didn't have any problems as long as he didn't unplug the HDMI cable. In your case, on the other hand, it sounds like the computer doesn't see the TV until you turn it on. Every time the second display (your TV) is removed from the system, it loses the settings for your extended desktop because it no longer has a display to extend to. The trick would be to make the computer always see that second display regardless of whether the TV is on. Have you, by any chance, installed a driver for the TV itself?
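If you want to confirm what the computer actually sees, a small diagnostic like the sketch below can help. It just lists the display devices Windows reports and whether each one is attached to the desktop; run it once with the TV off and again with it on. The calls are standard Win32 (EnumDisplayDevices), but how your particular Intel driver reports a powered-off HDMI display is a guess on my part.

```c
/* Quick diagnostic: list the display devices Windows can see and
 * whether each one is currently attached to the desktop.
 * Compile with:  cl listdisplays.c user32.lib
 */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DISPLAY_DEVICEA dd = { sizeof(dd) };
    DWORD i;

    for (i = 0; EnumDisplayDevicesA(NULL, i, &dd, 0); i++) {
        printf("%lu: %s (%s)\n", i, dd.DeviceName, dd.DeviceString);
        printf("   attached to desktop: %s\n",
               (dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP) ? "yes" : "no");
        printf("   primary device:      %s\n",
               (dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE) ? "yes" : "no");
        dd.cb = sizeof(dd);   /* reset the size field before the next call */
    }
    return 0;
}
```

If the second device disappears (or shows "attached to desktop: no") as soon as the TV is switched off, then the driver really is dropping the display, and the fix has to come from the driver or TV side rather than from Windows settings.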
I understand your frustration, but in my experience Windows XP's multi-monitor support is fairly buggy.