
Hello,

I'm running dual monitors (have been for probably close to seven years now) and I recently got a new video card. My old card had dual DVI ports, but this new one has one DVI port and one VGA port. That alone caught me a bit off guard, but I figured I'd just buy an adapter and call it a day.

As it turns out, the monitor plugged into the DVI port works fine, but the other one flashes as if it's in sleep mode. However, when I unplug the cable (with the adapter still in place), it shows the "no signal" message instead.

Am I correct in assuming the monitor is getting a signal and it's only a matter of some configuration? Or is it something else? What should I do?

Thanks in advance,
Chris

Edit: I have used Control Panel > Display Properties to enable the second monitor and extend the desktop onto it, to no avail.
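
For reference, here is a minimal sketch, assuming a Windows machine with the Win32 headers available, that lists every display output the driver exposes and whether Windows has it attached to the desktop. If the output feeding the adapter never shows up as attached, the extend-desktop setting isn't actually taking effect:

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DISPLAY_DEVICEA dd;
    DWORD i;

    /* Walk every display output the video driver exposes and report
       whether Windows considers it part of the desktop. */
    for (i = 0; ; ++i) {
        ZeroMemory(&dd, sizeof(dd));
        dd.cb = sizeof(dd);
        if (!EnumDisplayDevicesA(NULL, i, &dd, 0))
            break;
        printf("%s  %s%s\n",
               dd.DeviceName,
               (dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP)
                   ? "attached to desktop" : "not attached",
               (dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)
                   ? " (primary)" : "");
    }
    return 0;
}
```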

3 Contributors | 2 Replies | 3 Views | 8-Year Discussion Span | Last Post by comp-noob

Have you set the output driving the adapter to a resolution and refresh rate the monitor supports, like 1280 x 1024 @ 60 Hz?

My first instinct would be a configuration issue or bent pins on the VGA cable.
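
If it is configuration, something along these lines will show what mode Windows currently has on that output and try to force a known-good one. This is just a minimal sketch assuming Windows and the Win32 API; the device name \\.\DISPLAY2 is an assumption, so substitute whatever name the adapter-fed output actually uses:

```c
#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* The device name here is an assumption; use whichever name the
       driver reports for the output feeding the DVI-to-VGA adapter. */
    const char *device = "\\\\.\\DISPLAY2";
    DEVMODEA dm;
    LONG rc;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* Show the mode Windows currently has configured on that output. */
    if (EnumDisplaySettingsA(device, ENUM_CURRENT_SETTINGS, &dm))
        printf("Current mode: %lux%lu @ %lu Hz\n",
               dm.dmPelsWidth, dm.dmPelsHeight, dm.dmDisplayFrequency);

    /* Ask for a mode the monitor is known to accept. */
    dm.dmPelsWidth        = 1280;
    dm.dmPelsHeight       = 1024;
    dm.dmDisplayFrequency = 60;
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    rc = ChangeDisplaySettingsExA(device, &dm, NULL, CDS_UPDATEREGISTRY, NULL);
    printf("ChangeDisplaySettingsEx: %s (code %ld)\n",
           rc == DISP_CHANGE_SUCCESSFUL ? "ok" : "failed", rc);
    return 0;
}
```

If the mode change reports success and the monitor still drops to sleep, the adapter, cable, or port becomes the more likely suspect.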
