Hello,

I'm running dual monitors (have been for probably close to seven years now) and I recently got a new video card. My old card had dual DVI ports, but this new one has one DVI and one VGA. That alone kind of caught me off guard, but I figured I'd just buy an adapter and call it a day.

Turns out, though, the monitor plugged into the DVI port works fine. The other one flashes as if it's in sleep mode. However, when I unplug the cable (with the adapter in use), it shows a "no signal" error instead.

Am I correct in assuming the monitor is getting a signal and it's only a matter of some configuration? Or is it something else? What should I do?

Thanks in advance,
Chris

Edit: I have used Control Panel > Display Properties to enable the second monitor and extend the desktop to it, to no avail.


All 2 Replies

Have you set the adapter to a resolution the monitor supports, at a supported frequency? Like 1280 x 1024 @ 60 Hz?

My first instinct would be a configuration issue or bent pins on the VGA cable.
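If you'd rather rule out an unsupported mode from code than click through the Display Properties dialogs, a small Win32 program can ask the driver whether a known-good mode is valid on the second output. This is only a sketch: the device name "\\.\DISPLAY2" is an assumption (your secondary output may enumerate under a different name; the snippet in the next reply shows how to list them), and it only tests 1280 x 1024 @ 60 Hz without applying anything.

#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Assumed device name for the secondary output; check the
       enumeration output (or Display Properties) for the real one. */
    const char *dev = "\\\\.\\DISPLAY2";

    DEVMODEA dm;
    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* Start from the display's current mode, then override it. */
    if (!EnumDisplaySettingsA(dev, ENUM_CURRENT_SETTINGS, &dm)) {
        printf("Could not read current settings for %s\n", dev);
        return 1;
    }

    dm.dmPelsWidth        = 1280;
    dm.dmPelsHeight       = 1024;
    dm.dmDisplayFrequency = 60;
    dm.dmFields = DM_PELSWIDTH | DM_PELSHEIGHT | DM_DISPLAYFREQUENCY;

    /* CDS_TEST only validates the mode; nothing on screen changes. */
    LONG rc = ChangeDisplaySettingsExA(dev, &dm, NULL, CDS_TEST, NULL);
    if (rc == DISP_CHANGE_SUCCESSFUL)
        printf("1280x1024@60 is accepted by %s\n", dev);
    else
        printf("Mode rejected (code %ld); try another resolution/frequency\n", rc);
    return 0;
}

Build it with MSVC's cl or MinGW gcc and link against user32. Because of CDS_TEST it just asks the driver whether the mode is valid, which is enough to tell a configuration problem apart from a cable problem.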

Also check whether your new card will support the dual-screen setup.
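One quick way to check that from software is to enumerate the display devices Windows sees and look at their state flags; if the card only ever exposes one active output, the second monitor was never enabled at the driver level. A minimal sketch (nothing here is specific to any particular card):

#include <windows.h>
#include <stdio.h>

int main(void)
{
    DISPLAY_DEVICEA dd;
    DWORD i = 0;

    dd.cb = sizeof(dd);
    /* Walk every adapter/output Windows knows about. */
    while (EnumDisplayDevicesA(NULL, i, &dd, 0)) {
        printf("%s  (%s)%s%s\n",
               dd.DeviceName,
               dd.DeviceString,
               (dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP) ? "  [active]"  : "",
               (dd.StateFlags & DISPLAY_DEVICE_PRIMARY_DEVICE)      ? "  [primary]" : "");
        i++;
        dd.cb = sizeof(dd);
    }
    return 0;
}

If only one entry shows up as [active], the desktop was never extended at the driver level, which points back at configuration rather than at the adapter or the cable.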
