I have struck out with both the video card maker and the monitor maker, so I am turning here for help.

I am considering buying a ViewSonic monitor (the VA2226W) whose recommended resolution is 1680 x 1050. I have an older (and discontinued) video card, an ATI All-in-Wonder Radeon 7500 AGP (64 MB), which handles my current 1280 x 1024 monitor just fine.

When I look at my old card's specs at: http://ati.amd.com/products/radeon75...500/specs.html
I am hopeful, because the supported resolutions are listed only as a range, with a note that "other selections are available."

But obviously I'd rather know in advance that my card will work!
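
(In case it helps anyone answer: one way I gather you can see which modes the card's driver actually exposes is to enumerate them with Windows' EnumDisplaySettings call. Below is a minimal sketch, assuming Python with the pywin32 package installed; I haven't run it myself.)

    import win32api

    def list_display_modes():
        """Ask the display driver for every mode it currently exposes."""
        modes = set()
        i = 0
        while True:
            try:
                dm = win32api.EnumDisplaySettings(None, i)  # None = the primary display
            except win32api.error:
                break                                       # no more modes to report
            modes.add((dm.PelsWidth, dm.PelsHeight, dm.DisplayFrequency))
            i += 1
        return sorted(modes)

    if __name__ == "__main__":
        for width, height, hz in list_display_modes():
            print("%d x %d @ %d Hz" % (width, height, hz))

My understanding is that this list can be filtered by whatever monitor is currently attached, so it may only confirm what the card and driver offer today, not necessarily what they would offer with the new monitor plugged in.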

A related question: customer reviews of the monitor I'm interested in (the user guide is at: http://www.viewsonic.com/pdf/usergui...w-1_ug_eng.pdf) recommend using a DVI cable instead of the VGA cable, which is the only one supplied. Can anyone explain the difference, and why DVI, which costs extra, is apparently superior?

Thanks in advance.

All 3 Replies

DVI is digital.

LCD monitors are digital, and so is the output of a graphics card.

VGA is therefore lower quality, because the signal is converted to analog (and back to digital at the monitor) when there is no need.

Thanks.
Makes you wonder why "VGA" is even in the room.

Still looking for help on the compatibility question.

Because CRT screens work best with an analog signal, and VGA dates from the CRT era.
