
I have struck out getting help from the video card maker and the monitor maker, so I am turning here for help.

I am considering buying a ViewSonic monitor (the VA2226w) whose recommended resolution is 1680 x 1050. I have an older (and discontinued) video card, an ATI All-In-Wonder Radeon 7500 AGP (64 MB), which handles my current 1280x1024 monitor just fine.

When I look at my old card's specs at http://ati.amd.com/products/radeon75...500/specs.html, I am hopeful, because the supported resolutions are listed only as a range, with a note that "other selections are available."

But obviously I'd rather know in advance that my card will work!
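
In the meantime, one way to check in advance which modes the driver is actually willing to offer is to ask it directly. Below is a minimal sketch, assuming Windows (the 7500's usual home) and plain C against the standard Win32 EnumDisplaySettings call; if 1680 x 1050 shows up in the output, the driver at least knows how to request that mode.

/* List every display mode the current driver exposes (Windows).
 * Build with a Win32 toolchain, e.g.:  cl listmodes.c user32.lib  */
#include <windows.h>
#include <stdio.h>

int main(void)
{
    DEVMODE dm;
    DWORD i = 0;

    ZeroMemory(&dm, sizeof(dm));
    dm.dmSize = sizeof(dm);

    /* NULL = the primary display device; keep incrementing the mode
     * index until EnumDisplaySettings runs out of modes. */
    while (EnumDisplaySettings(NULL, i++, &dm)) {
        printf("%lux%lu @ %lu Hz, %lu bpp\n",
               dm.dmPelsWidth, dm.dmPelsHeight,
               dm.dmDisplayFrequency, dm.dmBitsPerPel);
    }
    return 0;
}

Of course, the driver offering a mode and the card driving it cleanly are not quite the same thing, which is why I'd still welcome first-hand confirmation.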

A related question: customer reviews of the monitor I'm interested in (user guide at http://www.viewsonic.com/pdf/usergui...w-1_ug_eng.pdf) recommend using a DVI cable instead of the VGA cable, which is the only one supplied. Can anyone explain the difference, and why DVI, which costs extra, is apparently superior?

Thanks in advance.


DVI is digital.

LCD monitors are digital, and graphics cards are too.

VGA is therefore worse quality, because the signal gets converted to analog (and back to digital in the monitor) when there is no need.
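
On the compatibility side, a rough sanity check: single-link DVI tops out at a 165 MHz pixel clock, and 1680 x 1050 at 60 Hz comes in well under that. A back-of-envelope sketch in C (the ~20% blanking overhead is an assumed round figure; exact CVT/GTF timings vary a little):

/* Back-of-envelope pixel-clock check against the single-link DVI limit.
 * The 20% blanking overhead is an assumption, not an exact timing. */
#include <stdio.h>

int main(void)
{
    const double width = 1680.0, height = 1050.0, refresh_hz = 60.0;
    const double blanking_overhead = 1.20;    /* assumed ~20% extra */
    const double single_link_dvi_mhz = 165.0; /* single-link DVI ceiling */

    double pixel_clock_mhz =
        width * height * refresh_hz * blanking_overhead / 1e6;

    printf("Estimated pixel clock: %.1f MHz (limit %.0f MHz) -> %s\n",
           pixel_clock_mhz, single_link_dvi_mhz,
           pixel_clock_mhz <= single_link_dvi_mhz ? "fits" : "too fast");
    return 0;
}

That works out to roughly 127 MHz, so the DVI link itself has plenty of headroom; the open question is only whether the card's driver exposes the mode.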


Thanks.
Makes you wonder why "VGA" is even in the room.

Still looking for help on the compatibility question.
