I just got an ATI Radeon 9200 SE graphics card and I want to know a couple of things:
1. When you place it in the slot in the computer, is that all there is to it?
2. The cable that came with it doesn't fit the connector on the card itself. What's going on?

I'd appreciate the help, guys. Thanks.

If you purchased your ATI Radeon graphics card new, you should have received a "User's Guide" that tells you everything you need to know about installing and using the card. If you didn't get the guide, go here:
https://support.ati.com/ics/support/default.asp?deptID=894

This guide will tell you how to install the card, the software, the drivers, and how to connect it all to your monitor.
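
Once the card is seated and the drivers from the CD are installed, Device Manager is the usual place to confirm Windows actually sees it. Purely as a rough sanity check, here's a minimal Python sketch (assuming a Windows machine with the built-in WMIC tool and a reasonably recent Python installed; none of this comes from ATI's guide) that lists the display adapters Windows reports:

[code]
import subprocess

# List the display adapters Windows reports, plus the driver version in use.
# WMIC ships with Windows XP and later; this just wraps that command.
result = subprocess.run(
    ["wmic", "path", "Win32_VideoController", "get", "Name,DriverVersion"],
    capture_output=True,
    text=True,
    check=True,
)

print(result.stdout)

# If the Radeon shows up here with a driver version, the card is seated and
# the ATI driver from the CD installed correctly.
if "radeon" in result.stdout.lower():
    print("Radeon adapter detected.")
else:
    print("No Radeon adapter reported - recheck the card seating and the driver install.")
[/code]

If the Radeon doesn't show up, reseat the card and reinstall the driver before worrying about the cables.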

The guide just tells me to remove the slot cover and insert the card into the AGP slot, but I noticed on the metal bracket of the card that there's a cutout labeled CRT (I guess for the monitor connection), and there's no CRT outlet in the box. What's this all about?

Evidently, this particular model of graphics card ships without any cables, per the manufacturer's site. All you get is a book, a CD, and the card. See: http://shop.ati.com/product.asp?sku=2335507&section_id=9

The package contents listed there are:

  • Documentation
  • Software
  • Hardware & Cables: RADEON® 9200 128MB AGP Graphics Card

The card has two connectors on the edge. One is a 15-pin CRT connection and the other is S-Video. The monitor usually ships with a cable or two to match its inputs. You'll most likely have to use your old cable, or purchase one that's compatible; I don't think you'll find one at ATI's site. Older CRT monitors have the video input cable hardwired into the monitor, with a 15-pin connector on the other end. More modern monitors that have dual inputs (analog and digital) ship with both interconnecting cables. If the cable isn't hardwired to the monitor and you don't have one, you'll have to look to the monitor manufacturer for it.

Yeah... my monitor is a Samsung SyncMaster 753DF; I don't know if it's a CRT or whatever. But the issue with the cable is that the connector on the card has a 24-pin input plus 4 more pins beside it (on the same connector) in sort of a cross shape. I think it's just stupid to ship a cable with it that doesn't fit.

OK. Let's define what connectors your card has.

[img]http://www.marketworks.com/hi/57/57246/ati_9200_2.jpg[/img]
The blue colored 15 pin connector is used to hook the card to an analog monitor (typically a CRT).

The round black connector (in the middle) is used to route "S Video" to a TV monitor.

The white multi-pin connector is your DVI port. This is the output to a digital monitor such as an LCD flat-screen monitor.

If you purchased this card new in the box, you most likely did get one cable with it - the S-Video cable for use with a TV, which fits the middle connector on the back of your card. But, again, you have to use the cable that comes with your computer monitor.

OK, that's pretty much what the front of the card looks like, except there IS NO blue connector; there's just the far-left one, and no plug I have fits it.

If your card has ONLY the white multi-pin connector on it, then it has only a digital output interface and cannot be used with an analog CRT monitor. If what you have is an analog CRT (the video cable coming out of your CRT monitor has a 15 pin plug on the end), then this card will be useless to you. You should either take it back for exchange/refund, or buy a digital monitor that you can use it on. Sorry!!!

That is not correct. All t-hill666 needs is a DVI-to-VGA adapter. Most video cards that have DVI-I plugs come with one, but unfortunately some don't.

I stand corrected - my mistake. I did some more homework after reading chrisbliss18's comments above. He is correct. ATI states that an adapter may be used on the DVI output to connect to an analog monitor. But they go on to say that the use of an adapter or extension cable may cause display artifacts or horizontal lines in some circumstances.

As far as my thinking goes, the typical (blue) RGB monitor jack is what I refer to as VGA, though these days it has progressed through SVGA, XGA, and so on. Then there's standard S-Video (not SVHS-based, by the way, a pet peeve of mine; it's short for separated video, Y/C), and the newer DVI-I output as mentioned. Mine is a Radeon 9250, bought to support DVI; as I understand it, these are budget cards meant to be lower end in performance. At $200 you're still in the typical budget-card range, sadly.
It's a lot of money I'd hoped would already be included with the newer PCs. Bummer.
