I got a USB video card to give my development machine another monitor without installing a new internal graphics card. I started getting a headache after thinking about how these devices work...

The product (as advertised) uses USB 2.0 and can output 2048x1152 resolution with 32-bit color at 60Hz. A bit of math to figure out the required throughput...

2048 x 1152 = 2,359,296 pixels
2,359,296 pixels x 32 bits (4 bytes) per pixel = 9,437,184 bytes/frame
60Hz x 9,437,184 bytes/frame = 566,231,040 bytes per second = 540 MB/second
USB 2.0 throughput is rated at a theoretical maximum of 40 MB/second.
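
The arithmetic above can be sanity-checked in a few lines (my own sketch; the constants are just the figures from the calculation):

```python
# Back-of-the-envelope check of the numbers above.
WIDTH, HEIGHT = 2048, 1152
BYTES_PER_PIXEL = 4                         # 32-bit color
REFRESH_HZ = 60
USB2_MAX_BYTES_PER_SEC = 40 * 1024 * 1024   # ~40 MB/s theoretical ceiling

frame_bytes = WIDTH * HEIGHT * BYTES_PER_PIXEL
stream_bytes_per_sec = frame_bytes * REFRESH_HZ

print(frame_bytes)              # 9437184 bytes per frame
print(stream_bytes_per_sec)     # 566231040 bytes/s (~540 MB/s)

# How many full, uncompressed frames fit in the USB 2.0 budget per second:
print(USB2_MAX_BYTES_PER_SEC / frame_bytes)   # ~4.44 frames/s
```

That last number is where the "about 5Hz" figure below comes from: pushing every pixel of every frame raw over USB 2.0 caps out around four and a half frames per second.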

Anyone else see the lapse in logic here? I thought maybe the packaging lied to me and it's not actually 60Hz, but to get even close to the 40 MB/second max of USB 2.0, the refresh rate would have to be only about 5Hz, which would be absolutely horrible - and that does not seem to be the case. I also thought maybe it uses some type of compression, but that would essentially make high-definition images look like lossy JPEGs, and that is not the case either - the images look very clear.

My inner nerd is killing me here now - this device seems to have some magical properties that are not explainable. Google hasn't given me any explanations, maybe one of you can?


I know not what they do. I know what I could do.

First, the 60Hz refresh rate is the CARD's OUTPUT - the device has its own memory. The data transmitted over USB is stored in that memory. This is what every video card has done since the days of the Apple I and the IBM CGA, Hercules, etc.: the memory is read out at 60Hz to produce the video signal.
As long as the CONTENTS of the screen do not change, no data passes over the USB.
This is how VGA could drive a 640x480 display at a time when PC memory could not physically be written at more than 5 MB per second.

Second, you do not transfer the whole screen's contents over the USB - only the changes. This is how every remote-control program operates.
I used to run remote-control programs, viewing the actual VGA screen over a 28-kilobaud telephone modem (that is about 2,800 bytes per second, at best).
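
To make the "only the changes" idea concrete, here is a toy sketch (my own illustration, not the vendor's actual protocol): split each frame into tiles and transmit only the tiles whose bytes differ from the previous frame. The tile size is a made-up parameter.

```python
TILE = 64  # hypothetical tile size in pixels

def changed_tiles(prev, curr, width, height, bpp=4):
    """Return (x, y) origins of tiles whose pixels differ between two frames.

    Frames are flat, row-major byte strings with bpp bytes per pixel."""
    row_bytes = width * bpp
    changed = []
    for ty in range(0, height, TILE):
        for tx in range(0, width, TILE):
            for y in range(ty, min(ty + TILE, height)):
                start = y * row_bytes + tx * bpp
                end = y * row_bytes + min(tx + TILE, width) * bpp
                if prev[start:end] != curr[start:end]:
                    changed.append((tx, ty))
                    break  # tile is dirty; no need to scan its remaining rows
    return changed

# Example: in a 128x128 frame, changing a single pixel dirties one 64x64
# tile, so only 64*64*4 = 16 KB would cross the USB instead of the full 64 KB.
prev = bytes(128 * 128 * 4)
curr = bytearray(prev)
curr[0] = 0xFF                                    # flip one byte of pixel (0, 0)
print(changed_tiles(prev, bytes(curr), 128, 128))  # [(0, 0)]
```

A static desktop therefore costs almost nothing per frame, which is why the bandwidth math in the question only bites in the worst case where everything changes at once.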

Hope this helps you out.

Thanks for your input, but I am not sure this entirely answers my question.

I can play high-definition video on the display, where the image changes constantly. Granted, not every pixel changes each frame - maybe I will try writing a program that fills the screen with random pixels and see what kind of refresh rate I get.

That will probably prove my point.
Benchmark your 'write to screen' routine first. Depending on the exact Windows routines used, it could be slower than your USB 2.0 link.
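
A rough sketch of that worst-case test (my own, not tied to any particular driver): time how fast full frames of random pixels can even be *generated*, then compare against the uncompressed USB 2.0 budget. Actually drawing the frames through the OS would add further overhead on top of this.

```python
import os
import time

WIDTH, HEIGHT, BPP = 2048, 1152, 4
FRAME_BYTES = WIDTH * HEIGHT * BPP
USB2_BYTES_PER_SEC = 40 * 1024 * 1024   # ~40 MB/s theoretical ceiling

frames = 5
start = time.perf_counter()
for _ in range(frames):
    frame = os.urandom(FRAME_BYTES)     # every pixel "changes" every frame
elapsed = time.perf_counter() - start

gen_fps = frames / elapsed
usb_fps = USB2_BYTES_PER_SEC / FRAME_BYTES

print(f"random-frame generation: {gen_fps:.1f} fps")
print(f"USB 2.0 budget, uncompressed: {usb_fps:.1f} fps")   # ~4.4 fps
```

If the measured refresh rate with random frames drops toward that ~4.4 fps figure, the delta-transfer explanation holds; if it stays near 60, something else (compression, or a bottleneck in the drawing path itself) is at work.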
