If you have a compressed file out there on the server and you want to get it, you actually risk making the download last *longer* by using a download accelerator. Unless you want to hack your IP stack and every router between you and the originating server to adjust the MTU size (NOT LIKELY), there is little you can do to improve your speed.
Back in the days of dialup, using Kermit, you could adjust the size of the packets for less overhead (sliding-windows Kermit). On today's networks, that just isn't an option.
Files such as .zip or .gz or .sit or .Z are already compressed in software, and unless you have a quick earth-shaking compression algorithm, you will not speed them up any further.
I thought the same about download accelerators at first.
I've got prozilla installed now though, and it's excellent. What it does is open 4 connections (by default; that can be changed) to fetch a file. It's extremely fast. Much better than wget imho.
(to be honest, I don't have statistics on this, but it's an impression I get. I remember downloading a Knoppix ISO (~700 MB) in 10 minutes or so)
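The trick behind accelerators like prozilla is roughly this: ask the server for the file size, then fetch disjoint byte ranges over several parallel HTTP connections using `Range` headers. Here's a minimal sketch of the range-splitting step (the helper name and numbers are my own illustration, not prozilla's actual code):

```python
def split_ranges(content_length, connections=4):
    """Split a file of known size into one (start, end) byte range per
    connection, inclusive on both ends, for HTTP "Range: bytes=start-end"
    requests. The last connection absorbs any remainder."""
    chunk = content_length // connections
    ranges = []
    for i in range(connections):
        start = i * chunk
        end = content_length - 1 if i == connections - 1 else start + chunk - 1
        ranges.append((start, end))
    return ranges

# Example: a ~700 MB ISO split across 4 connections
for start, end in split_ranges(700 * 1024 * 1024):
    print(f"Range: bytes={start}-{end}")
```

Each range is then downloaded concurrently and the pieces are written into the right offsets of the output file. Note this only helps when the server supports range requests (`Accept-Ranges: bytes`) and when the bottleneck is a per-connection limit rather than your own link.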