
Hi, I would just like to know the minimum internet speed given latency and bandwidth.

My laptop says it has a 215 ms (milliseconds) latency and an actual measured 0.98 Mbps (Megabits per second) download speed over a connection to Imperial College, ic.ac.uk.

What is an estimate for the expected minimum time to download a 1.3 MB (megabyte) file from Imperial College?

2 Contributors, 11 Replies, 12 Views, 5 Years Discussion Span, Last Post by biblemdkeid

The minimum speed is 0. The maximum is determined by the Bandwidth-Delay Product. There are many factors such as contention, medium, distance, and loss that factor into the equation so it is not an easy answer to provide in general.


Hi, thanks for the quick reply. In fact, this is one of the questions I have to solve, but the marker expects me to count 1 byte as 10 bits and give the answer to 2 decimal places. So it seems there is an answer to this question. Does that make a difference, or is it still 0?


The entire thing sounds suspect to me. Why on earth would someone ask you to count one byte as ten bits? A byte is exactly 8 bits.

Now, using your values you have an absolute maximum BDP of 26,337.5 [8-bit] bytes. Let's break down how that can be allocated. Assume that every frame you send is at MTU (usually 1500 bytes) - that means at any one point you can have no more than 17.56 frames in flight.

This is all absent any sharing of resources. Let's assume that another person tries to share these resources with you. You have to give up some of your 17.56 frames to the other person. Assuming an entirely fair strategy, you might give up exactly half. Now you are expecting to get 8.78 frames. Continue this calculation and you arrive at something like:

NFRAMES = 17.56 / NFLOWS

With 18 or more flows all trying to consume the resources you can expect less than 1 frame every 215 ms (ignoring nuances of TCP algorithms and frames smaller than 1500) and this is generally enough to cause any flow to timeout. Ultimately, the more consumers you have the less resources you can use and you can always add enough consumers to deplete resources.
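If it helps to see the arithmetic, here is the BDP and frames-in-flight calculation above as a quick Python sketch. The latency and bandwidth are the values from this thread; the 1500-byte MTU is the usual Ethernet assumption mentioned above:

```python
# Bandwidth-delay product for the figures in this thread.
latency_ms = 215          # measured round-trip latency
bandwidth_bps = 0.98e6    # 0.98 Mbps, taking 1 Mbps = 1,000,000 bps
mtu_bytes = 1500          # typical Ethernet MTU (an assumption)

bdp_bits = bandwidth_bps * latency_ms / 1000   # bits "in flight" at once
bdp_bytes = bdp_bits / 8                       # using 8-bit bytes
frames_in_flight = bdp_bytes / mtu_bytes

print(bdp_bytes)                    # 26337.5
print(round(frames_in_flight, 2))   # 17.56
```

Dividing frames_in_flight by the number of fair-sharing flows gives the NFRAMES figure above.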

As I mentioned, these numbers are entirely different when you add in additional packet loss, queue backup, TCP algorithms, UDP and so on. There is no direct answer to your question. If someone tells you there is you need to ask for more limiting information to provide an accurate answer.


To be frank, I cannot understand the solution you have provided above. Are you saying that the minimum download time is 0, or that you don't know?


My apologies. I thought you were asking about the minimum bandwidth you could achieve.

I would just like to know the minimum internet speed given

I neglected to notice that you would like to estimate the time to download a file. Well, I'm still unclear on the 8 bit vs. 10 bit situation but I will leave that for you to figure out.

It will take exactly 215 ms to fill the link between the two endpoints. You get data at a rate of 0.98 Mbps (980 bits per millisecond, assuming 1 Kbps = 1000 bps). You want to transfer 10,905,190.4 bits (1.3 MB using 8-bit bytes and 1 KB = 1024 B).

Now it is a simple equation:

10,905,190.4 / 980 = 11,127.75 milliseconds
11,127.75 + 215 = 11,342.75  # remember the latency
11.34 seconds  # convert ms to seconds

This, again, ignores startup time for the connection and other details. It serves as a simple estimate, however.
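For what it's worth, the same estimate in a few lines of Python, under the same assumptions (8-bit bytes, 1 KB = 1024 B for the file size, 1 Kbps = 1000 bps for the link rate):

```python
# Minimum-time estimate for a 1.3 MB file over a 0.98 Mbps link.
file_bits = 1.3 * 1024 * 1024 * 8   # 1.3 MB -> 10,905,190.4 bits
rate_bits_per_ms = 980              # 0.98 Mbps -> 980 bits per millisecond
latency_ms = 215                    # time to fill the pipe

total_ms = file_bits / rate_bits_per_ms + latency_ms
print(round(total_ms / 1000, 2))    # 11.34 seconds
```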

Edited by L7Sqr


Hi, I am sorry, but I don't understand why you are dividing 10,905,190.4 by 980, since I don't see why you change 0.98 Mbps into 980 bits per millisecond (shouldn't the steps go from Mbps to Kbps and then to bps?).


By the way, I have used this approach (assuming 1 byte = 10 bits):

0.98 Mbps divided by 8 bits per byte is 0.098 MBps. 1.3 MB divided by 0.098 MBps is 13.265306... seconds. Add 0.215 seconds to that for 13.48 seconds (2 decimal places).

How about this?

Edited by biblemdkeid


You start by calling a byte 10 bits and then calculate values using 8 bits per byte. You should standardize on an approach.

I'm not sure where you get lost in my derivation - I spelled out each step. I tried to change values into units that allowed for the most direct calculation. Since we know how much data we have (1.3 MB) and how fast we can consume it (0.98 Mbps), it is nice to have a direct division to a known unit (milliseconds in our case). It avoids the conversion between 1024 and 1000 when dealing with larger values (since MB and Mbps use different conversions).

Here is the entire conversion at each step with units.

(1) Calculate how many Bytes need to be consumed

1.3 MBytes = 1.3 * 1024Kbytes = 1.3 * 1024 * 1024Bytes = 1363148.8 Bytes

(2) Convert Bytes to bits

8-bit Byte
1363148.8 Bytes = 1363148.8 * 8 = 10905190.4 bits

10-bit Byte
1363148.8 Bytes = 1363148.8 * 10 = 13631488 bits

(3) Calculate how fast we can consume bits

0.98 Mbps : Divide by 1000 to get Megabits per millisecond
0.00098 Mb/ms : Multiply by 1000 to get Kb/ms
0.98 Kb/ms : Multiply by 1000 to get b/ms
980 b/ms <- total number of bits we can consume every millisecond

(4) Determine how long it takes to consume our data

Simple formula is total_bits / bits_per_millisecond = time_in_millisecond

Take result from (2) and divide by result from (3)
(10905190.4 bits) / (980 bits/ms) = 11,127.75 ms (for 8-bit byte)
(13631488 bits) / (980 bits/ms) = 13,909.68 ms (for 10-bit byte)

(5) Add in latency to fill the pipe

11,127.75 ms + 215 ms = 11,342.75 ms (for 8-bit byte)
13,909.68 ms + 215 ms = 14,124.68 ms (for 10-bit byte)

(6) Convert to seconds

11,342.75 ms / 1000 = 11.34 seconds (for 8-bit byte)
14,124.68 ms / 1000 = 14.12 seconds (for 10-bit byte)
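Steps (1) through (6) can be folded into one small Python function, with the byte size left as a parameter so both assumptions are easy to compare side by side:

```python
# The six steps above as one function; bits_per_byte is the assumption
# the assignment leaves open (8 for a real byte, 10 for the marker's).
def download_seconds(size_mb, mbps, latency_ms, bits_per_byte):
    total_bits = size_mb * 1024 * 1024 * bits_per_byte  # steps (1)-(2)
    bits_per_ms = mbps * 1000                           # step (3)
    transfer_ms = total_bits / bits_per_ms              # step (4)
    return (transfer_ms + latency_ms) / 1000            # steps (5)-(6)

print(round(download_seconds(1.3, 0.98, 215, 8), 2))    # 11.34
print(round(download_seconds(1.3, 0.98, 215, 10), 2))   # 14.12
```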


Thanks for your reply. Another question: okay, this is the expected minimum time to download the 1.3 MB file. However, is it possible to find out the expected maximum time to download it?


My answer above is for the theoretical minimum time to transfer. In reality that will never be realized, so I would not call it an expected minimum. The expected minimum would be higher than that.

The answer to what is the maximum is: it depends.

What is the protocol?
What are the protocol timeouts?
How many hops from source to sink?
What are the characteristics of each hop?
How many of those hops are shared?
When shared - by how many?
