I'm confused about how to calculate the running time of an algorithm. I've searched online, but I can't seem to understand any of the explanations, hence my question: how do you calculate the running time of an algorithm? I'd love an explanation with any example, or you can use the one given below. Thank you. Any help at all is very much appreciated.

The Max Average problem is defined as follows:

Input: array A[1..n] containing signed integers A[1]...A[n]

Output: max Aver(i, j) over all i, j with 1 ≤ i ≤ j ≤ n, where Aver(i, j) = sum(i, j)/(j - i + 1) is an average and sum(i, j) = A[i] + ... + A[j] is the sum of all values from index i to index j.

For example, if A[1..8] = {1, -2, 5, -1, -3, 3, -2, 7}, then the max average is 9/6 for i = 3 and j = 8. That is, there are no other two indices i, j with 1 ≤ i ≤ j ≤ 8 such that Aver(i, j) > 9/6.
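To make the definitions concrete, here is a small Python check of Aver(3, 8) for that array (this helper is mine, just for illustration; the problem uses 1-based indices, so they are shifted by one for Python's 0-based lists):

```python
def aver(A, i, j):
    """Aver(i, j) = sum(i, j) / (j - i + 1), with 1-based i and j."""
    s = sum(A[i - 1:j])        # sum(i, j) = A[i] + ... + A[j]
    return s / (j - i + 1)     # divide by the number of elements

A = [1, -2, 5, -1, -3, 3, -2, 7]
# Aver(3, 8): sum(3, 8) = 5 - 1 - 3 + 3 - 2 + 7 = 9, over 6 elements
print(aver(A, 3, 8))           # -> 1.5, i.e. 9/6
```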

```
// sum[j] holds the maximum of sum(i, j) over all i ≤ j,
// i.e. the best sum of a subarray ending at A[j]
create array sum[1..n]
create array len[1..n]              // length of that best subarray
sum[1] := A[1];  len[1] := 1        // base case: the subarray {A[1]}
maxsum := A[1];  maxAver := A[1]
for j := 2 to n                     // iterate over right endpoints j
    if sum[j-1] + A[j] > A[j]       // extend the run ending at j-1 ...
        sum[j] := sum[j-1] + A[j];  len[j] := len[j-1] + 1
    else                            // ... or start a new run at j
        sum[j] := A[j];  len[j] := 1
    if sum[j] > maxsum              // remember the best sum seen so far
        maxsum := sum[j];  maxAver := maxsum / len[j]
return maxAver
```
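One way to calculate the running time is to count the basic operations an algorithm performs as a function of n. Here is a sketch of that in Python (0-based lists; the function names and the `steps` counters are my additions, purely for illustration). A nested-loop brute force over all pairs (i, j) does one addition per pair, i.e. n(n+1)/2 steps, which is O(n²); the single loop in the pseudocode above does a constant amount of work on each of its n - 1 iterations, which is O(n):

```python
def brute_force_max_sum(A):
    """Try every pair (i, j); counts the basic steps performed."""
    n = len(A)
    best = A[0]
    steps = 0
    for i in range(n):
        s = 0
        for j in range(i, n):
            s += A[j]              # one addition per (i, j) pair
            steps += 1
            if s > best:
                best = s
    return best, steps             # steps = n(n+1)/2, so O(n^2)

def kadane_max_aver(A):
    """The single-pass pseudocode above; counts loop iterations."""
    best_sum = A[0]
    best_aver = float(A[0])
    cur_sum = A[0]                 # sum[j]: best sum ending at A[j]
    cur_len = 1                    # length of that subarray
    steps = 0
    for j in range(1, len(A)):
        steps += 1                 # constant work per iteration
        if cur_sum + A[j] > A[j]:  # extend the previous run ...
            cur_sum += A[j]
            cur_len += 1
        else:                      # ... or start a new run at j
            cur_sum = A[j]
            cur_len = 1
        if cur_sum > best_sum:     # remember the best sum so far
            best_sum = cur_sum
            best_aver = cur_sum / cur_len
    return best_sum, best_aver, steps  # steps = n - 1, so O(n)
```

On the example array (n = 8), the brute force performs 8·9/2 = 36 steps while the single-pass version performs 7; doubling n roughly quadruples the first count but only doubles the second, which is exactly the difference between O(n²) and O(n).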

Thank you.