Hey everyone, I'm doing a practice problem and I'm stuck. The question is:

"Given an array a of n numbers (say doubles), consider the problem of computing
the average of the first i numbers, for i ranging from 0 to n-1. That is, compute
the array b of length n, where
b[i] = (a[0] + a[1] + ... + a[i]) / (i + 1)
for 0 <= i < n.
Write a Java program to solve this problem. Your program can generate its own
array, for example a[i] = i + 1. See how large an array you can process in less than
5 seconds. (For full credit it should be at least one million.)
Characterize the time complexity of your algorithm."

Here is my attempt at solving it:

```
public class largeArray{

public static void main(String[] args){

double[] aa = new double;
System.out.println(CalculateAvg(aa));

}
public static double CalculateAvg(double[] a){
int i =0;
double[] array = new double[i];
a[i] = i + 1;

for(int k=0; k<array.length; i++){
double total = a[k]+a[k];
double sum = ((total)/a[i]);
}

return a[i];
}
}
```

I'm just lost and any help would be appreciated. Thanks.


## All 2 Replies

"Just lost" .. what exactly is it you're stuck on? Can you be a bit more specific?

Forget Java for a moment. Get a sheet of paper and a pen and work through a small example (e.g. an array of length 3) by hand. That will get the algorithm clear in your head, and you will find it much easier to convert it into Java code.
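To show roughly where that pen-and-paper exercise should lead, here is one possible sketch (class and method names like `LargeArray` and `prefixAverages` are mine, not from the assignment). The key idea is to keep a running sum, so each `b[i]` costs O(1) and the whole pass is O(n), which comfortably handles a million elements well inside 5 seconds:

```
// One possible O(n) sketch: keep a running sum so each average costs O(1).
public class LargeArray {

    // b[i] = (a[0] + a[1] + ... + a[i]) / (i + 1) for 0 <= i < a.length
    static double[] prefixAverages(double[] a) {
        double[] b = new double[a.length];
        double sum = 0.0;
        for (int i = 0; i < a.length; i++) {
            sum += a[i];          // running sum of the first i + 1 elements
            b[i] = sum / (i + 1); // average of the first i + 1 elements
        }
        return b;
    }

    public static void main(String[] args) {
        int n = 1_000_000;
        double[] a = new double[n];
        for (int i = 0; i < n; i++) {
            a[i] = i + 1;         // the test data suggested in the problem
        }
        long start = System.nanoTime();
        double[] b = prefixAverages(a);
        long millis = (System.nanoTime() - start) / 1_000_000;
        System.out.println("b[n-1] = " + b[n - 1] + " (" + millis + " ms)");
    }
}
```

Note this is one linear pass; recomputing the sum from scratch for every `i` would instead give an O(n^2) algorithm, which is the usual point of the "characterize the time complexity" part of the exercise.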
