Hey everyone, I'm doing a practice problem and I'm stuck. The question is:

"Given an array a of n numbers (say doubles), consider the problem of computing
the average of the first i numbers, for i ranging from 0 to n-1. That is, compute
the array b of length n, where
b[i] = (a[0] + a[1] + ... + a[i]) / (i + 1)
for 0 <= i < n.
Write a Java program to solve this problem. Your program can generate its own
array, for example a[i] = i + 1. See how large an array you can process in less than
5 seconds. (For full credit it should be at least one million.)
Characterize the time complexity of your algorithm."

Here is my attempt at solving it:

public class largeArray{

   public static void main(String[] args){

      double[] aa = new double[1000000];
      System.out.println(CalculateAvg(aa));
   }
   public static double CalculateAvg(double[] a){
      int i =0;
      double[] array = new double[i];
      a[i] = i + 1;

      for(int k=0; k<array.length; i++){
         double total = a[k]+a[k];
         double sum = ((total)/a[i]);
      }

   return a[i];
  }
}

I'm just lost and any help would be appreciated. Thanks.


Forget Java for a moment. Get a sheet of paper and a pen and work through a simple example (eg array length 3) by hand. That will get the algorithm clear in your head, and you will find it easier to convert that into Java code.
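
Once the algorithm is clear on paper, a minimal sketch of the running-sum idea might look something like this (the class and method names are mine, not from the assignment): keep a cumulative total so each b[i] is an O(1) step, which makes the whole thing O(n) instead of the O(n^2) you get by re-summing the prefix for every i.

public class PrefixAverages {

    // Returns b where b[i] is the average of a[0..i].
    public static double[] prefixAverages(double[] a) {
        double[] b = new double[a.length];
        double sum = 0.0;                 // running sum of a[0..i]
        for (int i = 0; i < a.length; i++) {
            sum += a[i];
            b[i] = sum / (i + 1);         // average of the first i + 1 elements
        }
        return b;
    }

    public static void main(String[] args) {
        int n = 1_000_000;
        double[] a = new double[n];
        for (int i = 0; i < n; i++) {
            a[i] = i + 1;                 // sample data, as the problem suggests
        }

        long start = System.currentTimeMillis();
        double[] b = prefixAverages(a);
        long elapsed = System.currentTimeMillis() - start;

        System.out.println("Last average: " + b[n - 1]);
        System.out.println("Elapsed ms:   " + elapsed);
    }
}

With the running sum, a million elements should finish in well under 5 seconds; the timing printout lets you check that for the complexity part of the question.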
