So I have a problem... according to the specification given by my lab TA, we must
"write a Microsoft Visual Studio console application that will calculate the
following four operations (Sum, Average, Max, and Min) over the elements of a two-dimensional array (matrix) of integers.
All of these operations must be performed recursively. (See the design hints on the following
pages. No credit will be given for a simple sequential algorithm!) The program must
work with a matrix of any size. The size will be given as input."
We are also given that
"The class must have the following methods:
1. public int Sum(int[,] matrix, int rowNum, int colNum)
2. public int Average(int[,] matrix, int rowNum, int colNum)
3. public int Max(int[,] matrix, int rowNum, int colNum)
4. public int Min(int[,] matrix, int rowNum, int colNum)"
The method Main should be in its own class. Main should not be in class MatrixCalculator. In
Main, you will perform the following:
1. Read from console the number of rows for the matrix.
2. Read from console the number of columns for the matrix.
3. Create the matrix and read each of its numbers from the console to initialize it. (For
example, if the number of rows is 3 and the number of columns is 3, you need to read 9
numbers.)
4. After reading the numbers display the matrix and a menu with the four operations and
a fifth option “Exit”.
5. When an option is selected, perform the operation and show the result. (Use a switch
statement to invoke the appropriate method).
6. After performing an operation, display the menu again and wait for the user to select another option.
7. The program terminates when the option “Exit” is selected.
The thing I have a problem with is how on earth to recursively perform those four operations. Even the TA admits this is a contrived way to do it, but for some reason it's still our assignment. Being an inexperienced programmer, I find recursion horribly confusing, and even more so with a 2D array involved. This is what I've tried so far, but I can't really seem to get it working:
//Sums the elements in an array, starting the walk at (0, 0)
public int Sum(int[,] matrix, int rowNum, int colNum)
{
    if (rowNum == matrix.GetLength(0))   // past the last row: nothing left to add
        return 0;
    if (colNum == matrix.GetLength(1))   // past the last column: move to the next row
        return Sum(matrix, rowNum + 1, 0);
    return matrix[rowNum, colNum] + Sum(matrix, rowNum, colNum + 1);
}
//Calculates the average of the elements in an array
public int Average(int[,] matrix, int rowNum, int colNum) => throw new NotImplementedException(); // TODO
//Finds the maximum-valued element in the array
public int Max(int[,] matrix, int rowNum, int colNum) => throw new NotImplementedException(); // TODO
//Finds the minimum-valued element in the array
public int Min(int[,] matrix, int rowNum, int colNum) => throw new NotImplementedException(); // TODO
and for my other class:
static void Main(string[] args)
{
    MatrixCalculator calc = new MatrixCalculator();
    // ... read the sizes, fill the matrix, show the menu, dispatch with a switch ...
}
I'm completely stuck. Our lecture professor didn't explain this stuff very well.
Obviously, this doesn't conform to the requirements of your assignment and isn't meant to. But the logic of the recursion in this example is to start at the upper bounds of the rows and columns and work backwards until you reach 0,0 in the array, sort of like using the backspace key on a keyboard.
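The original example isn't shown here, but a minimal sketch of that backwards walk, applied to Sum, might look something like this (my guess at the shape of it, not the original code; the starting call is made at the upper bounds):

```csharp
class MatrixCalculator
{
    // Walk backwards from the upper bounds toward (0, 0), like backspacing:
    // take the current cell, then recurse on the cell before it.
    public int Sum(int[,] matrix, int rowNum, int colNum)
    {
        if (rowNum == 0 && colNum == 0)   // reached the first cell: stop
            return matrix[0, 0];
        if (colNum == 0)                  // start of a row: jump to the end of the previous row
            return matrix[rowNum, 0] + Sum(matrix, rowNum - 1, matrix.GetLength(1) - 1);
        return matrix[rowNum, colNum] + Sum(matrix, rowNum, colNum - 1);
    }
}
```

The initial call would then be `Sum(matrix, matrix.GetLength(0) - 1, matrix.GetLength(1) - 1)`.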
Using what I created for the Sum function as a base, I was able to quickly arrive at a Max function and then the Min function was literally a copy/paste find/replace. The Average function is eluding me for the moment. I don't think it would fly to use the recursive Sum function and simply divide by the number of elements in the 2D array, but I haven't yet arrived at an appropriate algorithm. Maybe tomorrow.
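A hypothetical Max built on that same backwards walk could look like the following (and the Min version really is the same thing with the comparison flipped):

```csharp
class MatrixCalculator
{
    // Same backwards walk as Sum, but keep the larger of the current
    // element and the max of everything before it.
    public int Max(int[,] matrix, int rowNum, int colNum)
    {
        if (rowNum == 0 && colNum == 0)
            return matrix[0, 0];
        int rest = (colNum == 0)
            ? Max(matrix, rowNum - 1, matrix.GetLength(1) - 1)
            : Max(matrix, rowNum, colNum - 1);
        return matrix[rowNum, colNum] > rest ? matrix[rowNum, colNum] : rest;
    }
}
```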
You motivated me to come back to trying to get the average recursively. I tell you, it's tricky for a couple of reasons, the first being that integer math doesn't really translate very well to calculating averages, and then the recursion just complicates it even further.
In integer math, the average of 2 and 3 is 2. Integer results lose the decimal part, and everything is truncated toward 0. So if you have -1.9 and convert it to an integer, you get -1, not -2. Likewise, the decimal average of 2 and 3 is 2.5, but as an integer it becomes 2. Nothing is rounded to the nearest whole number; the fractional part is simply dropped, which always moves the value toward 0.
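In C# terms, that truncation toward zero looks like this:

```csharp
using System;

class TruncationDemo
{
    static void Main()
    {
        Console.WriteLine(5 / 2);     // 2: the .5 is dropped, not rounded
        Console.WriteLine(-5 / 2);    // -2: truncated toward zero, not floored to -3
        Console.WriteLine((int)-1.9); // -1: casting a double also truncates toward zero
    }
}
```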
So when you start trying to average a series of integers without really thinking about it, you develop what you think is an outstanding algorithm (and I did... three times), and then you see the result the program spits out and compare it to what you know the average actually is? It hurts.
But in the past few minutes I dumped that whole grand and complicated (but still short) method of calculation and finally arrived at a proper routine. Hint? No one says that every step in the recursion must produce an average. You just have to use the Average function recursively and ultimately produce the average by the end of it.
(You're still going to get a zero-truncated integer value if your true average has a fractional part, but that's the nature of the beast.)
Oh, and for this one function, I found a good way to use array.Length!
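For what it's worth, here is one possible reading of that hint; this is a guess at the idea, not the routine described above. The recursive calls accumulate a plain running sum, and only the outermost call, the one that started at the upper bounds, divides. Conveniently, on a 2D array matrix.Length is the total number of elements, which is exactly the divisor the average needs:

```csharp
class MatrixCalculator
{
    // Guess at the trick: inner calls return a raw running sum; only the
    // call that began at the upper bounds turns that sum into an average.
    public int Average(int[,] matrix, int rowNum, int colNum)
    {
        int sum = matrix[rowNum, colNum];
        if (rowNum != 0 || colNum != 0)   // not back at (0, 0) yet: keep walking
        {
            sum += (colNum == 0)
                ? Average(matrix, rowNum - 1, matrix.GetLength(1) - 1)
                : Average(matrix, rowNum, colNum - 1);
        }
        // Only the outermost call satisfies this; matrix.Length on a 2D
        // array is the total element count (rows * cols).
        if (rowNum == matrix.GetLength(0) - 1 && colNum == matrix.GetLength(1) - 1)
            return sum / matrix.Length;
        return sum;                       // inner calls pass the raw sum upward
    }
}
```

Called as `Average(matrix, matrix.GetLength(0) - 1, matrix.GetLength(1) - 1)`, the division happens exactly once, at the end.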