for (int i = 0; i < mdim_; i++)
    for (int j = 0; j < ndim_; j++)
        std::cout << data_[j * ndim_ + i] << ' ';
to change the matrix 1 3 2 4, stored in data_, into row order. If I print data_[j * ndim_ + i] to the screen it gives me 1 2 3 4, however I can't figure out how to save this into another vector, as that would seem to need another loop index, which throws the calculation off.
Can anyone please tell me if there is a way to do this?
Since your 2D data is stored in a 1D array, all you need to do here is play indexing games. Suppose you had an MxN matrix stored in a 1D array of length M*N. If the matrix is stored in column major order and you need to convert it to row major order, you will have to do some copying. So, try this approach:
0. Given an MxN (M = width, N = height) matrix (mat0) stored in column major order in a 1D array (arr0) of length M*N
1. Create another 1D array (arr1) of the same size (M*N) that will contain the matrix once it is converted to row major order
2. Iterate over arr0 with an index (idx0) running from 0 to M*N (the length of the array)
3. Compute the index (idx1) into the row major array from the column major array index (idx0)
4. Store arr0[idx0] into arr1[idx1]
To do step 3, take advantage of integer division and modulo. Namely:

row  = idx0 % N
col  = idx0 / N
idx1 = row * M + col
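Putting those steps together, a minimal sketch might look like this (arr0, arr1, M, N as in the steps above; the function name and element type are just illustrative):

#include <vector>

// Convert a column-major M x N (M = width, N = height) matrix
// into row-major order, following steps 0-4 above.
std::vector<double> columnToRowMajor(const std::vector<double>& arr0,
                                     int M, int N)
{
    std::vector<double> arr1(M * N);          // step 1: same-size output
    for (int idx0 = 0; idx0 < M * N; ++idx0)  // step 2: walk the input
    {
        int row  = idx0 % N;                  // step 3: recover (row, col)
        int col  = idx0 / N;
        int idx1 = row * M + col;             //         row-major index
        arr1[idx1] = arr0[idx0];              // step 4: copy across
    }
    return arr1;
}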
Well, that basically is the beginning... I've been trying for a few days now to append one matrix below another. All the initial matrices are stored in column order, which makes this difficult.
None of the methods I've tried have worked, so I came up with a new idea: transform both matrices into row order, stick them together, and transform the result back into column order using the combined number of rows of both matrices (see the sketch below).
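That approach can work. Here is a sketch of the idea, assuming both matrices have the same width M and reusing columnToRowMajor from above; rowToColumnMajor and appendBelow are illustrative names, not anything from your code:

// Inverse of columnToRowMajor: row-major back to column-major.
std::vector<double> rowToColumnMajor(const std::vector<double>& arr,
                                     int M, int N)
{
    std::vector<double> out(M * N);
    for (int idx = 0; idx < M * N; ++idx)
    {
        int col = idx % M;                 // row-major: idx = row * M + col
        int row = idx / M;
        out[col * N + row] = arr[idx];     // column-major index
    }
    return out;
}

// Stack b below a. Both are column-major with width M;
// a is M x N1, b is M x N2, so the result is M x (N1 + N2).
std::vector<double> appendBelow(const std::vector<double>& a, int N1,
                                const std::vector<double>& b, int N2,
                                int M)
{
    std::vector<double> rows = columnToRowMajor(a, M, N1);
    std::vector<double> more = columnToRowMajor(b, M, N2);
    rows.insert(rows.end(), more.begin(), more.end()); // rows of b follow rows of a
    return rowToColumnMajor(rows, M, N1 + N2);
}

In row-major order, appending one matrix below another is a plain concatenation of the two arrays, which is exactly why the round trip through row order makes the problem easy.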