I have a .csv file that could have anywhere from 50 rows to over 10,000. The first 32 rows (geographical header information) should always be ignored. The 2nd column of data will read 0.00 for some number of rows. Once that column starts to read a non-zero value, I want to start storing the data, and keep storing it until column 3 reaches its maximum, i.e. until column 3 starts to decrease. I then want to keep the values from the 2nd and 3rd columns of those rows and write them to another file in the order (column3, column2).

Here is a sample of the data.

Column 2 at value 0 (more rows like this follow):

    19/08/2013 10:39:47.000,0,0.009,29.621,-0.002,0.014,-4.227,1508.28

Column 2 starts reading a value (start of data storage):

    19/08/2013 10:51:32.000,1547.122,1.543,29.552,59.068,35.812,22.495,1545.548

Column 3 reaches its maximum value (end of data storage):

    19/08/2013 10:58:23.000,1502.544,223.176,12.228,41.002,35.662,28.057,1502.078
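
To make the stopping rule concrete, here is a rough sketch of the collection step I have in mind, assuming each row has already been parsed into a (column 2, column 3) pair of doubles; the function and variable names are only placeholders:

    #include <utility>
    #include <vector>

    // Collect (soundVelocity, pressure) pairs starting at the first row where
    // column 2 is non-zero, and stop as soon as column 3 (pressure) decreases.
    std::vector<std::pair<double, double>> collectCast(const std::vector<std::pair<double, double>>& rows)
    {
        std::vector<std::pair<double, double>> kept;
        bool started = false;
        double lastPressure = 0.0;

        for (const auto& row : rows)
        {
            double soundVelocity = row.first;   // column 2
            double pressure      = row.second;  // column 3

            if (!started)
            {
                if (soundVelocity == 0.0)
                    continue;                   // still in the leading 0.00 section
                started = true;                 // column 2 has started reading a value
            }
            else if (pressure < lastPressure)
            {
                break;                          // column 3 has passed its maximum
            }

            kept.push_back(row);
            lastPressure = pressure;
        }
        return kept;
    }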

Where do I start after accessing the .csv file?

    #include <iostream>
    #include <fstream>
    #include <string>
    #include <sstream>

    using namespace std;

    int main()
    {
        string soundvelocity;   // will hold the column 2 value
        string pressure;        // will hold the column 3 value

        ifstream myfile;

        myfile.open ("C:\\Program Files\\DataLog Express\\SV Cast_Test_Data.csv");


        myfile.close();

        return 0;
    }
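
I am assuming the 32 header rows can simply be read and thrown away with getline before the data rows are processed, roughly like this (the path is the one from my snippet above; the error check is just a guess at what is needed):

    #include <fstream>
    #include <iostream>
    #include <string>

    using namespace std;

    int main()
    {
        ifstream myfile("C:\\Program Files\\DataLog Express\\SV Cast_Test_Data.csv");
        if (!myfile.is_open())
        {
            cerr << "Could not open the .csv file" << endl;
            return 1;
        }

        string line;

        // Skip the first 32 rows of geographical header information
        for (int i = 0; i < 32 && getline(myfile, line); ++i)
        {
            // header lines are read and discarded
        }

        // Read each remaining data row as one line of text
        while (getline(myfile, line))
        {
            // 'line' now holds one comma-separated row,
            // ready to be split with an istringstream
        }

        return 0;
    }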

I know I will need a loop for accessing column 2 and column 3. Is this the right approach?

    // Getting data from the 2nd column (sound velocity):
    // the first read discards column 1 (the timestamp), the second keeps column 2
    for(i=0;i<2;++i)
    {
        getline(input_string_stream,entry1, ',');
    }

    // Getting data from the 3rd column (pressure):
    // the stream is already past column 2, so one more read gives column 3
    getline(input_string_stream,entry2, ',');
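
For reference, a minimal sketch of splitting one row and writing the two values back out in the (column3, column2) order could look like this; the writeRow name and the output stream handling are just placeholders:

    #include <fstream>
    #include <sstream>
    #include <string>

    using namespace std;

    // Split one comma-separated row and append "pressure,soundvelocity"
    // (column3, column2) to the output file.
    void writeRow(const string& line, ofstream& outfile)
    {
        istringstream input_string_stream(line);
        string timestamp;       // column 1 (ignored)
        string soundvelocity;   // column 2
        string pressure;        // column 3

        getline(input_string_stream, timestamp, ',');
        getline(input_string_stream, soundvelocity, ',');
        getline(input_string_stream, pressure, ',');

        outfile << pressure << ',' << soundvelocity << '\n';
    }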

I know this is long but any help would be appreciated.

The way I'd do it is to use a MySQL database: load all of the data into a temporary table and then read out the relevant columns.
