Hello all,
I'm trying to read in an entire tab-delimited file with the following data layout:

File Title Line is always at the top.
DataName1 DataName2 DataName3
12345 123.4 12.456
9876 987.65 45678

There may be any number of rows and columns after the title line.

I want to read in the file and output separate files for each column. For instance:

DataName1.txt would contain:
12345
9876

DataName2.txt would contain:
123.4
987.65

DataName3.txt would contain:
12.456
45678


I think Perl may be best for this specific problem though I could also use C++. Anyone have an easy solution they wouldn't mind sharing?

Thanks!

Dukane:

Open the input file, then read each line into an array. Shift off the top two lines, as they are not data the program needs.

For each line in the file:
Read the first column, then add it to an array; you can get the columns with split().
Read the second column then add it to a second array.
Read the third column then add it to a third array.

Close the file.

Open a new file, for the first column.
Print the whole array into the file.
Close the file.

Open a second file, for the second column.
Print the whole array into the second file.
Close the file.

Open a third and last file, for the last column.
Print the whole array into the third file.
Close the file.

Perl script is complete.
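The outline above might look something like this in Perl -- a minimal sketch, not tested against your real data, which writes its own two-row sample input (hypothetical name sample.txt) so it can be run as-is:

```perl
use strict;
use warnings;

# Write a small sample input matching the layout described above:
# a title line, a header line, then data rows. Purely illustrative.
open my $out, '>', 'sample.txt' or die "$!";
print $out "File Title Line is always at the top.\n";
print $out "DataName1\tDataName2\tDataName3\n";
print $out "12345\t123.4\t12.456\n";
print $out "9876\t987.65\t45678\n";
close $out;

open my $in, '<', 'sample.txt' or die "$!";
<$in>;                                  # discard the title line
my @names = split /\s+/, <$in>;         # header line supplies the column names

my %columns;                            # one array of values per column name
while (my $line = <$in>) {
    my @fields = split /\s+/, $line;
    push @{ $columns{ $names[$_] } }, $fields[$_] for 0 .. $#names;
}
close $in;

# One output file per column, named after its header
for my $name (@names) {
    open my $fh, '>', "$name.txt" or die "$!";
    print $fh "$_\n" for @{ $columns{$name} };
    close $fh;
}
```

Holding whole columns in memory like this is fine for modest files; for very large inputs you would want to write each field out as you read it instead.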

brax4444,

what perl code have you tried so far? Is this class/school work?

I was hoping for sample that would solve this simple case so that I might have a framework to look at when I solve the real problem. I'm basically looking for a good example of Perl code to handle this type of thing. I'll paste what I have so far tomorrow. Thanks.


School? No; school isn't in session anyway. :)

Well then, here is one possible way:

use strict;
use warnings;
use IO::File;
open FH,'datafile.txt' or die "$!";
my @files = split(/\s+/,<FH>);
for (@files) {
   my $f = "$_.txt";
   $_ = new IO::File "> $f" or die "$!";
}	
while(my @data = split(/\s+/,<FH>)) {
   for (@files) {
      print $_ shift @data,"\n";
   }
}
for (@files) {
   close->$_;
}

Thank you! I have begun adapting it.

Any idea what this error would mean? Perhaps Perl is running out of memory? I have 2GB of RAM but don't know what it can allocate for its own use.

Can't call method "IO::File=GLOB(0x818417c)" without a package or object reference at internetTest.pl line 17, <FH> line 1370.

Let's see your Perl code. And what is line 1370 of the file you are working with?

Generally no memory allocation is necessary for Perl scripts. Perl will use whatever memory is needed, all the way up to all available memory if that's what it takes.

Line 1370 is the last line and it looks exactly like the rest of them. Perhaps it should have an end-of-line character?

Here is what I have:

[perlcode]
#!/usr/bin/perl
use strict;
use warnings;
use IO::File;
open FH,'file.txt' or die "$!";
my @files = split(/\s+/,<FH>);
for (@files) {
   my $f = "$_.txt";
   $_ = new IO::File "> $f" or die "$!";
}
while(my @data = split(/\s+/,<FH>)) {
   for (@files) {
      print $_ shift @data,"\n";
   }
}
for (@files) {
   close->$_;
}
[/perlcode]

I believe this is an exact copy of what you gave me. I'm trying to have it start on the second line instead of the first. For now, I just remove the first line from the input file by hand, and it works well enough for testing.
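Skipping the title line doesn't require editing the input file: one extra read before the header is parsed is enough. A sketch, which writes its own hypothetical file.txt in the layout from the original post:

```perl
use strict;
use warnings;

# Hypothetical input matching the layout in the original post
open my $out, '>', 'file.txt' or die "$!";
print $out "File Title Line is always at the top.\n";
print $out "DataName1\tDataName2\tDataName3\n";
print $out "12345\t123.4\t12.456\n";
close $out;

open my $fh, '<', 'file.txt' or die "$!";
<$fh>;                           # read and discard the title line
my @names = split /\s+/, <$fh>;  # the second line now supplies the names
print "@names\n";                # prints DataName1 DataName2 DataName3
close $fh;
```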

OK, so it's working now?

No, it always gives the error shown when it handles the last line of a file, apparently regardless of the file's size.

take out this part of the code:

for (@files) {
   close->$_;
}

and see if that helps.

That fixed the problem! Thanks. :) Explanation, please?

it should have been written like this:

for (@files) {
   $_->close;
}

I had the object and the method reversed for some strange reason.

Muchas Gracias.

You shouldn't be opening and closing all those files yourself if you can help it:

`cut -f 1 file > Data1`;

and so on.
Then do a sed to remove the first two lines of each Data(num) file.
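Glued together with Perl backticks, that suggestion might look like the sketch below. Assumptions: cut and sed are on the PATH, the input is tab-delimited, and the input file is a hypothetical file.txt (which the sketch writes itself so it can run standalone):

```perl
use strict;
use warnings;

# Hypothetical sample input: title line, header line, two data rows
open my $out, '>', 'file.txt' or die "$!";
print $out "File Title Line is always at the top.\n";
print $out "DataName1\tDataName2\tDataName3\n";
print $out "12345\t123.4\t12.456\n9876\t987.65\t45678\n";
close $out;

# Column names come from line 2; sed -n '2p' prints only that line
my @names = split /\s+/, `sed -n '2p' file.txt`;

# For each column, cut the field out and let sed drop the
# title and header lines (lines 1-2) from the result
for my $i (0 .. $#names) {
    my $field = $i + 1;      # cut counts fields from 1
    system("cut -f $field file.txt | sed '1,2d' > $names[$i].txt") == 0
        or die "pipeline failed: $?";
}
```

This leans on external tools instead of Perl's own I/O, so it only fits where a shell is available; the upside is that cut streams the file and nothing is held in memory.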
