OK, here is my issue. I have written a script that takes weblog files that have been gzipped, parses them, and outputs the required data to another file. It then gzips that output file and leaves the original intact (actually it unzips the source, then rezips it).
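In rough outline, the flow is like this (just a sketch with illustrative directory and file names, not the actual script):

    use strict;
    use warnings;

    my $SourceDir  = '/var/log/weblogs/';       # assumed location
    my $fileName   = 'access_20081001.log.gz';  # assumed file name

    # Unzip the source file in place
    `gunzip $SourceDir$fileName`;

    # The uncompressed name is the same minus the trailing ".gz"
    my $InputFile  = $SourceDir . substr($fileName, 0, length($fileName) - 3);
    my $OutputFile = "$InputFile.extract";      # hypothetical output name

    # ... open $InputFile, parse it, and write the required data to $OutputFile ...

    # Rezip both files so the original source is left as it was found
    `gzip $InputFile`;
    `gzip $OutputFile`;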

Anyway, it runs just fine on my laptop, which is running Ubuntu 8.10 (released in October 2008).

So now I am trying to put it on the (pre-)production system, which is a Windows Server 2003 box running Cygwin. Every time I run it, it unzips the source file, then I attempt to open it for input, and before I can do any parsing it dies with this error message:

Out of memory during request for 1072 bytes, total sbrk() is 402567168 bytes!

Below is a relevant section of code where it fails.

    # Just a message for users
    print "Unzipping Information from $fileName ........ \n";

    # Does the "unzipping"
    $results = `gunzip $SourceDir$fileName`;

    # More info for users
    print "Reading Log Information from $fileName ........ \n";

    # Assign the file name (strip the trailing ".gz")
    $InputFile = $SourceDir . substr($fileName, 0, length($fileName) - 3);

    if (! open INPUTFILE, "<", $InputFile) {
        die("Cannot open input file: $!");
    }
    $i = 0;

    foreach (<INPUTFILE>) {
The code dies before it ever executes the body of the foreach (<INPUTFILE>) loop.

When I do a ulimit -a, I get the following:

$ ulimit -a
core file size        (blocks, -c) unlimited
data seg size         (kbytes, -d) unlimited
file size             (blocks, -f) unlimited
open files                    (-n) 256
pipe size          (512 bytes, -p) 8
stack size            (kbytes, -s) 2043
cpu time             (seconds, -t) unlimited
max user processes            (-u) 63
virtual memory        (kbytes, -v) 2097152

Any thoughts or help is greatly appreciated.

(Please note that I am new to Perl, but not to programming.)

Thanks,
Paul

All 3 Replies

foreach is the wrong construct for looping through a file, although I am not sure whether it is contributing to the problem in any way. Try this:

while (<INPUTFILE>) {

See if it helps.
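For what it's worth, the full read loop in that style might look roughly like this (just a sketch with a lexical filehandle and an illustrative file name, not the original code):

    use strict;
    use warnings;

    my $InputFile = '/var/log/weblogs/access_20081001.log';   # illustrative name

    open my $input_fh, '<', $InputFile
        or die "Cannot open input file: $!";

    # while reads one line per iteration, so only the current line
    # is held in memory at any time.
    while (my $line = <$input_fh>) {
        chomp $line;
        # ... parse $line and write out the required data ...
    }

    close $input_fh;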

That appears to have resolved it. Apparently foreach needs to load the entire file into memory up front to build the list it loops over, whereas while just reads one line at a time until it hits the end of the file.

Thank you for your assistance.

Cool, I suspected as much but was not sure. foreach is a list operator and evaluates <INPUTFILE> in list context, so it slurps the entire file into memory instead of reading it line by line the way a while loop does.
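To illustrate the difference with a rough, self-contained sketch (the file name is just an example):

    use strict;
    use warnings;

    my $log = 'access_log';   # illustrative file name

    # List context: <$fh> is expanded into a list of every line first,
    # so the whole file sits in memory before the loop body even runs.
    open my $fh, '<', $log or die "Cannot open $log: $!";
    foreach my $line (<$fh>) {
        # ... work on $line ...
    }
    close $fh;

    # Scalar context: <$fh> returns one line per iteration, so memory
    # use stays roughly constant no matter how large the file is.
    open $fh, '<', $log or die "Cannot open $log: $!";
    while (my $line = <$fh>) {
        # ... work on $line ...
    }
    close $fh;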
