Hello all.
I have a small problem.
I have a crawler that goes to one website, extracts some info from it, and writes the extracted content to a file.
The only problem is that after 3~4 hours of work, this script uses 1 GB of RAM.

use LWP::Simple;    # provides get()

for (my $increment = 1; $increment <= 999999; $increment++) {
    my $funky = "http://www.website.com/page?id=" . $increment;
    print $increment . "\n";
    my $content = get($funky);
    # ... extract info from $content and write it to a file ...
}

Here is that part of the code. I tried undef (at the end of the for loop), but the same amount of RAM was used after 3~4 hours of crawling.
So I would like to know whether it is possible to free the memory at the end of the for loop, and how.
Thanks in advance.

Why don't you call an external binary, e.g. curl, to get the content?
You might want to add some headers (at least a user agent) so the site doesn't block you by detecting the user agent string.
An external binary runs as a separate process, and its memory is released by the OS when that process exits.
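A minimal sketch of that idea, assuming curl is installed and on the PATH; the user agent string here is just an example, adjust it as needed:

```perl
use strict;
use warnings;

for my $id (1 .. 10) {
    my $url = "http://www.website.com/page?id=$id";

    # curl runs in a child process, so its memory goes back to the OS
    # when it exits. -s silences progress output, -A sets the user agent.
    my $content = qx(curl -s -A "Mozilla/5.0 (compatible; MyCrawler/1.0)" "$url");

    # ... extract info from $content and append it to a file ...
}
```

Since each fetch is a short-lived process, the long-running Perl script itself never holds the downloaded pages for more than one iteration.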

OK, thanks for the nice idea, I am going to try it. But still, for the future, I would like to know how to manage memory in Perl.

Normally in Perl you don't worry about memory management. But instead of undef you can assign the variable an empty string/list and see if that helps free up memory:

$var = '';
@var = ();
%var = ();
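Applied to the loop above, that could look like the sketch below. Note that declaring $content with my inside the loop body already lets it go out of scope at the end of each iteration, which is usually the more idiomatic fix:

```perl
use strict;
use warnings;
use LWP::Simple;

for my $id (1 .. 999999) {
    # $content is lexical to the loop body, so its value can be
    # reclaimed by Perl at the end of every iteration.
    my $content = get("http://www.website.com/page?id=$id");

    # ... process $content ...

    $content = '';    # explicitly drop the string as well, if you prefer
}
```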

It helped a bit at first, but after 20~30 minutes the memory used is the same =( . Or possibly I'm missing some variable; I'll check. Is there a way to print all the variable names?
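To answer the last question: package (global) variables live in the symbol table, so you can list them by walking %main::. This is only a sketch; lexical variables declared with my will not show up here at all:

```perl
use strict;
use warnings;

our $example = 'hello';    # a package variable, so it appears in %main::

# Walk the main symbol table and print the names of defined scalars.
for my $name (sort keys %main::) {
    no strict 'refs';      # symbolic references are needed to peek at the slots
    print "\$$name\n" if defined ${"main::$name"};
}
```

If the leaked data is held in my variables, a tool such as Devel::Size (a CPAN module) is a better way to find out which structure is growing.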

Apparently Perl doesn't return memory to the OS until the script exits: memory that is freed internally is kept in Perl's own pool and reused, so the process size you see in the OS will not shrink even when variables are cleared.
