Hello. I am working on a Perl script that should iterate through ~100 links from a MySQL table and test them one at a time. I am using rtmpdump/mplayer to test the links (which are streaming videos), and I want to e-mail out a report after all the links have been tested.

The problem is that the script seems to be running multiple times instead of iterating through the list of links just once. The script is also leaving the child "rtmpdump" and "mplayer" processes running in the background, so I end up with thousands of orphaned processes after it has run.

Can someone please take a look at the following logic and perhaps give me a pointer on where I went wrong? As you can see, I'm using $SIG{ALRM} to terminate the mplayer process after 10 seconds.

Thanks for ANY help or suggestions you can provide; it is much appreciated. Obviously I am very new to Perl, and this is one of the more complicated scripts I've ever written.

#!/usr/bin/perl
use strict;
use warnings;
use DBI;
use MIME::Lite;

require "config2.pl";
require "sendmail.pl";

our $db;
our $user;
our $pass;
our $host;

my $testtimeout = 10;
my $testoutput;

my @myworking;
my @mynotworking;

## Prepare date stamp

my (undef,undef,undef,$mday,$mon,$year,undef,undef,undef) = localtime;
$year += 1900;
$mon += 1;
my $date = sprintf "%04d%02d%02d", $year, $mon, $mday;
#print $date;
open (FILE, ">", "results_" . $date . ".html") || die "Could not open file: $!\n";
my $path = "results_" . $date . ".html";

## Stream type selection (only RTMP for now)

my $streamTypeId = 1; #RTMP
#my $streamTypeId = 2; #RTSP
#my $streamTypeId = 3; #MMS
#my $streamTypeId = 4; #HLS
#my $streamTypeId = 5; #HTTP
#my $streamTypeId = 6; #Web

## Build SQL query

my $query = "SELECT * FROM channels WHERE streamTypeId = $streamTypeId";
my $dbh = DBI->connect("DBI:mysql:$db:$host", $user, $pass)
    or die "Can't connect to the database: $DBI::errstr\n";
my $sqlQuery = $dbh->prepare($query)
    or die "Can't prepare $query: " . $dbh->errstr . "\n";
my $rv = $sqlQuery->execute
    or die "Can't execute the query: " . $sqlQuery->errstr . "\n";

while (my $results = $sqlQuery->fetchrow_hashref) {
        print $results->{name} . " = " . $results->{url} . "/" . $results->{fileName} . "\n";
        my $thechannel = "$results->{name}" . ", " . "$results->{url}" . "/" . "$results->{fileName}";

        eval {

        local $SIG{ALRM} = sub { die "alarm\n" };

        alarm $testtimeout;

        $testoutput = `rtmpdump -r $results->{url} -y $results->{fileName} --quiet --live | /usr/bin/mplayer -noconsolecontrols -really-quiet -nocache -nolirc -nomouseinput -identify -vo null -ao null -frames 0 -`;

        my $pid=fork();
        print $pid;


        alarm 0;
        };

        if ($@) {
                die unless $@ eq "alarm\n";
                print "Timed out after $testtimeout seconds. \n";
                push(@mynotworking, $thechannel);
        }
        else {
                print "Didn't time out!\n";
                if ($testoutput =~ m/ID_LENGTH/) {
                        print "Match found! \n";
                        push(@myworking, $thechannel);
                        #print FILE "up,"$thechannel;
                }
                else {
                        print "No match! \n";
                        push(@mynotworking, $thechannel);
                        #print "@mynotworking";
                        #print FILE "down," . "$thechannel";
                }
        }
}

my $mysubject = "My Daily Report " . $date;
print FILE "<b>Working:</b> \n <br>";
foreach (@myworking){print FILE ("$_" . ", working \n <br>")};
print FILE "\n <br>";
print FILE "<b>Not Working:</b> \n <br>";
foreach (@mynotworking){print FILE ("$_" . ", broken \n <br>")};

close FILE;

my $message = do {
        local $/ = undef;
        open (FILE, "<", "results_" . $date . ".html") || die "Could not open file: $!\n";
        <FILE>;
};

print $message;


send_email('noreply@mysite.com','admin@mysite.com',$mysubject, $message);

## Force cleanup of spawned processes

my $cleanup1 = `killall rtmpdump`;
my $cleanup2 = `killall mplayer`;

close FILE;
exit(0);

Anyone? :(

You are forking during every loop iteration, so why are you surprised?
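
To spell that out: fork() returns in both the parent and the child, and since the child never calls exec, exit, or waitpid, it drops straight back into the while loop and keeps fetching and testing rows itself. The number of running copies roughly doubles with every row, which is why it looks like the script is running many times over. A stripped-down illustration of the pattern (not your exact code):

#!/usr/bin/perl
use strict;
use warnings;

# Each iteration forks; the parent AND the child both continue the loop,
# so after N iterations there can be up to 2**N copies of this process.
for my $row (1 .. 5) {
    my $pid = fork();
    defined $pid or die "fork failed: $!";
    # Neither branch exec()s, exit()s, or waits, so both keep looping.
    print "pid $$ handling row $row\n";
}

# Every copy eventually falls out of the loop; with 5 rows that can be
# up to 32 processes printing this line.
print "pid $$ finished\n";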

Right, but after I test a row in the database, shouldn't the alarm force-quit the process after 10 seconds?

Any idea how I can slow it down so it doesn't crash my machine?
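
Once the stray fork() is gone, the tests run one at a time, so nothing multiplies or swamps the machine. The orphaned rtmpdump/mplayer processes are the other half of the problem: alarm only interrupts the Perl process sitting in the backticks waiting for output; the shell pipeline it spawned keeps running. One way around that is to run the pipeline in its own process group and kill the whole group when the timer fires. This is a rough, untested sketch (test_stream and its arguments are just names made up for illustration, and it assumes a Unix-like system with /bin/sh):

#!/usr/bin/perl
use strict;
use warnings;

# Sketch: run the rtmpdump|mplayer pipeline in its own process group so the
# timeout can kill rtmpdump and mplayer together, instead of only
# interrupting the Perl side that is waiting for their output.
sub test_stream {
    my ($url, $file, $timeout) = @_;    # hypothetical helper, not from the original script

    my $cmd = "rtmpdump -r '$url' -y '$file' --quiet --live | "
            . "mplayer -noconsolecontrols -really-quiet -nocache -nolirc "
            . "-nomouseinput -identify -vo null -ao null -frames 0 -";

    my $pid = open(my $out, '-|');      # forking open: child's STDOUT comes back through $out
    die "fork failed: $!" unless defined $pid;

    if ($pid == 0) {                    # child
        setpgrp(0, 0);                  # new process group: the whole pipeline lives in it
        exec('/bin/sh', '-c', $cmd) or exit 1;
    }

    my $output = '';
    eval {
        local $SIG{ALRM} = sub { die "alarm\n" };
        alarm $timeout;
        local $/;                       # slurp everything the pipeline prints
        $output = <$out> // '';
        alarm 0;
    };
    if ($@) {
        die $@ unless $@ eq "alarm\n";
        kill 'TERM', -$pid;             # negative pid = signal the whole process group
    }
    close $out;                         # waits for and reaps the child: no orphans

    return $output;                     # check this for ID_LENGTH as before
}

Inside the loop you would then call something like $testoutput = test_stream($results->{url}, $results->{fileName}, $testtimeout); and keep the existing ID_LENGTH check. On a timeout the captured output won't contain ID_LENGTH, so the channel still lands in @mynotworking, and the killall cleanup at the end should have nothing left to do.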
