I am trying to run a simple script that monitors a log file and sends an alarm every time a specific line is written to it. The script is working, but the problem is the multiple lines that are written whenever my SW has a problem: many error lines get logged at once, causing many alarms to be sent. I am looking for a way to send the alarm only one time, but without exiting this script, because it needs to keep running 24/7 to monitor for any errors in the logs. When the script catches error lines in the log file, I want to send the alarm once. For example, it would be fine for me to send one alarm for error occurrences in the log file every 1 or 2 minutes, but sometimes more than 100 errors are written within seconds, causing more than 100 alarms to be triggered.
This is my script:
    tail --pid=$$ -F /usr/fuad/testing_alarms/errors_log.log | while read -r line; do
        echo "$line" | grep -q --line-buffered "error"
        if [ $? = 0 ]; then
            # Send an SMS
            /usr/fuad/testing_alarms_fuad/send_alarm.pl
        fi
    done
I have also tried something like the following, but again every matching line written into the file errors_log.log triggers the alarm, causing many alarms to be sent.
    tail --pid=$$ -F /usr/fuad/testing_alarms/errors_log.log | egrep --line-buffered "error.Link.." | /usr/fuad/testing_alarms_fuad/send_alarm.pl
What I really need: whenever a line containing the string "error" is written into the file errors_log.log, I want to trigger the script "send_alarm.pl". But when 100 lines with the string "error" are written into errors_log.log within 3-4 seconds, 100 alarms are sent, and that is a problem. How can I avoid these duplicates without exiting the script, since I need it to keep running and checking for errors? I want to suppress duplicates that appear within X period of time at least: sending one alarm for duplicates every 2 minutes, for example, would be fine, but I do not want to trigger an alarm for every error occurrence, especially when there are many within seconds.
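To illustrate the behaviour I am after, here is a rough sketch of the time-based throttling I have in mind (the COOLDOWN value, the handle_line helper, the alarms_sent counter and the sample input lines are all just my illustration, not code from my real system). It reads a few sample lines instead of the tail -F pipeline, and three "error" lines arriving in the same second should produce only one alarm:

```shell
#!/bin/sh
# Sketch: send at most one alarm per COOLDOWN seconds, but keep reading forever.
COOLDOWN=120     # minimum seconds between two alarms (my assumed value)
last_sent=0      # epoch time of the last alarm actually sent
alarms_sent=0    # counter, only here to show how many alarms really fired

handle_line() {
    case $1 in
        *error*)
            now=$(date +%s)
            if [ $(( now - last_sent )) -ge "$COOLDOWN" ]; then
                last_sent=$now
                alarms_sent=$(( alarms_sent + 1 ))
                # The real alarm would go here, e.g.:
                # /usr/fuad/testing_alarms_fuad/send_alarm.pl
                echo "ALARM at $now"
            fi
            ;;
    esac
}

# Sample input standing in for the real pipeline; the here-document keeps
# the loop in the current shell so the variables survive the loop.
while read -r line; do
    handle_line "$line"
done <<'EOF'
2024-01-01 error Link down
2024-01-01 error Link down
2024-01-01 error Link down
EOF
```

In the real script, the input would of course come from tail --pid=$$ -F /usr/fuad/testing_alarms/errors_log.log instead of the here-document, but I am not sure this is the right approach.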
Thank you in advance.