Hi, I need somebody to help me with a command in UNIX shell scripting.

It should go to a website, take a snapshot, and store it in a directory. Thanks.

OK, I can use wget to fetch a single website. But what if I have around 300 websites? Can you please tell me how to read all 300 URLs and take a snapshot of each one?

Your help will be appreciated.

Thanks


I have no idea.
Be more specific, please. Where did you get those 300 URLs from?

If you had a list of the websites in a file, you could use a loop to read each line and run a command on it:

SITELIST=/path/to/file
SNAPSHOTDIR=/tmp

for i in $SITELIST
  do
      #command e.g. 
      echo "Taking snapshot of $i"
      wget -r -l1 -N -k -x $i > ${SNAPSHOTDIR}/$i
  done



What do you mean by a "snapshot"?

Do you want the source code (i.e., HTML) of the page, or do you want to take a screen shot of a browser window with the page displayed?

If you had a list of the websites in a file, you could use a loop to read each line and run a command on it:

SITELIST=/path/to/file
SNAPSHOTDIR=/tmp

for i in $SITELIST
  do
      #command e.g. 
      echo "Taking snapshot of $i"
      wget -r -l1 -N -k -x $i > ${SNAPSHOTDIR}/$i
  done


That doesn't read the file; it gives a single argument to for: the name of the file.
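To make the mistake visible, here is a small self-contained demo (the list file path is made up for the example):

```shell
# Hypothetical list file, just for the demo.
SITELIST=/tmp/sitelist.txt
printf '%s\n' 'http://example.com' 'http://example.org' > "$SITELIST"

# The unquoted $SITELIST expands to one word -- the filename --
# so the loop body runs exactly once, with the filename, not the URLs.
for i in $SITELIST
do
    echo "Iterating over: $i"
done
```

It prints the filename once instead of one line per URL.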

To read the file:

cd  "$SNAPSHOTDIR" || exit 1
while IFS= read -r i
do
      wget -r -l1 -N -k -x "$i"
done < "$SITELIST"


Sorry, I was being stupid:

SITELIST=/path/to/file_containing_listofsites
SNAPSHOTDIR=/tmp/to/output/to
cat $SITELIST | while read NEXTLINE
do
      #command e.g.
      echo "Taking snapshot of $NEXTLINE"
      wget -r -l1 -N -k -x "$NEXTLINE" > ${SNAPSHOTDIR}/$NEXTLINE
done


UUOC!


Let's see you do better. If you're going to shout UUOC, you should offer a solution that doesn't use cat. If you're unable to do so, stop posting UUOC or I'll treat it as trolling :icon_frown:




As has been posted many times here and elsewhere, the way to read a file line by line is:

while IFS= read -r line
do
  : do whatever with line
done < "$filename"
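You can check what those two details buy you with a throwaway file (the path is made up): IFS= keeps leading whitespace, and -r stops the shell from interpreting backslashes.

```shell
# Throwaway demo file: one line with leading spaces and a backslash.
printf '  two leading spaces and a back\\slash\n' > /tmp/readdemo.txt

# IFS= keeps the leading spaces; -r keeps the backslash literal.
while IFS= read -r line
do
    echo "[$line]"
done < /tmp/readdemo.txt
```

With a plain `read line`, the leading spaces would be stripped and the backslash consumed.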
Worth repeating again, it appears.

BTW, wget can also take its input from a file: if you had a list of URLs in a file, you could use 'wget -i'.

Check out the wget man page for some of the other options you can use. It has great capabilities for mirroring sites, which is what I assume you want to do here. (The -k flag used above is short for --convert-links, which rewrites links so they work in your local copies.)
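For example (the file paths here are hypothetical, and the wget line is commented out because it needs network access):

```shell
# Hypothetical file of URLs, one per line.
SITELIST=/tmp/sites.txt
SNAPSHOTDIR=/tmp/snapshots

printf '%s\n' 'http://example.com/' 'http://example.org/' > "$SITELIST"
mkdir -p "$SNAPSHOTDIR"

# One wget call handles the whole list -- no shell loop needed.
# -P sets the download directory; -i reads URLs from the file.
# wget -r -l1 -N -k -x -P "$SNAPSHOTDIR" -i "$SITELIST"
```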

I hope this helps!
-G
