Hello, I'm trying to make a shell script that "downloads" links together with the title text that belongs to each link, but I can't make it work. It is supposed to get the URLs from a text file.

#!/bin/bash
# Usage: ./script.sh <file>  -- the URL is taken from the first tab-separated field of each line

if test -z "$1"; then
	echo "Usage: $0 <file with URLs>" >&2
	exit 1
fi

IFS=$'\n\r'
for I in $(cut -f1 "$1")
do
	source="$I"
	echo "Kilde: $source"	# "Kilde" = "Source"

	lynx -dump "$source" > /tmp/temp

	# every numbered entry from lynx's References section: "   1. http://..."
	hva=$(grep "[0-9]\. http://" /tmp/temp)
	for nr in $hva
	do
		# the reference number in front of the URL
		nummer=$(echo "$nr" | cut -d. -f1 | sed -e "s/ //g")
		# the link text is tagged [nummer] in the body of the dump
		over=$(grep "\[$nummer\]" /tmp/temp | head -n 1 | cut -d']' -f2)
		# the numbered URL line from the References section
		url=$(grep "$nummer\. http://" /tmp/temp | cut -d. -f2- | tail -n 1)

		echo "<a href=\"$url\">$over</a><br>"
	done
done
rm /tmp/temp
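
For reference, the script is run as e.g. ./script.sh links.txt (the file name and the URLs below are just examples), and the lynx -dump output it parses looks roughly like this: the link text in the page body is tagged with a number in square brackets, and the matching URLs are listed under a References section at the end of the dump.

   Read [1]The Sun or [2]CNN for the latest news.

References

   1. http://www.the-sun.co.uk/
   2. http://www.cnn.com/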

If anyone can solve this, I'd be grateful.

Hi,

It works for me. What problem are you getting?

The problem is that the URL doesn't end up with the link text that belongs to it. For instance, I should get
<a href="www.the-sun.co.uk">The Sun</a> but I get
<a href="www.the-sun.co.uk">CNN</a>
That's the problem I've got. I need the numbers that lynx makes for the link text and for the URL to match each other.
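
In case it helps: the lookup grep "$nummer. http://" is not anchored, so when $nummer is 1 it also matches the References lines for 11, 21, 31 and so on, and tail -n 1 then picks the wrong URL for that link text. A minimal sketch of an anchored lookup, assuming lynx's usual "   1. http://..." layout in the References section:

# match only the References line whose number is exactly $nummer,
# then strip the leading "   n. " so only the URL is left
url=$(grep "^ *$nummer\. http" /tmp/temp | head -n 1 | sed -e "s/^ *$nummer\. //")

With that change the [n] tag found for the link text and the "n. http://..." line found for the URL refer to the same reference number, so the pairs line up.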
