I'm trying to write a script that downloads an image off the net with wget.

The script should save the file, named with the date it was downloaded, preferably with a suffix.

Any ideas on how to accomplish this using wget?

Sorry, more info:
Using BASH.

The image is on a website.
This script should run twice an hour as a cron job (open to other possibilities also.)

I initially was trying something like:

#!/usr/bin/bash

date=`date +%R%D`

wget image/on/web/named/img.gif

mv img.gif "$date-pic.gif"

Bash doesn't like the mv command; it thinks the second argument is a directory (amongst many other errors).

What am I thinking about wrong here?

And I thought there'd be a good way to do this with only wget.

Your script fails because %D expands to MM/DD/YY, so the slashes in $date make mv treat the new name as a path through directories that don't exist. Pick a slash-free date format and it will work. However, may I suggest something like this instead. For your crontab entry (assuming your shell script file is in /root/bin and called getpic.sh) use:

00,30 * * * *  /root/bin/getpic.sh <Site URL of file to get> <filename>

For your script use:

#!/bin/bash
cd <path to where you want to store files> 
/usr/bin/wget "$1/$2"
mv "$2" "$2.`date +%s`"

Now you have a script you can use on multiple sites with multiple files. The date +%s inside the grave marks (the backtick key, to the left of the 1 and below the tilde on most keyboards) runs the date command and substitutes its output, a large integer, so the filename itself records when the file was downloaded.

%s     seconds since 1970-01-01 00:00:00 UTC
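If you ever need to read one of those integers, GNU date can convert an epoch timestamp back into something human-readable with its -d flag (this assumes GNU date, which is standard on Linux):

```shell
# Turn epoch seconds back into a readable date (UTC).
# @0 is the epoch itself; substitute any value from `date +%s`.
date -u -d @0 '+%F %T'
# → 1970-01-01 00:00:00
```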

Then you don't have to worry about them being in the correct order in your directory listing or trying to figure out what year, month, day or time they were created. Hope that makes sense and works ok.

Thank you very much.
This performed exactly what I was trying to do.
