Well, if you have Linux it never hurts to learn cron, whereas on Windows you would use AT instead. In either case, getting to know your way around the command line is a good thing.
I had been wanting a way to do daily database dumps of my clients' websites and store them as .sql files on my server at home.
I have a password-protected script on each website that, when accessed with the proper code, outputs a dump of the database in .sql format.
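A minimal sketch of what that remote script could look like — the access code, database credentials, and the use of mysqldump are all my assumptions, since the post doesn't show the actual implementation:

```php
<?php
// Hypothetical sketch: guard the endpoint with a shared code, then stream
// the dump. Credentials and names below are placeholders, not real values.
function access_ok(string $given, string $secret): bool {
    // Constant-time comparison so the code can't be guessed byte by byte.
    return hash_equals($secret, $given);
}

if (PHP_SAPI !== 'cli') {                          // only runs for web requests
    if (!access_ok($_GET['code'] ?? '', 'CHANGE_ME')) {
        http_response_code(403);
        exit;
    }
    header('Content-Type: text/plain');
    // Stream the database dump straight into the response body.
    passthru('mysqldump --user=dbuser --password=dbpass dbname');
}
```

The important part is that the script sends the raw .sql text as its response, so the machine at home can grab it with a plain HTTP request.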
On my server at home I wrote a script that uses file_get_contents() to fetch the output of the script on each website. If it gets a response, it saves the .sql file, named according to the database name, date, and time. It then checks the age of every saved .sql file and deletes any older than 7 days.
The local script doesn't output anything. If retrieving a database dump fails, it writes a failure notice into the .sql file and flags the filename so I know it's a bad one.
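A rough sketch of that local collector, under my own assumptions — the backup directory, the "FAILED-" filename flag, and the function names are hypothetical stand-ins for whatever the real script uses:

```php
<?php
// Hypothetical sketch of the local collector script.
const BACKUP_DIR = '/home/backups';   // placeholder path
const MAX_AGE    = 7 * 24 * 3600;     // keep dumps for 7 days

// Name each file by database, date, and time; prefix failures so bad
// dumps are easy to spot ("FAILED-" is my invention, not the poster's).
function dump_filename(string $db, int $ts, bool $ok = true): string {
    $name = sprintf('%s-%s.sql', $db, date('Y-m-d_H-i-s', $ts));
    return $ok ? $name : 'FAILED-' . $name;
}

// Fetch one site's dump; on failure, save a notice under a flagged name.
function fetch_dump(string $db, string $url): void {
    $sql  = @file_get_contents($url);
    $ok   = ($sql !== false && $sql !== '');
    $file = BACKUP_DIR . '/' . dump_filename($db, time(), $ok);
    file_put_contents($file, $ok ? $sql : "-- FAILED: no dump returned from $url\n");
}

// Delete any saved .sql file older than MAX_AGE; return what was removed.
function purge_old(string $dir, int $now): array {
    $deleted = [];
    foreach (glob($dir . '/*.sql') as $file) {
        if ($now - filemtime($file) > MAX_AGE) {
            unlink($file);
            $deleted[] = $file;
        }
    }
    return $deleted;
}
```

Calling `fetch_dump()` once per client site and then `purge_old(BACKUP_DIR, time())` gives the rolling 7-day window described above.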
I set up a cron job that uses wget to run my local script every night at midnight, so I end up with a backup of all my clients' databases for each of the last 7 days.
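The crontab entry for that would look something like this — the URL is a hypothetical placeholder for wherever the local script lives:

```shell
# Run the collector every night at midnight (minute 0, hour 0).
# -q keeps wget quiet; -O /dev/null discards the response body,
# since the local script doesn't output anything anyway.
0 0 * * * wget -q -O /dev/null http://localhost/backup/collect.php
```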
I suppose I could use wget to fetch the SQL dumps directly, but for me it was just easier to write a local script that handles the naming and old-file deletion.
I also threw a page onto my local server that lists any existing .sql files for download or deletion, with an option to run a database dump on the spot.
It sure beats paying for a backup service or doing backups manually. Plus, I have seen hosts with hardware problems so severe that they lost the contents of an entire server AND all backups newer than 9 days old.