What is the Mac equivalent of wget?

Until today I hadn't heard of it.. I guess that if you are using the Windows wget,
that wouldn't be its original platform.. however, it is a Unix app and is downloadable here
(GNU compatibility dependent, I think..), so you may be able to compile it on your Mac:
ftp://sunsite.auc.dk/pub/infosystems/wget/
wget-1.9.tar.gz
wget-1.9.tar.gz.sig (I don't know if you're going to need this to compile..)

Or the Windows version, and some history of the port, here:
http://www.interlog.com/~tcharron/wgetwin.html

Still, I haven't used either version, and I'm not a Mac junkie, so I can't say
anything other than good luck & I hope this helps..

Cain

Hello,

I like the idea of compiling it and seeing if it works for you too. I don't think you need the .sig file with it.

I have also used shell scripts with lynx to obtain files via the http:// protocol. I find that handy inside of crontabs.
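
For what it's worth, a minimal sketch of that kind of fetch might look like this (the URL and paths are made up):

 #!/bin/sh
 # Hypothetical cron-friendly fetch: lynx -source writes the raw document
 # to stdout instead of rendering it, so it can be redirected to a file.
 lynx -source http://www.example.com/backups/backup.sql > /home/me/backup.sql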

Christian

Let me explain further.

Twice a day, every day, a shell script runs on the DaniWeb server to make a .sql dump of the database (so that there is always an up-to-date backup handy in case the database gets corrupted, etc.).
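
For context, the server-side script boils down to something like this (assuming MySQL; the credentials, database name, and paths here are just placeholders):

 #!/bin/sh
 # Hypothetical twice-daily dump script, run from cron on the server.
 # Writes a full .sql dump of the database to a directory outside the web root.
 mysqldump --user=backupuser --password=secret daniweb > /home/backups/daniweb.sql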

Now, of course, every day (or every other day) I want to copy that .sql file over to my home hard drive, just in case of a server crash or problem with my webhost. This way, there is always a very recent database backup in two locations - remotely and locally.

I have always used GUI FTP programs (e.g. SmartFTP for Windows or Transmit for Mac), and that has worked up until recently. Now, the database is just getting way too large for this to be efficient. The database is about half a gigabyte in size right now, and it is a pain to download via a GUI program every day.

I remember that I used to always use wget when I ran my Linux box. Therefore, I thought using it would be a good alternative to a GUI FTP program. Not only that, but I could put a wget command into a shell script and have it execute automatically every night.
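
Something along these lines is what I have in mind (the hostname, path, and credentials are just placeholders):

 #!/bin/sh
 # Hypothetical nightly fetch: pull the dump over FTP with wget.
 # Credentials can be embedded in the URL for an unattended run.
 wget -O /Users/dani/backups/daniweb.sql ftp://backupuser:secret@www.example.com/backups/daniweb.sql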

Right now, of course, the server is saving the .sql file to a directory outside Apache. But I would be willing to have it save the file to a not publicly known directory that is www-accessible, provided there were an .htaccess file or some other way to securely password-protect the directory. Of course, it would be very bad if anyone could just download the entire DaniWeb database!
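
If it came to that, I assume the protection would be nothing more than standard HTTP auth on that directory, roughly like this (the paths and names are made up):

 # .htaccess in the www-accessible backup directory (hypothetical paths)
 AuthType Basic
 AuthName "Backups"
 AuthUserFile /home/dani/.htpasswd-backups
 Require valid-user

 # The password file itself would be created once with:
 #   htpasswd -c /home/dani/.htpasswd-backups backupuser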

AH!

Well, I would do a few things here first.

1) Modify the shell script on the server to not only copy the file, but compress it. My guess is that the .sql file is a big text database with some code here and there, but text nonetheless. It should crunch down nicely with gzip. If your script on the server times out, have it detach the process with the & sign (a rough sketch of that follows the crontab sample below).

2) On your Mac, create a crontab inside the terminal for the user. Crontab entries look like this:

 SHELL=/bin/bash
 TERM=xterm
 MAILTO=emailbox
 # This is a sample Crontab
 # Written just for Dani by Christian
 # Don't tell Tek, as he may get jealous

 # MI HH DD MO DAYOFWEEK   Sun=0 

 # Run something on Monday at 5:10am, every Monday
 10 5 * * 1 (mondaytask.bat > /dev/null)

 # Get her file from her server trigger
 # Run Sun, Tue, Thur, Sat  at 1:25 am
 25 1 * * 0,2,4,6  (filemaint commands... delete old?)
 27 1 * * 0,2,4,6  (copyfromserver.bat > /dev/null)

 # END OF Crontab
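
As for 1), the server-side change is nothing fancy; something like this, run in the background, would do it (the credentials and paths are placeholders):

 #!/bin/sh
 # Hypothetical compress-and-detach step for the server-side backup script:
 # dump, then gzip, all in a backgrounded subshell so the caller returns at once.
 ( mysqldump --user=backupuser --password=secret daniweb > /home/backups/daniweb.sql
   gzip -f /home/backups/daniweb.sql ) &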

Now, she is going to need that batch script. I found a neat thing called expect to handle this on my side.

 #!/usr/bin/expect
 #
 # This is expect script copyfromserver.bat
 # It uses the expect code to prompt for usernames and passwords

 ###
 # Prerequisites
 #
 # We assume that the file exists on a server, and the password will remain static.
 # We assume that the local file has been deleted before this script executes
 ###

 # Revision 0:  Initial deployment

 # Expect has a timeout.  We need to eliminate it.
 # Might create a process problem if network issues arise.
 set timeout -1

 spawn scp remoteaccount@remotedomain:/path/filename /local/path/filename
 send "\r"
 expect "password:"
 send "password\r"
 expect "]"

Note that the \r means a carriage return, and it has to be in the send command. Also, the last expect string should be whatever your system uses... my Linux box has a ] near the end of the prompt. Also note that the #!/usr/bin/expect line has no spaces in it, and your local installation of expect may be in a different path.

Make these text files readable to the root user only, so that other accounts cannot see the unfortunate cleartext password stored there.
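
On the Mac that boils down to a couple of commands (the path is just an example):

 # Hypothetical: the expect script contains a cleartext password, so make it
 # owned by root and readable/executable by root only.
 chown root:wheel /usr/local/bin/copyfromserver.bat
 chmod 700 /usr/local/bin/copyfromserver.bat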

Christian

Well.. let's look at your approach..
You want to securely transmit a file from system A to system B.
You could use ssh, or more specifically do a DSA keygen for ssh and then copy
the key to $user/.ssh/ on the remote system (both systems may need to be set up).
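
The key setup itself is just a few commands, roughly like this (usernames and hostnames are placeholders):

 # Generate a DSA key pair on the local box (take the default path, empty
 # passphrase so cron can use it unattended)
 ssh-keygen -t dsa
 # Copy the public key over to the remote account...
 scp ~/.ssh/id_dsa.pub remoteuser@remotehost:
 # ...and append it to the authorized keys on the remote side
 ssh remoteuser@remotehost 'cat id_dsa.pub >> ~/.ssh/authorized_keys'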

Next, build a custom config file in $user/.ssh & test the connection from A to B.
The config file needs to contain the following lines, and cannot be configured until
the user has ssh'd to the remote system at least once:
Host *
ForwardX11 no
BatchMode yes
Compression yes


Last, sftp to the dump site with those options and pull the file down
via the ftp-like automated interface.

The end syntax looks something like:

sftp $USER@$HOST:/dumpdir/dumpfile localfiletocopy
A nice one-liner for a script.
The setup is somewhat tedious and would make a nice writeup.
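
Wrapped up for cron it is nothing more than this (names and paths are placeholders):

 #!/bin/sh
 # Hypothetical cron wrapper around the sftp one-liner. With the key and
 # BatchMode in place there is no password prompt, so it runs unattended.
 sftp backupuser@www.example.com:/dumpdir/daniweb.sql.gz /Users/dani/backups/daniweb-$(date +%Y%m%d).sql.gz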

Definitely a lo-tek solution for a poor man's backup.
Better than tape crap.. God, I hate proprietary tape drives
on Intel boxes..

grumble,
Cain

So much attention, Dani.. decisions, decisions.. lol

If you've got Darwin installed on your Mac, you should be able to use wget natively. If not, like everyone else has suggested, compiling it will work. I've done it myself on a G3 running OS X.1, and it worked just fine.

Because this is the first page I came to when googling for "wget equivalent mac", and this page did not provide the answer, I'll add it, as I found an answer by looking further through the results. Sorry for bumping a 6-year-old thread. Also, I don't know when this was added to Mac OS X, so it is possible that it wasn't available when cscgal asked for it.

curl

Simple as that.
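
For example, to grab a file the way you would with wget (the URL is just an example):

 # -O saves the file under its remote name; use -o <name> to pick a local name
 curl -O http://www.example.com/backups/daniweb.sql.gz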

Hey there F-3000. Welcome to DaniWeb! Thank you for that long-awaited solution, haha! No worries about bumping an old thread, as long as it is still relevant and useful to everyone out there who might stumble upon this thread from the search engines.

Ha! And one year later I stumble onto the same thread as F-3000, for the same reason.
I'd just like to add that I don't think curl can recursively download a whole website, as wget can, but since I have little need for that, curl is my tool of choice too. See this site for a nice comparison.
