Hi everybody!

These days I've been working on the task of sending a batch of files, specified by the Unix filespec convention (all metacharacters allowed), to a remote host using ftp. I'm thinking of a design where the main bash script performs the following subtasks:
(1) Expand the command-line arguments into a file list (file1, file2, ..., fileN);
(2) Open the FTP client (ftp) in the background;
(3) Establish the connection with the "host";
(4) Send file1, file2, ..., fileN one by one, echoing each action and its result to stdout;
(5) Close the FTP session running as a child process in the background;
(6) That's all folks!
I've done the first three subtasks (at least I hope so), but I can't write the code that checks whether a file transfer succeeded. I've never solved such a task before, but I feel there must be a way/technique in bash to do it. I searched web forums for help, but they mainly recommended using expect (a Tcl-based scripting language). I'm sure it can be achieved without expect.
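
For illustration, subtasks (1) and (4) in my mind look roughly like this (just a sketch, assuming the patterns arrive as positional arguments and the shell expands them before the script runs):

files=( "$@" )                     # (1) metacharacters already expanded by the shell
for f in "${files[@]}"; do         # (4) send one by one, echoing action and result
    echo "sending $f ..."
    # ...the transfer and the success check I'm asking about would go here...
done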

I hope somebody can give me just a little hint to move forward. If anything is unclear, don't hesitate to ask here or contact me via PM. Many thanks in advance.

p.s.: Sorry for my English.


lrirwin:

I don't think you can control an ftp client running in the background... It would be a severe security problem, since anyone who could pipe commands to the connected client could hijack it...

But if you first created an ftp script for all the files you wanted to send, in a file named "ftpupload.script", that looked like this:

open [your_ftp_site]
user [your_login] [your_password]
prompt
binary
put file1
put file2
put file3
...
put fileN
close
quit

Then you could send the files and post-process the log this way:

PreCount=$(grep -c "^put " ftpupload.script)   # How many files we attempt to send
FTPARGS="-pnv"   # Passive mode, no auto-login (the script's "user" line logs in), verbose
ftp ${FTPARGS} > ftp.log 2>&1 < ftpupload.script

Status=99 # Set to an invalid value (i.e. failure)
# A common failure string is "550 Permission denied" on a duplicate filename;
# "226 Transfer complete" appears once for each file sent successfully.
Count=$(grep -c "^226 " ftp.log)   # Num of files that made it

# There were files to send and they all made it.
[ ${PreCount} -ne 0 -a ${Count} -eq ${PreCount} ] && Status=0
# No files made it and there were some files to send.
[ ${Count} -eq 0 -a ${PreCount} -gt 0 ] && Status=1
# No files made it, but there weren't any files to send.
[ ${Count} -eq 0 -a ${PreCount} -eq 0 ] && Status=2
# There were files to send, but not all of them made it.
[ ${PreCount} -ne 0 -a ${Count} -ne ${PreCount} ] && Status=3

You could also have the script execute "ls -l" after all the files are sent, so the full listing, with the byte count of each file, appears in the log... Then run a file-by-file byte-count comparison. (You might need to ensure that the remote folder is empty at the start of the upload session, so the listing contains only this upload...)
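
For example, if "ls -l" is the last command in ftpupload.script, something along these lines would do the comparison (a rough sketch; it assumes a Unix-style listing where field 5 is the byte count, that the remote folder holds only this upload, and that file1..file3 stand in for your real file list):

for f in file1 file2 file3; do
    local_size=$(wc -c < "$f")
    # Only long-listing lines for regular files start with "-"
    remote_size=$(awk -v name="$f" '$1 ~ /^-/ && $NF == name { print $5 }' ftp.log)
    if [ -n "$remote_size" ] && [ "$local_size" -eq "$remote_size" ]; then
        echo "$f OK"
    else
        echo "$f MISMATCH"
    fi
done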

Hope that helps!
Larry Irwin

lrirwin:

(You could use a named pipe as the input instead of "ftpupload.script" and feed your commands to the client via the named pipe - or, if you are across a network, you could have it listen on a port using a "netcat" listener technique and feed your commands to the client via netcat... A rough sketch of the named-pipe variant is below.)
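
Something like this, for instance (only a sketch; "cmds" is a made-up FIFO name and the site/login lines are placeholders):

mkfifo cmds
ftp -pnv < cmds > ftp.log 2>&1 &   # the client reads its commands from the FIFO
exec 3> cmds                       # hold a writer open so the FIFO doesn't hit EOF early
echo "open your_ftp_site" >&3
echo "user your_login your_password" >&3
echo "put file1" >&3
echo "close" >&3
echo "quit" >&3
exec 3>&-                          # closing the writer lets the client see EOF
wait                               # wait for the background ftp to finish
rm cmds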

Yes, I can use named pipes (FIFOs), but then ftp behaves very strangely. I can't keep control over the inter-process communication between the bash script (let's call it the main control loop) and the FTP client. Some commands make ftp reply immediately (e.g. pwd), but there are many commands that don't (e.g. status). It seems to me that the FTP client's standard output is buffered and that only some specific commands flush (unbuffer) it.

I hope that calling ftp under the script command with the -c and -f switches will help me somehow.
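
If I understand the man page correctly, it would be invoked something like this (untested; -q suppresses the banners, -f flushes output after each write, -c runs the given command under a pseudo-terminal, and "cmds" is the FIFO from the previous post):

script -q -f -c "ftp -pnv" ftp.log < cmds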

Sorry I changed my identity, but it was really necessary. However, it doesn't matter...

Hi, I found a very useful Linux command: stdbuf. The best way to disable the pipeline buffering is to run ftp wrapped in the stdbuf environment...
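
A minimal example, assuming GNU coreutils' stdbuf (-o0 and -e0 turn off stdout and stderr buffering for the wrapped process; "cmds" is the command FIFO again):

stdbuf -o0 -e0 ftp -pnv < cmds > ftp.log 2>&1 &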
