Until now, I was used to deploying static sites with a simple rsync, which synchronizes the locally generated directory with the files on the server. This time, however, I had to deal with a shared host that was only accessible through FTP. For me, FTP was a tool I hadn’t used since the 2000s, with CuteFTP and then FileZilla, dragging and dropping files… Fortunately, the command-line tool lftp makes it easy to keep a remote copy as a “mirror” of a local directory.
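For reference, the rsync workflow I was replacing looked more or less like this (paths and host are placeholders):
rsync -av --delete public/ user@domain.tld:www/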
Connection
To avoid typing your login credentials in the terminal and to log in automatically, let’s create a ~/.netrc file containing:
# ~/.netrc
machine yourserver login youruser password yourpassword
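Since this file stores a password in plain text, it is worth restricting its permissions to your own user:
chmod 600 ~/.netrc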
We can then connect via FTP with:
lftp user@domain.tld
It is also possible to connect with SFTP, which is more secure, by specifying the protocol and port with the following command:
lftp sftp://user@domain.tld:22
The first time you type a command, you may get the error Fatal error: Host key verification failed. This is because the server’s RSA key fingerprint is missing from the list of known hosts. To fix this, a simple ssh user@domain.tld, followed by answering yes when asked whether you accept the server, is enough.
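The exchange looks roughly like this (output abridged, the fingerprint is only an example):
ssh user@domain.tld
# The authenticity of host 'domain.tld' can't be established.
# RSA key fingerprint is SHA256:…
# Are you sure you want to continue connecting (yes/no)? yes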
Commands
To get a remote copy in a www directory that is exactly the same as a local public directory, use:
mirror -R public www --delete
I also use the options --ignore-time (to send only the files that actually need to be sent), --parallel=5 to open several simultaneous connections, which speeds up the upload substantially (the number you can use depends on your connection and your host), and -v to see the actions performed.
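If you want to preview what would be transferred before touching the server, lftp’s mirror also offers a --dry-run option (check your version’s man page); for example:
mirror -R public www --delete --ignore-time --dry-run -v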
Automated script
To run these commands automatically from a script, just feed them to lftp through a here-document (<< EOF) and finish with the quit 0 command. For example, with FTP (without a secure connection):
lftp user@domain.tld << EOF
mirror -R public www --delete --ignore-time --parallel=5 -v
quit 0
EOF
And with SFTP (connection via SSH):
lftp sftp://user@domain.tld:22 << EOF
mirror -R public www --delete --ignore-time --parallel=5 -v
quit 0
EOF
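Putting it all together, a small wrapper script can rebuild the site and push it in one go. A minimal sketch, assuming an SFTP host, a ./public output directory and a placeholder build command:
#!/bin/sh
# deploy.sh (hypothetical name): rebuild the site, then mirror it to the host
set -e
your-build-command   # placeholder: replace with your static site generator's build step
lftp sftp://user@domain.tld:22 << EOF
mirror -R public www --delete --ignore-time --parallel=5 -v
quit 0
EOF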