You could easily use wget for this and script it however you like. Here is a quick example of how you could use it to download and overwrite one of your sites in one line (-m mirrors the site recursively and -P sets the local destination directory):
wget -m -P ~/Sites/domain/ ftp://[username]:[password]@ftp.example.com/www/
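Since it is just a shell command, wrapping it for reuse is straightforward. A minimal sketch (the function name mirror_site and the argument layout are illustrative, not anything wget provides):

mirror_site() {
    # $1 = FTP URL to mirror, $2 = local destination directory
    wget -m -P "$2" "$1"
}

mirror_site "ftp://[username]:[password]@ftp.example.com/www/" ~/Sites/domain/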
So to handle multiple websites, you would use:
wget -m -P ~/Sites/ -i sites.txt
And your text file could look something like this:
ftp://username:password@ftp.site1.com/www/
ftp://username:password@ftp.site2.com/www/
ftp://username:password@ftp.site3.com/www/
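If you need per-site control (separate logging, error handling, and so on), a minimal sketch that loops over the same sites.txt works too; ~/Sites and wget.log below are illustrative paths, not wget defaults:

#!/bin/sh
# Mirror every FTP URL listed in sites.txt, one URL per line.
while IFS= read -r url; do
    wget -m -P ~/Sites/ -a ~/Sites/wget.log "$url"
done < sites.txt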
From the wget man page:
Recursive download:
  -r,  --recursive                 specify recursive download.
  -l,  --level=NUMBER              maximum recursion depth (inf or 0 for infinite).
       --delete-after              delete files locally after downloading them.
  -k,  --convert-links             make links in downloaded HTML or CSS point to
                                   local files.
  -K,  --backup-converted          before converting file X, back up as X.orig.
  -m,  --mirror                    shortcut for -N -r -l inf --no-remove-listing.
  -p,  --page-requisites           get all images, etc. needed to display HTML page.
       --strict-comments           turn on strict (SGML) handling of HTML comments.

Recursive accept/reject:
  -A,  --accept=LIST               comma-separated list of accepted extensions.
  -R,  --reject=LIST               comma-separated list of rejected extensions.
  -D,  --domains=LIST              comma-separated list of accepted domains.
       --exclude-domains=LIST      comma-separated list of rejected domains.
       --follow-ftp                follow FTP links from HTML documents.
       --follow-tags=LIST          comma-separated list of followed HTML tags.
       --ignore-tags=LIST          comma-separated list of ignored HTML tags.
  -H,  --span-hosts                go to foreign hosts when recursive.
  -L,  --relative                  follow relative links only.
  -I,  --include-directories=LIST  list of allowed directories.
       --trust-server-names        use the name specified by the redirection
                                   url last component.
  -X,  --exclude-directories=LIST  list of excluded directories.
  -np, --no-parent                 don't ascend to the parent directory.
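As an example of combining several of those options, one plausible invocation (the rejected suffixes and the excluded directory are illustrative choices) mirrors a site without ascending past the start directory, grabs page requisites, converts links for local browsing, and skips archives:

wget -m -np -p -k -R "zip,gz" -X /www/logs -P ~/Sites/domain/ ftp://[username]:[password]@ftp.example.com/www/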