Ever since I started my blog, I have been looking for a neat way to upload my pages whenever Nanoblogger updates the local copy.
A plain recursive upload wouldn’t do, because some directories (e.g. data) shouldn’t be publicly accessible. My first attempt was a script using ncftpput, but it always uploaded the whole site.
I wanted a more intelligent solution that copies only new and changed files. This is presumably a very common task, so there should be plenty of ways to achieve it. But almost all FTP clients capable of intelligent uploads involve some GUI. The only command-line client I could find is lftp (available on OS X from Fink and from DarwinPorts).
So here is my script:
#!/bin/sh
#
# publish a nanoblogger blog

# publishing variables
PUBLISH_HOST="www.your.blog"
PUBLISH_LOGIN="yourlogin"
PUBLISH_PATH="/path/to/your/blog"

# exclude everything by default (-X *), then explicitly include
# the directories and file types that should be public
PUBLISH_PATTERN="-X * -I archives/ -I images/ -I styles/ -I articles/ -I smilies/"
PUBLISH_PATTERN="$PUBLISH_PATTERN -I *.html -I *.xml -I *.rdf -I *.css -I robots.txt"
PUBLISH_PATTERN="$PUBLISH_PATTERN -I *.png -I *.jpg -I *.gif -I *.ico"

# mirror -R reverses the direction: upload local files to the server
lftp -c "open -u $PUBLISH_LOGIN $PUBLISH_HOST; mirror -v -R $PUBLISH_PATTERN . $PUBLISH_PATH"
Of course, you have to adjust the first three variables to match your server. The PUBLISH_PATTERN variable should be fine for the current version of Nanoblogger.
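If you want to sanity-check the include/exclude patterns before contacting the server, you can assemble the lftp command string and just print it instead of running it. Here is a small sketch of that idea; the host, login, and path are placeholder values you would replace with your own:

```shell
#!/bin/sh
# Sketch: build the lftp command as the publish script does,
# but echo it instead of executing it, so the pattern list
# can be inspected before any upload happens.
PUBLISH_HOST="www.example.com"      # placeholder
PUBLISH_LOGIN="yourlogin"           # placeholder
PUBLISH_PATH="/path/to/your/blog"   # placeholder
PUBLISH_PATTERN="-X * -I archives/ -I images/ -I styles/"
PUBLISH_PATTERN="$PUBLISH_PATTERN -I *.html -I *.xml -I *.css"

# the same command string the real script would pass to lftp -c
CMD="open -u $PUBLISH_LOGIN $PUBLISH_HOST; mirror -v -R $PUBLISH_PATTERN . $PUBLISH_PATH"

# print instead of run; swap echo for lftp -c "$CMD" once it looks right
echo "lftp -c \"$CMD\""
```

Once the printed command looks correct, replacing the final echo with the actual lftp invocation turns the dry run into the real upload.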