As a web developer, I often encounter clients whose hosting package is limited, or ‘secured’ by the hosting provider. That means I’m sometimes forced to use the dreaded FTP for file transfers rather than SCP.
This is fine when it’s just a few files here or there. In fact, using a GUI client can sometimes be convenient. For large-scale development, though, I’d rather copy the site over to my development server than work locally and fill up my hard drive.
If the client’s hosting provider does not allow SSH access, you can still use FTP from the command line 🙂
ftp ftp.example.com will get you there. Then, depending on your FTP client (not your flavor of Linux), you can use mget to pull multiple files at once. Some clients even offer the awesomeness of recursive retrieval; plain mget, though, won’t descend into subdirectories.
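If you’d rather not type the session by hand, the classic ftp client will happily read its commands from a file. Here’s a sketch; the host, user, password, and directory are all placeholders, and the actual ftp invocation is left commented out:

```shell
# Build a batch of FTP commands in a file ("prompt" toggles off the
# per-file confirmation so mget fetches every match without asking):
cat > ftp-batch.txt <<'END'
user myuser mypassword
prompt
cd somedirectory
mget *
bye
END

# Feed the batch to the classic ftp client; -n suppresses auto-login
# so the "user" line above supplies the credentials instead:
# ftp -n ftp.example.com < ftp-batch.txt
```

Handy for a one-off grab, but remember mget still only fetches files in the current directory.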
When you are not afforded that goodness, I’ve discovered that wget does the trick even better 🙂
wget -r "ftp://user:[email protected]/somedirectory"
That’ll recursively get it all for you (up to wget’s default depth of five directory levels) 🙂 Better yet… mirror:
wget -m "ftp://user:[email protected]/somedirectory"
That turns on recursion with no depth limit on directories… and… gives your local copies the same timestamps as the files on the remote server, so a re-run only fetches what has changed.
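To see what -m buys you, here it is spelled out with the long-form flags it expands to (wget’s manual describes -m as equivalent to this combination). The URL, user, and password are placeholders; the real invocation is left commented out:

```shell
# Placeholder credentials and host -- substitute your own.
FTP_URL="ftp://user:[email protected]/somedirectory"

# "wget -m" is equivalent to:
#   -r                    recursive retrieval
#   -l inf                no limit on recursion depth
#   -N                    timestamping: keep remote mtimes, skip unchanged files
#   --no-remove-listing   keep the .listing files wget uses for comparison
# Uncomment to run for real:
# wget -r -l inf -N --no-remove-listing "$FTP_URL"
```

One caveat: a password embedded in the URL ends up in your shell history and process list, so for anything sensitive consider putting the credentials in ~/.netrc instead.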
Nice stuff.