Some nifty command line tools for Google Cloud Storage.
https://cloud.google.com/storage/docs/gsutil
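For example, once gsutil is installed, listing a bucket and copying files up look something like this (the bucket and file names here are just placeholders):
gsutil ls gs://my-bucket
gsutil cp backup.tar.gz gs://my-bucket/backups/
gsutil rsync -r ./site gs://my-bucket/site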
This is a little dumping ground where I can store useful recipes.
Find a file that was modified between two points in time, say on the same day…
touch -t 201608200000 start
touch -t 201608202359 stop
find . -newer start \! -newer stop
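If your find is GNU find, you can skip the marker files and give the timestamps directly with -newermt (same day as above):
find . -newermt "2016-08-20 00:00" ! -newermt "2016-08-20 23:59"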
Find all files with a particular name or extension and delete them if you want
find . -name "*.bak" -type f -delete
Just run without -delete to review before you do it.
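In other words, the review pass is just:
find . -name "*.bak" -type f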
Find files or directories with (or without) a certain set of permissions
Find files that don’t have permissions of 644
find /path/to/dir/ -type f ! -perm 0644 -print0
Find files that don’t have permissions of 644 and change them
find /path/to/dir/ -type f ! -perm 0644 -print0 | xargs -0 chmod 644
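The same fix works without xargs, using find’s own -exec with the + terminator:
find /path/to/dir/ -type f ! -perm 0644 -exec chmod 644 {} +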
Counts the lines in all .php and .html files under the current directory that aren’t inside the “includes” or “forum” directories.
wc -l runs a line count on each file that matches; the “tr” through “bc” chain takes those numbers and adds them up.
find . -not \( -path ./includes -prune \) -not \( -path ./forum -prune \) -regex '.*/.*\(php\|html\)' -exec wc -l \{\} \; | tr -s ' ' ' ' | cut -d ' ' -f 2 | paste -sd+ - | bc
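If all that quoting gets fiddly, an equivalent way to total the counts is to let awk do the adding (same find expression, different tail end):
find . -not \( -path ./includes -prune \) -not \( -path ./forum -prune \) -regex '.*/.*\(php\|html\)' -exec wc -l \{\} \; | awk '{total += $1} END {print total}'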
Ok, it’s been driving me nuts for a while now that I’ve not been able to migrate files from a webserver directly up to an Amazon S3 account. I know this has been around a bit, but I finally found it and had great success 🙂
Tim Kay created his own tool that can be installed on most servers 🙂 Check out his aws tool.
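From memory, uploading with it is roughly this (the bucket and file names are placeholders; check the tool’s own docs for the exact syntax):
aws put mybucket/backup.tar.gz backup.tar.gz
aws ls mybucket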
As a web developer, I often encounter clients who have a hosting package that is limited, or ‘secured’, by the hosting provider. That means I’m sometimes forced to use the dreaded FTP for file transfers rather than SCP.
This is OK when it’s just a few files here or there. In fact, using the GUI can sometimes be convenient. However, if I’m doing large-scale development, I’d rather copy the site over to my development server than work locally and fill up my hard drive.
If the client’s hosting provider does not allow SSH access, then you can use FTP from the command line 🙂
ftp ftp.example.com will get you there. Then, depending upon your flavor of Linux, you can use MGET to pull files. Sometimes you are even offered the awesomeness of a recursive MGET *.
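A typical session looks something like this (host and directory are placeholders):
ftp ftp.example.com
ftp> cd somedirectory
ftp> prompt
ftp> mget *
(prompt toggles off the per-file confirmation so mget doesn’t ask about every single file.)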
When you are not afforded that goodness, I’ve discovered that WGET does the trick even better 🙂
wget -r "ftp://user:password@ftp.example.com/somedirectory"
That’ll recursively get it all for you 🙂 Better yet…mirror
wget -m "ftp://user:password@ftp.example.com/somedirectory"
That turns on recursion with infinite depth on directories…and…keeps the same timestamps as exist on the remote server.
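If you don’t want the extra ftp.example.com/ directory that wget creates locally, -nH (--no-host-directories) flattens that out:
wget -m -nH "ftp://user:password@ftp.example.com/somedirectory"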
Nice stuff.