5 Linux Commands: timeout, cpulimit, awk, tar and youtube-dl

I know how much you love random Linux commands, so here I’ve compiled some cool random Linux commands to copy, convert, limit, kill and redirect things.

Start COMMAND, and kill it if still running after 5 seconds

timeout 5s COMMAND
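A quick way to see what timeout actually does is to watch its exit status; a small sketch using sleep as the stand-in command:

```shell
# If the command outlives the limit, timeout kills it and exits with 124
timeout 1s sleep 5
echo "exit status: $?"   # 124

# If the command finishes in time, its own exit status passes through
timeout 5s true
echo "exit status: $?"   # 0
```

That 124 is handy in scripts for telling "the command failed" apart from "the command was killed for taking too long".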

Convert Youtube videos to MP3

youtube-dl -t --extract-audio --audio-format mp3 YOUTUBE_URL_HERE
youtube-dl has this functionality built in. If you’re running an older version of youtube-dl, you can update it using `youtube-dl -U` (although if you have an older version, it probably can’t download YouTube videos anymore anyway).

youtube-dl --help will show you other options that may come in useful.

Limit the CPU usage of a process

sudo cpulimit -p pid -l 50
This limits the process with the given PID to an average of 50% CPU usage (the -l value is a percentage of one core).

Target a specific column for pattern substitution

awk '{gsub("foo","bar",$5)}1' file
Awk replaces every instance of foo with bar in the 5th column only.
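A quick demonstration with made-up input lines; the trailing `1` is awk’s always-true pattern, which prints every (possibly modified) line. Note that foo in any other column survives:

```shell
# foo in column 5 is replaced; foo in columns 1 and 6 is left alone
printf 'a b c d foo f\nfoo b c d x foo\n' | awk '{gsub("foo","bar",$5)}1'
# prints:
#   a b c d bar f
#   foo b c d x foo
```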

Redirect tar extract to another directory

tar xfz filename.tar.gz -C PathToDirectory
This extracts the tarball’s contents into the specified directory instead of the current one.
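A round-trip sketch (the /tmp paths are made up for the demo; -C works on both create and extract):

```shell
# Make a tiny tarball from one directory, then extract it into another
mkdir -p /tmp/demo-src /tmp/demo-dest
echo "hello" > /tmp/demo-src/file.txt
tar czf /tmp/demo.tar.gz -C /tmp/demo-src file.txt
tar xzf /tmp/demo.tar.gz -C /tmp/demo-dest
cat /tmp/demo-dest/file.txt   # hello
```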

Linux Command Line and MySQL: Awesome

MySQL is the world’s most popular open source database. Whether you are a fast growing web property, technology ISV or large enterprise, MySQL can cost-effectively help you deliver high performance, scalable database applications. Check out this site MySQL Commands for a nice MySQL cheat sheet.
UrFix.com, however, has created a list of commands I use almost daily when monitoring and maintaining my LAMP server. I hope you find these useful…

Monitor the queries being run by MySQL


watch -n 1 mysqladmin --user= --password= processlist

Watch is a very useful command for periodically running another command, in this case running mysqladmin to display the processlist. This is useful for spotting which queries are clogging up your server.
More info here: http://codeinthehole.com/archives/2-Monitoring-MySQL-processes.html

Backup all MySQL Databases to individual files


for I in $(mysql -e 'show databases' -s --skip-column-names);
do mysqldump $I | gzip > "$I.sql.gz"; done

I put this in a cron job to run at midnight: a “lazy backup”.
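As a sketch, the crontab entry could look like this (the /backups path is made up; credentials are assumed to come from a ~/.my.cnf, and remember that any % sign in a crontab needs escaping):

```shell
# Hypothetical crontab line: dump every database to /backups at midnight
0 0 * * * for I in $(mysql -e 'show databases' -s --skip-column-names); do mysqldump $I | gzip > "/backups/$I.sql.gz"; done
```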

Copy a MySQL Database to a new Server via SSH with one command


mysqldump --add-drop-table --extended-insert --force --log-error=error.log
 -uUSER -pPASS OLD_DB_NAME | ssh -C user@newhost "mysql -uUSER -pPASS NEW_DB_NAME"

Dumps a MySQL database over a compressed SSH tunnel and feeds it straight into mysql on the new host. I think that is the fastest and best way to migrate a DB to a new server!

Convert all MySQL tables and fields to UTF8


mysql --database=dbname -B -N -e "SHOW TABLES"  | awk '{print "ALTER TABLE", $1,
 "CONVERT TO CHARACTER SET utf8 COLLATE utf8_general_ci;"}' | mysql --database=dbname
&

Backup a remote database to your local filesystem


ssh user@host 'mysqldump dbname | gzip' >
/path/to/backups/db-backup-`date +%Y-%m-%d`.sql.gz

I have this on a daily cronjob to backup the urfix.com database from NearlyFreeSpeech.net (awesome hosts by the way) to my local drive. Note that (on my Ubuntu system at least) you need to escape the % signs on the crontab.

Export MySQL query as .csv file


echo "SELECT * FROM table; " | mysql -u root -p${MYSQLROOTPW}
databasename | sed 's/\t/","/g;s/^/"/;s/$/"/;s/\n//g' > outfile.csv

This command converts the result of a MySQL query directly into a .csv (Comma-Separated Values) file.
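You can see the sed part at work on a couple of fake tab-separated rows, the kind mysql emits (this assumes GNU sed, which understands \t in patterns; BSD sed may need a literal tab):

```shell
# Tabs become ",", and each line gets wrapped in opening/closing quotes
printf '1\tAlice\n2\tBob\n' | sed 's/\t/","/g;s/^/"/;s/$/"/'
# prints:
#   "1","Alice"
#   "2","Bob"
```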

Create an SSH tunnel for accessing your remote MySQL database with a local port


ssh -CNL 3306:localhost:3306 user@urfix.com

Count the number of queries to a MySQL server


echo "SHOW PROCESSLIST\G" | mysql -u root -p | grep "Info:" |
awk -F":" '{count[$NF]++}END{for(i in count){printf("%d: %s\n",
count[i], i)}}' | sort -n
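The awk counting idiom works on any colon-delimited input, so you can try it on fake processlist lines (note that with -F":" the last field keeps its leading space, hence the double space in the output):

```shell
printf 'Info: SELECT 1\nInfo: SELECT 1\nInfo: NULL\n' |
awk -F":" '{count[$NF]++}END{for(i in count){printf("%d: %s\n", count[i], i)}}' |
sort -n
# prints:
#   1:  NULL
#   2:  SELECT 1
```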

dump a single table of a database to file


mysqldump -u UNAME -p DBNAME TABLENAME > FILENAME

And there you have it, a nice list of hopefully useful commands that you can inspect and learn from.

8 Cool Ways To Use SCP

The SCP protocol is a network protocol, based on the BSD RCP protocol, which supports file transfers between hosts on a network. SCP uses Secure Shell (SSH) for data transfer and utilizes the same mechanisms for authentication, thereby ensuring the authenticity and confidentiality of the data in transit. A client can send (upload) files to a server, optionally including their basic attributes (permissions, timestamps). Clients can also request files or directories from a server (download). SCP runs over TCP port 22 by default. Like RCP, there is no RFC that defines the specifics of the protocol.

SCP is an awesome tool. Learn it, Love it, Use it….

Edit a file on a remote host using vim

vim scp://username@host//path/to/somefile

Colored diff (via vim) of 2 remote files on your local computer.

vimdiff scp://root@server-foo.com//etc/snmp/snmpd.conf scp://root@server-bar.com//etc/snmp/snmpd.conf

Restrict the bandwidth for the SCP command

scp -l10 user@urfix.com:/home/urfix/* .

The command is obvious, I know, but maybe not everyone knows that with the “-l” parameter you can limit the bandwidth scp uses, specified in Kbit/s.
In this example we fetch all the files from /home/urfix and copy them locally using only 10 Kbit/s.

Compare a remote file with a local file

vimdiff  scp://[@]/

Easily scp a file back to the host you’re connecting from

mecp () { scp "$@" ${SSH_CLIENT%% *}:Desktop/; }

Place in .bashrc and invoke like this: “mecp /path/to/file”, and it will copy the specified file(s) back to the desktop of the host you’re ssh’ing in from. To easily upload a file from the host you’re ssh’ing in from use this:

ucp (){ scp ${SSH_CLIENT%% *}:Desktop/upload/* .; }
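The trick in both functions is the ${SSH_CLIENT%% *} parameter expansion, which strips everything from the first space onward, leaving just the client’s IP. A demo with a made-up value:

```shell
# SSH_CLIENT normally holds "client_ip client_port server_port"
SSH_CLIENT="192.0.2.10 54321 22"
echo "${SSH_CLIENT%% *}"   # 192.0.2.10
```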

scp file from hostb to hostc while logged into hosta

scp user@hostb:file user@hostc:

While at the command line of hosta, scp a file from remote hostb to remote hostc. This saves the step of logging into hostb and then issuing the scp command to hostc.

Copy something to multiple SSH hosts with a Bash loop

for h in host1 host2 host3 host4 ; { scp file user@$h:/destination_path/ ; }

Just a quick and simple one to demonstrate a Bash for loop. Copies ‘file’ to multiple ssh hosts.
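The same loop shape with echo standing in for scp, just to show the brace-body syntax Bash accepts in place of do/done:

```shell
# Brace-bodied for loop; echo stands in for the real scp command
for h in host1 host2 host3 ; { echo "scp file user@$h:/destination_path/" ; }
# prints one line per host
```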

scp with compression.

scp -C 10.0.0.4:/tmp/backup.sql /path/to/backup.sql

-C is for compression.

How To Enable Facebook Timeline Today – And A Preview TOO

This morning Facebook announced Timeline, an awesome look at everything that has happened in your Facebook lifespan. It’s like a storybook of your online life.

Facebook is enabling Timeline early for open graph developers. As a reader of UrFix Blog you’ll learn how to get started today.

1. Log into Facebook

2. Enable developer mode, if you haven’t already. To do this, type “developer” into the Facebook search box, click the first result (it should be an app made by Facebook with a few hundred thousand users), and add the app.

3. Jump into the developer app (if Facebook doesn’t put you there automatically, it should be in your left-hand tool bar)

4. Create a new app (don’t worry — you won’t actually be submitting this for anyone else to see/use). Give your shiny new app any display name and namespace you see fit. Read through and agree to the Platform Privacy agreement. This is the step you need to be verified for.

5. Ensure you’re in your new app’s main settings screen. You should see your app’s name near the top of the page

6. Look for the “Open Graph” header, and click the “Get Started using open graph” link.

7. Create a test action for your app, like “read” a “book”, or “eat” a “sandwich”

8. This should drop you into an action type configuration page. Change a few of the default settings (I changed the past tense of “read” to “redd”; again, only you can see this unless you try to submit your application to the public directory), and click through all three pages of settings

9. Wait 2-3 minutes

10. Go back to your Facebook homescreen. An invite to try Timeline should be waiting at the top of the page

Only you and other developers will be able to see your new profile until September 29th.


here is what mine looks like

and even a map of places I’ve checked in….

Have fun, and remember this is still in beta so you are bound to find a bug or two.