UrFix's Blog

A geek without a cause

  • Top 25 SED Commands

Sed is a stream editor. A stream editor is used to perform basic text transformations on an input stream. While in some ways similar to an editor which permits scripted edits (such as ed), sed works by making only one pass over the input(s), and is consequently more efficient. But it is sed's ability to filter text in a pipeline which particularly distinguishes it from other types of editors.

    Here are the top 25 SED commands voted by everyone

    1) Stream YouTube URL directly to mplayer.

mplayer -fs $(echo "http://youtube.com/get_video.php?$(curl -s $youtube_url | sed -n "/watch_fullscreen/s;.*\(video_id.\+\)&title.*;\1;p")")

This is the result of a several-week venture without X. I found myself totally happy without X (and by extension without Flash) and was able to do just about anything but watch YouTube videos… so this is the solution I came up with. I am sure this can be done better, but this does indeed work… and tends to work far better than YouTube's ghetto proprietary Flash player ;-)

Replace $youtube_url with any YouTube URL you want and this will scrape the site for the _real_ URL to the full-quality .FLV file on YouTube's server, and will then hand that over to mplayer (or vlc or whatever you want) to be streamed.

    In some browsers you can replace $youtube_url with just a % or put this in a shell script so all YouTube URLs can be handed directly off to your media player of choice for true streaming without the need for Flash or a downloader like clive. (I do however fully recommend clive if you wish to archive videos instead of streaming them)

    If any interest is shown I would be more than happy to provide similar commands for other sites. Most streaming flash players use similar logic to YouTube.

    Also, if you want to download videos, just add the -dumpstream option to mplayer.

    2) Google Translate

translate(){ wget -qO- "http://ajax.googleapis.com/ajax/services/language/translate?v=1.0&q=$1&langpair=$2|${3:-en}" | sed 's/.*"translatedText":"\([^"]*\)".*}/\1\n/'; }

    Usage:

translate <phrase> <source-language> <output-language>

Example:

translate hello en es

See this for a list of language codes:

    http://en.wikipedia.org/wiki/List_of_ISO_639-1_codes

    3) Print all the lines between 10 and 20 of a file

    sed -n ‘10,20p’ <filename>

Similarly, if you want to print from line 10 to the end of the file you can use: sed -n '10,$p' filename
This is especially useful if you are dealing with a large file. Sometimes you just want to extract a sample without opening the entire file.
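To see the range behavior in isolation, here is a quick demo using seq as throwaway input (illustrative, not from the original post):

```shell
# Print lines 10 through 20 of a 30-line stream
seq 30 | sed -n '10,20p'

# Print from line 10 through the end of the input
seq 30 | sed -n '10,$p'
```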

    4) Graphical tree of sub-directories

ls -R | grep ":$" | sed -e 's/:$//' -e 's/[^-][^\/]*\//--/g' -e 's/^/   /' -e 's/-/|/'

    Prints a graphical directory tree from your current directory

    5) Check your unread Gmail from the command line

curl -u username --silent "https://mail.google.com/mail/feed/atom" | perl -ne 'print "\t" if /<name>/; print "$2\n" if /<(title|name)>(.*)<\/\1>/;'

    Checks the Gmail ATOM feed for your account, parses it and outputs a list of unread messages.

    6) To print a specific line from a file

    sed -n 5p <file>

You can grab one specific line during any procedure. Very handy when you know exactly which line you want.

    7) Find geographical location of an ip address

lynx -dump http://www.ip-adress.com/ip_tracer/?QRY=$1|grep address|egrep 'city|state|country'|awk '{print $3,$4,$5,$6,$7,$8}'|sed 's\ip address flag \\'|sed 's\My\\'

    I save this to bin/iptrace and run “iptrace ipaddress” to get the Country, City and State of an ip address using the http://ipadress.com service.

    I add the following to my script to get a tinyurl of the map as well:

URL=`lynx -dump http://www.ip-adress.com/ip_tracer/?QRY=$1|grep details|awk '{print $2}'`

lynx -dump http://tinyurl.com/create.php?url=$URL|grep tinyurl|grep "19. http"|awk '{print $2}'

    8) Convert PDF to JPG

for file in `ls *.pdf`; do convert -verbose -colorspace RGB -resize 800 -interlace none -density 300 -quality 80 $file `echo $file | sed 's/\.pdf$/\.jpg/'`; done

    (relies on ‘imagemagick’)

This command will convert all .pdf files in a directory into 800px (wide or high, whichever is smaller) .jpg images, with the aspect ratio preserved.

    If the file is named ‘example1.pdf’ it will be named ‘example1.jpg’ when it is complete.

    This is a VERY worthwhile command! People pay hundreds of dollars for this in the Windows world.

My .jpg files average between 150kB and 300kB, but yours may differ.

    9) Remove a line in a text file. Useful to fix “ssh host key change” warnings

    sed -i 8d ~/.ssh/known_hosts

    10) Recursive search and replace old with new string, inside files

grep -rl oldstring . |xargs sed -i -e 's/oldstring/newstring/'

    recursively traverse the directory structure from . down, look for string “oldstring” in all files, and replace it with “newstring”, wherever found

    also:

grep -rl oldstring . |xargs perl -pi~ -e 's/oldstring/newstring/'
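One caveat: plain xargs splits on whitespace, so filenames containing spaces break the pipeline. A null-delimited variant is safer (a sketch relying on GNU grep's -Z, xargs -0, and GNU sed's -i; BSD sed wants -i '' instead — the demo paths below are made up):

```shell
# Set up a small demo tree (hypothetical paths, purely for illustration)
mkdir -p /tmp/replace-demo && cd /tmp/replace-demo
printf 'oldstring here\n' > 'file with spaces.txt'

# -Z emits NUL-separated filenames and -0 consumes them,
# so paths containing spaces survive intact
grep -rlZ oldstring . | xargs -0 sed -i 's/oldstring/newstring/g'
```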

    11) Efficiently print a line deep in a huge log file

sed '1000000!d;q' < massive-log-file.log

Sed stops parsing at the match and so is much more efficient than piping head into tail or similar. Grab a line range using:

    sed '999995,1000005!d' < my_massive_file

    12) View all date formats, Quick Reference Help Alias

alias dateh='date --help|sed "/^ *%a/,/^ *%Z/!d;y/_/!/;s/^ *%\([:a-z]\+\) \+/\1_/gI;s/%/#/g;s/^\([a-y]\|[z:]\+\)_/%%\1_%\1_/I"|while read L;do date "+${L}"|sed y/!#/%%/;done|column -ts_'

If you have used bash for any scripting, you've used the date command a lot. It's perfect for creating filenames dynamically within aliases, functions, and commands like the ones below. This is actually an update to my first alias, since a few commenters (below) had good observations on what was wrong with my first command.

    # creating a date-based ssh-key for askapache.github.com

ssh-keygen -f ~/.ssh/`date +git-$USER@$HOSTNAME-%m-%d-%g` -C 'webmaster@askapache.com' # /home/gpl/.ssh/git-gplnet@askapache.github.com-04-22-10

# create a tar+gzip backup of the current directory

tar -czf $(date +$HOME/.backups/%m-%d-%g-%R-`sed -u 's/\//#/g' <<< $PWD`.tgz) . # tar -czf /home/gpl/.backups/04-22-10-01:13-#home#gpl#.rr#src.tgz .

I personally find myself having to reference date --help quite a bit as a result, so this nice alias saves me a lot of time. This is one bdash mofo. Works in sh and bash (POSIX), but will likely need to be changed for other shells due to the parameter substitution going on. Just extend the sed command; I prefer sed to pretty much everything anyway, but it's always preferable to put in the extra effort to use as many builtins as you can. Otherwise it's not a top one-liner, it's a lazyboy recliner.

    Here’s the old version:

alias dateh='date --help|sed "/^ *%%/,/^ *%Z/!d;s/ \+/ /g"|while read l;do date "+ %${l/% */}_${l/% */}_${l#* }";done|column -s_ -t'

This trick is from http://www.askapache.com/linux-unix/bash_profile-functions-advanced-shell.html

    13) Generate a Random MAC address

MAC=`(date; cat /proc/interrupts) | md5sum | sed -r 's/^(.{10}).*$/\1/; s/([0-9a-f]{2})/\1:/g; s/:$//;'`

    Original author unknown (I believe off of a wifi hacking forum).

Used in conjunction with ifconfig and cron, it can be handy (especially for spoofing APs).

    14) Change prompt to MS-DOS one (joke)

export PS1="C:\$( pwd | sed 's:/:\\\\\\:g' )\\> "

    15) Delete the specified line

    sed -i 8d ~/.ssh/known_hosts

    16) geoip information

curl -s "http://www.geody.com/geoip.php?ip=$(curl -s icanhazip.com)" | sed '/^IP:/!d;s/<[^>][^>]*>//g'

    This script gives a single line as shown in the sample output.

    17) Stream YouTube URL directly to mplayer

id="dMH0bHeiRNg";mplayer -fs http://youtube.com/get_video.php?video_id=$id\&t=$(curl -s http://www.youtube.com/watch?v=$id | sed -n 's/.*, "t": "\([^"]*\)", .*/\1/p')

    The original doesn’t work for me – but this does. I’m guessing that Youtube updated the video page so the original doesn’t work.

    18) Another Curl your IP command

curl -s http://checkip.dyndns.org | sed 's/[a-zA-Z<>/ :]//g'

    Just another curl command to get your public facing IP

    19) Print just line 4 from a textfile

sed -n '4{p;q}'

    Prints the 4th line and then quits. (Credit goes to flatcap in comments: http://www.commandlinefu.com/commands/view/6031/print-just-line-4-from-a-textfile#comment.)
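The difference from a plain sed -n 4p is only efficiency: both print line 4, but the {p;q} form quits immediately instead of reading the rest of the input. A quick check:

```shell
# Both forms print only line 4 of a 100-line stream
seq 100 | sed -n '4{p;q}'
seq 100 | sed -n 4p
```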

    20) Working random fact generator

wget randomfunfacts.com -O - 2>/dev/null | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;"

Though the site's design may change in the future and break this, it still serves as a simple, straightforward starting point.

    This uses the observation that the only item marked as strong on the page is the single logical line that includes the italicized fact.

If future revisions of the page cause failures, or intermittent failures, one may simply alter the above to read:

    wget randomfunfacts.com -O - 2>/dev/null | tee lastfact | grep \<strong\> | sed "s;^.*<i>\(.*\)</i>.*$;\1;"

    The file lastfact, can then be examined whenever the command fails.

    21) Given process ID print its environment variables

sed 's/\o0/\n/g' /proc/INSERT_PID_HERE/environ

    22) Pronounce an English word using Dictionary.com

pronounce(){ wget -qO- $(wget -qO- "http://dictionary.reference.com/browse/$@" | grep 'soundUrl' | head -n 1 | sed 's|.*soundUrl=\([^&]*\)&.*|\1|' | sed 's/%3A/:/g;s/%2F/\//g') | mpg123 -; }

    23) find and replace tabs for spaces within files recursively

find ./ -type f -exec sed -i 's/\t/  /g' {} \;
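Note that \t in a sed expression is a GNU sed extension; on BSD/macOS sed you can splice in a literal tab instead. A portable sketch:

```shell
# Capture a literal tab so the sed expression works on non-GNU seds too
TAB=$(printf '\t')
printf 'a\tb\n' | sed "s/$TAB/  /g"
```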

    24) count IPv4 connections per IP

netstat -anp |grep 'tcp\|udp' | awk '{print $5}' | sed s/::ffff:// | cut -d: -f1 | sort | uniq -c | sort -n

Useful in case of abuse/DoS attacks.

25) Display an embedded help message from a bash script header

[ "$1" == "--help" ] && { sed -n -e '/^# Usage:/,/^$/ s/^# \?//p' < $0; exit; }

    With this one liner you can easily output a standard help message using the following convention:

A line beginning with # Usage: is the start marker.

Printing stops at the first blank line, and the leading # is stripped from each line.
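A self-contained demonstration (the script path and header text are made up for the example):

```shell
# Create a tiny script whose header follows the "# Usage:" convention
cat > /tmp/help-demo.sh <<'EOF'
#!/bin/sh
# Usage: help-demo.sh [options]
#   -h  show this help

echo "doing real work"
EOF

# Extract and print the header block, stripping the leading "# "
sed -n -e '/^# Usage:/,/^$/ s/^# \?//p' < /tmp/help-demo.sh
```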


And there you have it: the top 25 voted SED commands.
These were brought to you by commandlinefu.
Check them out and, if possible, add your own command-line gem.
And remember: when in doubt, check the man pages.

  • Webmasters Block These IP Addresses

    by

These spam referrers have been very diligent about showing us their websites, so I will show you their IPs.

But first let me show you a few ways of blocking IPs.

Block with null routes.
Once you reboot, the routes will be gone, so this is only a temporary fix:
route add IP-ADDRESS gw 127.0.0.1 lo

You can also use the reject target:
route add -host IP-ADDRESS reject

But you might want to do something more permanent, like using iptables:
iptables -A INPUT -s IP-ADDRESS -j DROP

To keep my server light I did not include iptables in the kernel, so I wrote a script to make my life easier:

    count=0
    for i in `cat spamlist.txt`
    do
    count=`expr $count + 1`
echo "Line $count is being displayed"
    route add $i gw 127.0.0.1
    echo $i
    echo counting
    done
echo "End of file"
    echo done

    Just add the IP Addresses below to a file called spamlist.txt
    give the script execute permission

    chmod +x scriptfilename.sh

    and BAM!

So without further ado, I give you a great list of spammers.
    you asked for it spammers

    173.234.152.188
    173.234.52.34
    69.163.147.151
    109.111.184.1
    213.163.97.20
    184.82.38.7
    221.132.73.146
    173.234.13.162
    98.165.84.55
    173.203.243.138
    173.234.182.162
    173.208.95.16
    221.249.73.61
    38.99.89.252
    82.117.226.27
    173.208.124.63
    173.208.70.99
    72.44.50.58
    126.15.1.32
    72.188.60.3
    173.234.12.178
    118.69.192.62
    173.234.47.192
    173.234.46.171
    113.139.182.178
    204.124.183.196
    64.120.31.210
    173.234.153.152
    173.234.183.240
    79.2.190.199
    173.208.67.232
    202.198.164.114
    95.244.108.249
    174.34.171.6
    67.160.221.57
    24.121.181.7
    173.234.93.110
    173.234.92.138
    187.48.56.221
    173.203.78.165
    193.198.185.3
    62.215.5.66
    173.208.14.114
    173.234.92.166
    222.124.19.34
    173.208.51.26
    195.221.21.235
    200.216.186.42
    109.169.63.25
    173.208.71.168
    212.20.230.84
    64.182.124.219
    92.247.12.242
    210.101.131.231
    65.202.152.252
    78.169.34.83
    12.96.205.18
    180.241.250.39
    192.251.226.205
    200.165.90.210
    189.16.123.100
    174.142.104.57
    173.234.151.15
    173.234.48.57
    173.234.54.48
    222.127.148.210
    189.84.61.130
    173.224.209.100
    195.46.235.18
    208.115.221.178
    211.43.152.55
    211.43.152.54
    211.43.152.57
    211.43.152.49
    201.77.182.103
    67.202.108.170
    110.137.76.170
    189.75.119.10
    60.251.54.208
    61.19.127.212
    189.85.22.242
    74.82.176.137
    89.28.64.114
    173.234.12.166
    173.234.30.87
    190.12.2.174
    173.208.100.192
    38.96.193.74
    66.232.112.91
    217.10.246.2
    196.29.161.84
    201.76.211.246
    189.85.22.242
    61.19.127.212
    60.251.54.208
    189.75.119.10
    110.137.76.170
    67.202.108.170
    201.77.182.103
    189.6.168.62
    173.208.124.9
    118.97.67.134
    187.48.52.241
    69.71.222.186
    173.208.50.43
    189.126.63.85
    219.93.178.162
    189.17.16.130
    163.180.20.183
    173.203.112.170
    95.66.4.1
    173.234.167.188
    190.145.77.34
    72.32.182.210
    163.180.20.183
    67.159.178.199
    81.9.97.45
    220.225.219.165
    173.234.54.190
    189.6.168.62
    186.42.121.2
    118.96.146.90
    173.208.124.9
    118.97.64.88
    222.165.130.214
    174.34.169.210
    122.181.17.54
    222.165.130.214
    118.97.64.88
    81.18.116.66
    111.68.103.62
    89.148.238.87
    78.189.147.58
    200.94.71.73
    173.203.108.236
    61.219.80.80
    200.94.71.73
    71.230.128.156
    211.23.82.90
    189.114.58.245
    173.234.19.236
    24.111.190.251
    203.99.193.132
    202.43.180.146
    187.111.9.134
    189.83.234.9
    61.162.174.209
    190.202.124.18
    190.8.111.59
    210.246.92.3
    89.27.55.28
    94.237.74.250
    88.112.50.149
    77.109.196.243
    149.6.118.94
    50.16.63.173
    200.117.239.246
    188.36.197.28
    84.0.224.99
    222.124.178.98
    189.63.138.110
    222.124.8.13
    173.234.166.207
    190.109.169.176
    220.227.247.178
    119.110.97.28
    85.12.68.98
    189.85.60.18
    118.98.232.50
    119.110.97.28
    187.1.11.218
    18.181.2.157
    189.77.29.29
    27.131.172.9
    209.203.19.2
    187.111.1.194
    78.140.206.22
    203.172.212.2
    79.98.31.241
    202.28.66.115
    62.10.53.132

There might be a couple of duplicates, but that's OK.

  • 25 Best AWK Commands / Tricks

    by

    AWK is a data driven programming language designed for processing text-based data, either in files or data streams. It is an example of a programming language that extensively uses the string datatype, associative arrays (that is, arrays indexed by key strings), and regular expressions. WIKI

    Here are the most Kick ass voted AWK commands.

    1)  List of commands you use most often

history | awk '{a[$2]++}END{for(i in a){print a[i] " " i}}' | sort -rn | head

    2) Display a block of text with AWK

awk '/start_pattern/,/stop_pattern/' file.txt

    I find this terribly useful for grepping through a file, looking for just a block of text. There’s “grep -A # pattern file.txt” to see a specific number of lines following your pattern, but what if you want to see the whole block? Say, the output of “dmidecode” (as root):

dmidecode | awk '/Battery/,/^$/'

will show me everything from the Battery block up to the next blank line. Again, I find this extremely useful when I want to see whole blocks of text based on a pattern, and I don't care to see the rest of the data in the output. This could be used against the '/etc/securetty/user' file on Unix to find the block of a specific user. It could be used against VirtualHosts or Directories on Apache to find specific definitions. The scenarios go on for any text formatted in a block fashion. Very handy.
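The range pattern is easy to try on synthetic input (the marker names here are made up):

```shell
# awk prints every line from the first /START/ match
# through the next /STOP/ match, inclusive
printf 'a\nSTART\nb\nc\nSTOP\nd\n' | awk '/START/,/STOP/'
```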

    3) Graph # of connections for each hosts.

netstat -an | grep ESTABLISHED | awk '{print $5}' | awk -F: '{print $1}' | sort | uniq -c | awk '{ printf("%s\t%s\t",$2,$1) ; for (i = 0; i < $1; i++) {printf("*")}; print "" }'

    Written for linux, the real example is how to produce ascii text graphs based on a numeric value (anything where uniq -c is useful is a good candidate).

    4) Check your unread Gmail from the command line

curl -u username:password --silent "https://mail.google.com/mail/feed/atom" | tr -d '\n' | awk -F '<entry>' '{for (i=2; i<=NF; i++) {print $i}}' | sed -n "s/<title>\(.*\)<\/title.*name>\(.*\)<\/name>.*/\2 - \1/p"

    Checks the Gmail ATOM feed for your account, parses it and outputs a list of unread messages.

    For some reason sed gets stuck on OS X, so here’s a Perl version for the Mac:

curl -u username:password --silent "https://mail.google.com/mail/feed/atom" | tr -d '\n' | awk -F '<entry>' '{for (i=2; i<=NF; i++) {print $i}}' | perl -pe 's/^<title>(.*)<\/title>.*<name>(.*)<\/name>.*$/$2 - $1/'

If you want to see the name of the last person who added a message to the conversation, change the greediness of the operators like this:

    curl -u username:password --silent "https://mail.google.com/mail/feed/atom" | tr -d '\n' | awk -F '<entry>' '{for (i=2; i<=NF; i++) {print $i}}' | perl -pe 's/^<title>(.*)<\/title>.*?<name>(.*?)<\/name>.*$/$2 - $1/'

    5) Remove duplicate entries in a file without sorting.

awk '!x[$0]++' <file>

Using awk, remove duplicate lines from a file without sorting (sorting would reorder the contents). awk keeps the original order, prints each line only the first time it appears, and you can redirect the result into another file.
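A quick demonstration of the order-preserving behavior:

```shell
# x[$0]++ is 0 (false) the first time a line appears, so !x[$0]++
# is true exactly once per distinct line; input order is preserved
printf 'b\na\nb\nc\na\n' | awk '!x[$0]++'
```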

    6) find geographical location of an ip address

lynx -dump http://www.ip-adress.com/ip_tracer/?QRY=$1|grep address|egrep 'city|state|country'|awk '{print $3,$4,$5,$6,$7,$8}'|sed 's\ip address flag \\'|sed 's\My\\'

    I save this to bin/iptrace and run “iptrace ipaddress” to get the Country, City and State of an ip address using the http://ipadress.com service.

    I add the following to my script to get a tinyurl of the map as well:

URL=`lynx -dump http://www.ip-adress.com/ip_tracer/?QRY=$1|grep details|awk '{print $2}'`

lynx -dump http://tinyurl.com/create.php?url=$URL|grep tinyurl|grep "19. http"|awk '{print $2}'

    7) Block known dirty hosts from reaching your machine

wget -qO - http://infiltrated.net/blacklisted|awk '!/#|[a-z]/&&/./{print "iptables -A INPUT -s "$1" -j DROP"}'

Blacklisted is a compiled list of all known dirty hosts (botnets, spammers, bruteforcers, etc.) which is updated on an hourly basis. This command will get the list and create the rules for you; if you want them automatically blocked, append |sh to the end of the command line. It's a more practical solution to block all and allow in specifics; however, there are many who don't or can't do this, which is where this script comes in handy. For those using ipfw, a quick fix would be {print "add deny ip from "$1" to any"}. Posted in the sample output are the top two entries. Be advised that the blacklisted file itself filters out RFC 1918 addresses (10.x.x.x, 172.16-31.x.x, 192.168.x.x); however, it is advisable you check/parse the list before you implement the rules.

    8) Display a list of committers sorted by the frequency of commits

svn log -q|grep "|"|awk "{print \$3}"|sort|uniq -c|sort -nr

    Use this command to find out a list of committers sorted by the frequency of commits.

    9) List the number and type of active network connections

netstat -ant | awk '{print $NF}' | grep -v '[a-z]' | sort | uniq -c

    10) View facebook friend list [hidden or not hidden]

lynx -useragent=Opera -dump 'http://www.facebook.com/ajax/typeahead_friends.php?u=4&__a=1' |gawk -F'\"t\":\"' -v RS='\",' 'RT{print $NF}' |grep -v '\"n\":\"' |cut -d, -f2

There's no need to be logged in to Facebook. I could do more JSON filtering, but you get the idea…

    Replace u=4 (Mark Zuckerberg, Facebook creator) with desired uid.

Hidden or not hidden… Scary, isn't it?

11) List recorded form fields of Firefox

cd ~/.mozilla/firefox/ && sqlite3 `cat profiles.ini | grep Path | awk -F= '{print $2}'`/formhistory.sqlite "select * from moz_formhistory" && cd - > /dev/null

When you fill in a form with Firefox, it suggests things you entered in previous forms with the same field names. This command lists everything Firefox has recorded. Using a "delete from" statement, you can remove annoying entries, such as old Google queries, for example ;-)

    12) Brute force discover

sudo zcat /var/log/auth.log.*.gz | awk '/Failed password/&&!/for invalid user/{a[$9]++}/Failed password for invalid user/{a["*" $11]++}END{for (i in a) printf "%6s\t%s\n", a[i], i|"sort -n"}'

    Show the number of failed tries of login per account. If the user does not exist it is marked with *.

    13) Show biggest files/directories, biggest first with ‘k,m,g’ eyecandy

du --max-depth=1 | sort -r -n | awk '{split("k m g",v); s=1; while($1>1024){$1/=1024; s++} print int($1)" "v[s]"\t"$2}'

I use this on Debian testing. It works like the other sorted du variants, but I like small numbers and suffixes :)

    14) Analyse an Apache access log for the most common IP addresses

tail -10000 access_log | awk '{print $1}' | sort | uniq -c | sort -n | tail

    This uses awk to grab the IP address from each request and then sorts and summarises the top 10

    15) copy working directory and compress it on-the-fly while showing progress

tar -cf - . | pv -s $(du -sb . | awk '{print $1}') | gzip > out.tgz

What happens here is we tell tar to create "-c" an archive of all files in the current dir "." (recursively) and output the data to stdout "-f -". Next we specify the size "-s" to pv of all files in the current dir. The "du -sb . | awk '{print $1}'" returns the number of bytes in the current dir, and it gets fed as the "-s" parameter to pv. Next we gzip the whole content and output the result to the out.tgz file. This way "pv" knows how much data is still left to be processed and shows us, for example, that it will take yet another 4 mins 49 secs to finish.

    Credit: Peteris Krumins http://www.catonmat.net/blog/unix-utilities-pipe-viewer/

    16) List of commands you use most often

history | awk '{print $2}' | sort | uniq -c | sort -rn | head

    17) Identify long lines in a file

awk 'length>72' file

    This command displays a list of lines that are longer than 72 characters. I use this command to identify those lines in my scripts and cut them short the way I like it.
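For instance, with the threshold lowered to 20 for a small demo:

```shell
# Only lines longer than 20 characters are printed
printf 'short\nthis line is much longer than twenty characters\n' | awk 'length>20'
```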

    18) Makes you look busy

alias busy='my_file=$(find /usr/include -type f | sort -R | head -n 1); my_len=$(wc -l $my_file | awk "{print $1}"); let "r = $RANDOM % $my_len" 2>/dev/null; vim +$r $my_file'

    This makes an alias for a command named ‘busy’. The ‘busy’ command opens a random file in /usr/include to a random line with vim. Drop this in your .bash_aliases and make sure that file is initialized in your .bashrc.

    19) Show me a histogram of the busiest minutes in a log file:

cat /var/log/secure.log | awk '{print substr($0,0,12)}' | uniq -c | sort -nr | awk '{printf("\n%s ",$0) ; for (i = 0; i<$1 ; i++) {printf("*")};}'

    20) Analyze awk fields

awk '{print NR": "$0; for(i=1;i<=NF;++i)print "\t"i": "$i}'

Breaks down and numbers each line and its fields. This is really useful when you are going to parse something with awk but aren't sure exactly where to start.
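For example, feeding it a single sample line:

```shell
# Prints the numbered line, then each field with its index
echo 'alpha beta gamma' | awk '{print NR": "$0; for(i=1;i<=NF;++i)print "\t"i": "$i}'
```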

    21) Browse system RAM in a human readable form

sudo cat /proc/kcore | strings | awk 'length > 20' | less

    This command lets you see and scroll through all of the strings that are stored in the RAM at any given time. Press space bar to scroll through to see more pages (or use the arrow keys etc).

    Sometimes if you don’t save that file that you were working on or want to get back something you closed it can be found floating around in here!

    The awk command only shows lines that are longer than 20 characters (to avoid seeing lots of junk that probably isn’t “human readable”).

    If you want to dump the whole thing to a file replace the final ‘| less’ with ‘> memorydump’. This is great for searching through many times (and with the added bonus that it doesn’t overwrite any memory…).

    Here’s a neat example to show up conversations that were had in pidgin (will probably work after it has been closed)…

sudo cat /proc/kcore | strings | grep '([0-9]\{2\}:[0-9]\{2\}:[0-9]\{2\})'

(depending on sudo settings it might be best to run sudo su first to get to a # prompt)

    22) Monitor open connections for httpd including listen, count and sort it per IP

watch "netstat -plan|grep :80|awk {'print \$5'} | cut -d: -f 1 | sort | uniq -c | sort -nk 1"

    It’s not my code, but I found it useful to know how many open connections per request I have on a machine to debug connections without opening another http connection for it.

You can also decide to sort things out differently than the way they appear here.

    23) Purge configuration files of removed packages on debian based systems

sudo aptitude purge `dpkg --get-selections | grep deinstall | awk '{print $1}'`

    Purge all configuration files of removed packages

    24) Quick glance at who’s been using your system recently

last | grep -v "^$" | awk '{ print $1 }' | sort -nr | uniq -c

This command takes the output of the 'last' command, removes empty lines, gets just the first field ($USERNAME), sorts the $USERNAMEs in reverse order and then gives a summary count of unique matches.

    25) Number of open connections per ip.

netstat -ntu | awk '{print $5}' | cut -d: -f1 | sort | uniq -c | sort -n

Here is a command line to run on your server if you think it is under attack. It prints out a list of open connections to your server and sorts them by amount.

    BSD Version:

netstat -na |awk '{print $5}' |cut -d "." -f1,2,3,4 |sort |uniq -c |sort -nr

And there you have it: killer awk usages. Now I know you might be thinking these are NOT awk commands. Maybe not, but awk was used to filter out data.
Did I make a mistake?
Did I leave something cool behind?
Please feel free to comment.
  • 25 Best SSH Commands / Tricks

OpenSSH is a FREE version of the SSH connectivity tools that technical users of the Internet rely on. Users of telnet, rlogin, and ftp may not realize that their password is transmitted across the Internet unencrypted, but it is. OpenSSH encrypts all traffic (including passwords) to effectively eliminate eavesdropping, connection hijacking, and other attacks. The encryption that OpenSSH provides has been strong enough to earn the trust of Trend Micro and other providers of cloud computing. Additionally, OpenSSH provides secure tunneling capabilities and several authentication methods, and supports all SSH protocol versions.

SSH is an awesome, powerful tool, and there are unlimited possibilities when it comes to SSH. Here are the top-voted SSH commands.

    1) Copy ssh keys to user@host to enable password-less ssh logins.

    ssh-copy-id user@host

    To generate the keys use the command ssh-keygen

2) Start a tunnel from some machine's port 80 to your local port 2001

    ssh -N -L2001:localhost:80 somemachine

Now you can access the website by going to http://localhost:2001/

    3) Output your microphone to a remote computer’s speaker

    dd if=/dev/dsp | ssh -c arcfour -C username@host dd of=/dev/dsp

    This will output the sound from your microphone port to the ssh target computer’s speaker port. The sound quality is very bad, so you will hear a lot of hissing.

    4) Compare a remote file with a local file

ssh user@host cat /path/to/remotefile | diff /path/to/localfile -

    Useful for checking if there are differences between local and remote files.

    5) Mount folder/filesystem through SSH

    sshfs name@server:/path/to/folder /path/to/mount/point

    Install SSHFS from http://fuse.sourceforge.net/sshfs.html
This will allow you to mount a folder securely over a network.

    6) SSH connection through host in the middle

    ssh -t reachable_host ssh unreachable_host

    Unreachable_host is unavailable from local network, but it’s available from reachable_host’s network. This command creates a connection to unreachable_host through “hidden” connection to reachable_host.

    7) Copy from host1 to host2, through your host

ssh root@host1 "cd /somedir/tocopy/ && tar -cf - ." | ssh root@host2 "cd /samedir/tocopyto/ && tar -xf -"

    Good if only you have access to host1 and host2, but they have no access to your host (so ncat won’t work) and they have no direct access to each other.

     

    8) Run any GUI program remotely

     

    ssh -fX <user>@<host> <program>

    The SSH server configuration requires:

    X11Forwarding yes # this is default in Debian

    And it’s convenient too:

    Compression delayed

    9) Create a persistent connection to a machine

    ssh -MNf <user>@<host>

    Create a persistent SSH connection to the host in the background. Combine this with settings in your ~/.ssh/config:
    Host host
    ControlPath ~/.ssh/master-%r@%h:%p
    ControlMaster no
All the SSH connections to the machine will then go through the persistent SSH socket. This is very useful if you are using SSH to synchronize files (using rsync/sftp/cvs/svn) on a regular basis because it won't create a new socket each time you open an ssh connection.

    10) Attach screen over ssh

    ssh -t remote_host screen -r

    Directly attach a remote screen session (saves a useless parent bash process)

    11) Port Knocking!

    knock <host> 3000 4000 5000 && ssh -p <port> user@host && knock <host> 5000 4000 3000

    Knock on ports to open a port to a service (ssh for example) and knock again to close the port. You have to install knockd.
    See example config file below.
    [options]
    logfile = /var/log/knockd.log
    [openSSH]
    sequence = 3000,4000,5000
    seq_timeout = 5
command = /sbin/iptables -A INPUT -i eth0 -s %IP% -p tcp --dport 22 -j ACCEPT
    tcpflags = syn
    [closeSSH]
    sequence = 5000,4000,3000
    seq_timeout = 5
command = /sbin/iptables -D INPUT -i eth0 -s %IP% -p tcp --dport 22 -j ACCEPT
    tcpflags = syn

12) Remove a host key from known_hosts. Useful to fix "ssh host key change" warnings

    ssh-keygen -R <the_offending_host>

In this case it's better to use the dedicated tool.

    13) Run complex remote shell cmds over ssh, without escaping quotes

    ssh host -l user $(<cmd.txt)

Much simpler method. A more portable version: ssh host -l user "`cat cmd.txt`"
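The $(<file) construct is bash's built-in equivalent of $(cat file): the file's contents become the argument string before ssh ever runs. A local illustration (the file path and command are made up):

```shell
# Put a command in a file, then expand the file's contents
# into an argument, exactly as the ssh one-liner does
printf 'echo remote-ok\n' > /tmp/cmd.txt
bash -c "$(cat /tmp/cmd.txt)"
```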

    14) Copy a MySQL Database to a new Server via SSH with one command

mysqldump --add-drop-table --extended-insert --force --log-error=error.log -uUSER -pPASS OLD_DB_NAME | ssh -C user@newhost "mysql -uUSER -pPASS NEW_DB_NAME"

Dumps a MySQL database over a compressed SSH tunnel and uses it as input to mysql. I think that is the fastest and best way to migrate a DB to a new server!

    15) Remove a line in a text file. Useful to fix “ssh host key change” warnings

    sed -i 8d ~/.ssh/known_hosts

    16) Copy your ssh public key to a server from a machine that doesn’t have ssh-copy-id

cat ~/.ssh/id_rsa.pub | ssh user@machine "mkdir ~/.ssh; cat >> ~/.ssh/authorized_keys"

    If you use Mac OS X or some other *nix variant that doesn’t come with ssh-copy-id, this one-liner will allow you to add your public key to a remote machine so you can subsequently ssh to that machine without a password.
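    A slightly more careful variant of the same idea, as a made-up helper function: mkdir -p so an existing ~/.ssh isn't an error, plus explicit permissions, since sshd refuses group- or world-writable key files:

    ```shell
    # Append the local public key remotely, creating ~/.ssh with permissions
    # sshd will accept. $1 = user@host (placeholder).
    push_key() {
        cat ~/.ssh/id_rsa.pub | ssh "$1" \
          'mkdir -p ~/.ssh && chmod 700 ~/.ssh && cat >> ~/.ssh/authorized_keys && chmod 600 ~/.ssh/authorized_keys'
    }
    ```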

    17) Live ssh network throughput test

    yes | pv | ssh $host "cat > /dev/null"

    Connects to host via ssh and displays the live transfer speed, directing all transferred data to /dev/null.
    Needs pv installed.
    Debian: 'apt-get install pv'
    Fedora: 'yum install pv' (may need the 'extras' repository enabled)

    18) How to establish a remote Gnu screen session that you can re-connect to

    ssh -t user@some.domain.com /usr/bin/screen -xRR

    Long before tabbed terminals existed, people have been using Gnu screen to open many shells in a single text terminal. Combined with ssh, it gives you the ability to have many open shells with a single remote connection using the above options. If you detach with “Ctrl-a d” or if the ssh session is accidentally terminated, all processes running in your remote shells remain undisturbed, ready for you to reconnect. Other useful screen commands are “Ctrl-a c” (open new shell) and “Ctrl-a a” (alternate between shells). Read this quick reference for more screen commands: http://aperiodic.net/screen/quick_reference

    19) Resume scp of a big file

    rsync --partial --progress --rsh=ssh $file_source $user@$host:$destination_file

    It can resume a failed secure copy (useful when you transfer big files like db dumps through a vpn) using rsync.
    It requires rsync installed on both hosts.
    rsync --partial --progress --rsh=ssh $file_source $user@$host:$destination_file local -> remote
    or
    rsync --partial --progress --rsh=ssh $user@$host:$remote_file $destination_file remote -> local

    20) Analyze traffic remotely over ssh w/ wireshark

    ssh root@server.com 'tshark -f "port !22" -w -' | wireshark -k -i -

    This captures traffic on a remote machine with tshark, sends the raw pcap data over the ssh link, and displays it in wireshark. Hitting ctrl+C will stop the capture and unfortunately close your wireshark window. This can be worked around by passing -c # to tshark to only capture a certain number of packets, or by redirecting the data through a named pipe rather than piping directly from ssh to wireshark. I recommend filtering as much as you can in the tshark command to conserve bandwidth. tshark can be replaced with tcpdump thusly:
    ssh root@example.com tcpdump -w - 'port !22' | wireshark -k -i -
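    The named-pipe workaround mentioned above can be sketched like this (the function name, FIFO path, and host are placeholders): wireshark reads the FIFO, so interrupting the ssh capture doesn't take the GUI down with it.

    ```shell
    # Capture remotely into a FIFO that wireshark reads from.
    capture_via_fifo() {
        # $1 = path for the FIFO, $2 = remote host
        fifo=$1; host=$2
        [ -p "$fifo" ] || mkfifo "$fifo"
        ssh "root@$host" 'tshark -f "port !22" -w -' > "$fifo" &
        wireshark -k -i "$fifo"
    }
    ```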

    21) Have an ssh session open forever

    autossh -M50000 -t server.example.com 'screen -raAd mysession'

    Keeps an ssh session open forever; great on laptops that lose Internet connectivity when switching WIFI spots.

    22) Harder, Faster, Stronger SSH clients

    ssh -4 -C -c blowfish-cbc

    We force IPv4, compress the stream, and specify the cipher to be Blowfish. I suppose you could use aes256-ctr as well for the cipher spec. I'm of course leaving out things like master control sessions and such, as those may not be available on your shell, although they would speed things up as well.
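    These flags can also live in ~/.ssh/config so every connection to a host gets them. A hedged sketch (the Host alias is made up; note that blowfish-cbc has been removed from recent OpenSSH releases, so a ctr cipher is listed as fallback):

    ```
    # Equivalent of -4, -C and -c for one host:
    Host fastbox
        AddressFamily inet
        Compression yes
        Ciphers blowfish-cbc,aes256-ctr
    ```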

    23) Throttle bandwidth with cstream

    tar -cj /backup | cstream -t 777k | ssh host 'tar -xj -C /backup'

    This bzips a folder and transfers it over the network to "host", throttled to 777 kB/s (cstream's -t option takes bytes per second).
    cstream can do a lot more; have a look at http://www.cons.org/cracauer/cstream.html#usage
    for example:
    echo "w00t, i'm 733+" | cstream -b1 -t2

    24) Transfer SSH public key to another machine in one step

    ssh-keygen; ssh-copy-id user@host; ssh user@host

    This command sequence allows simple setup of (gasp!) password-less SSH logins. Be careful, as if you already have an SSH keypair in your ~/.ssh directory on the local machine, there is a possibility ssh-keygen may overwrite them. ssh-copy-id copies the public key to the remote host and appends it to the remote account’s ~/.ssh/authorized_keys file. When trying ssh, if you used no passphrase for your key, the remote shell appears soon after invoking ssh user@host.
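    The overwrite risk mentioned above is easy to guard against: only generate a keypair when none exists yet. A sketch, with a made-up function name and user@host as a placeholder:

    ```shell
    # Generate a keypair only if missing, then install it on the remote host.
    setup_ssh_login() {
        # $1 = user@host
        [ -f ~/.ssh/id_rsa ] || ssh-keygen -t rsa -f ~/.ssh/id_rsa
        ssh-copy-id "$1"
    }
    ```

    After it runs, ssh user@host should log in without a password (assuming an empty passphrase).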

    25) Copy stdin to your X11 buffer

    ssh user@host cat /path/to/some/file | xclip

    Have you ever had to scp a file to your work machine in order to copy its contents to an email? xclip can help you with that. It copies its stdin to the X11 buffer, so all you have to do is middle-click to paste the content of that looong file :)
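    Wrapped as a tiny helper (the function name is made up, user@host and the path are placeholders), the trick above becomes:

    ```shell
    # Cat a remote file straight into the local X11 selection buffer.
    remote_to_clipboard() {
        # $1 = user@host, $2 = remote file path
        ssh "$1" cat "$2" | xclip
    }
    ```

    Usage: remote_to_clipboard user@host /path/to/some/file, then middle-click to paste.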

    Have Fun

    Please comment if you have any other good SSH Commands OR Tricks.
