This article is part of the article series "Awk One-Liners Explained."

I noticed that Eric Wendelin wrote an article "awk is a beautiful tool." In this article he said that it was best to introduce Awk with practical examples. I totally agree with Eric.

When I was learning Awk, I first went through Awk - A Tutorial and Introduction by Bruce Barnett, which was full of examples to try out; then I created an Awk cheat sheet to have the language reference in front of me; and finally I went through the famous Awk one-liners (link to .txt file), which were compiled by Eric Pement.

This is going to be a three-part article in which I will explain every single one-liner in Mr. Pement's compilation. Each part will explain around 20 one-liners. If you follow along closely, the explained examples will turn you into a great Awk programmer.

Eric Pement's Awk one-liner collection consists of five sections: file spacing, numbering and calculations, text conversion and substitution, selective printing of certain lines, and selective deletion of certain lines.

The first part of the article will explain the first two sections: "File spacing" and "Numbering and calculations." The second part will explain "Text conversion and substitution", and the last part "Selective printing/deleting of certain lines."

I recommend that you print out my Awk cheat sheet before you proceed. This way you will have the language reference in front of you, and you will memorize things better.

These one-liners work with all versions of awk, such as nawk (AT&T's new awk), gawk (GNU's awk), mawk (Michael Brennan's awk) and oawk (old awk).

Awesome news: I have written an e-book, "Awk One-Liners Explained," based on this article series. Check out the e-book section at the end of this article.

Let's start!

1. File Spacing

1. Double-space a file.

awk '1; { print "" }'

So how does it work? A one-liner is an Awk program and every Awk program consists of a sequence of pattern-action statements "pattern { action statements }". In this case there are two statements "1" and "{ print "" }". In a pattern-action statement either the pattern or the action may be missing. If the pattern is missing, the action is applied to every single line of input. A missing action is equivalent to '{ print }'. Thus, this one-liner translates to:

awk '1 { print } { print "" }'

An action is applied only if the pattern matches, i.e., pattern is true. Since '1' is always true, this one-liner translates further into two print statements:

awk '{ print } { print "" }'

Every print statement in Awk is silently followed by an ORS - Output Record Separator variable, which is a newline by default. The first print statement with no arguments is equivalent to "print $0", where $0 is a variable holding the entire line. The second print statement prints nothing, but knowing that each print statement is followed by ORS, it actually prints a newline. So there we have it, each line gets double-spaced.
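Here is a quick way to see it in action (the sample input via printf is just an illustration, not part of the one-liner):

printf 'one\ntwo\n' | awk '1; { print "" }'

It outputs "one", a blank line, "two", and another blank line.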

2. Another way to double-space a file.

awk 'BEGIN { ORS="\n\n" }; 1'

BEGIN is a special kind of pattern which is not tested against the input. It is executed before any input is read. This one-liner double-spaces the file by setting the ORS variable to two newlines. As I mentioned previously, statement "1" gets translated to "{ print }", and every print statement gets terminated with the value of ORS variable.
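If you prefer not to use a BEGIN block, the same effect can be achieved by setting ORS from the command line with the -v option (an equivalent sketch; "file" stands for your input file):

awk -v ORS="\n\n" '1' file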

3. Double-space a file so that no more than one blank line appears between lines of text.

awk 'NF { print $0 "\n" }'

The one-liner uses another special variable called NF - Number of Fields. It contains the number of fields the current line was split into. For example, the line "this is a test" splits into four pieces and NF gets set to 4. The empty line "" does not split into any pieces and NF gets set to 0. Using NF as a pattern effectively filters out empty lines. This one-liner says: "If the line has at least one field, print the whole line followed by a newline."
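To see how this differs from one-liner #1, feed it input that already contains several blank lines in a row (the printf input is only an example):

printf 'one\n\n\n\ntwo\n' | awk 'NF { print $0 "\n" }'

No matter how many blank lines separated "one" and "two" in the input, the output contains exactly one blank line between them.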

4. Triple-space a file.

awk '1; { print "\n" }'

This one-liner is very similar to previous ones. '1' gets translated into '{ print }' and the resulting Awk program is:

awk '{ print; print "\n" }'

It prints the line, then prints a newline followed by terminating ORS, which is newline by default.

2. Numbering and Calculations

5. Number lines in each file separately.

awk '{ print FNR "\t" $0 }'

This Awk program prepends the FNR - File Line Number predefined variable and a tab (\t) to each line. The FNR variable contains the current line number for each file separately. For example, if this one-liner were called on two files, one containing 10 lines and the other 12, it would number the lines in the first file from 1 to 10, then resume numbering from one for the second file and number its lines from 1 to 12. FNR gets reset from file to file.
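If you also want to see which file each line came from, the built-in FILENAME variable can be prepended as well (my own variation, not part of the original collection; file1 and file2 are placeholders):

awk '{ print FILENAME ":" FNR "\t" $0 }' file1 file2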

6. Number lines for all files together.

awk '{ print NR "\t" $0 }'

This one works the same as #5 except that it uses NR - Line Number variable, which does not get reset from file to file. It counts the input lines seen so far. For example, if it was called on the same two files with 10 and 12 lines, it would number the lines from 1 to 22 (10 + 12).

7. Number lines in a fancy manner.

awk '{ printf("%5d : %s\n", NR, $0) }'

This one-liner uses the printf() function to number lines in a custom format. It takes a format parameter just like the C printf() function. Note that ORS does not get appended at the end of printf(), so we have to print the newline (\n) character explicitly. This one right-aligns the line number in a five-character-wide field, followed by a space, a colon, and the line.
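The format string can be varied like any C-style printf format. For example, here is a variation (my own illustration) that zero-pads the line numbers instead of right-aligning them:

awk '{ printf("%05d : %s\n", NR, $0) }'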

8. Number only non-blank lines in files.

awk 'NF { $0=++a " :" $0 }; { print }'

Awk variables are dynamic; they come into existence when they are first used. This one-liner pre-increments variable 'a' each time the line is non-empty, then prepends the value of this variable to the beginning of the line and prints it out.
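A more explicit way to write the same thing, as a sketch, is to handle non-empty and empty lines in two separate pattern-action statements:

awk 'NF { print ++a " :" $0 }; !NF { print }'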

9. Count lines in files (emulates wc -l).

awk 'END { print NR }'

END is another special kind of pattern which is not tested against the input. It is executed when all the input has been exhausted. This one-liner outputs the value of NR special variable after all the input has been consumed. NR contains total number of lines seen (= number of lines in the file).

10. Print the sum of fields in every line.

awk '{ s = 0; for (i = 1; i <= NF; i++) s = s+$i; print s }'

Awk has some features of the C language, like the for (;;) { ... } loop. This one-liner loops over all fields in a line (there are NF fields in a line), adding each field to the variable 's'. Then it prints the result and proceeds to the next line.
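For example (the echo input is just an illustration):

echo '1 2 3 4' | awk '{ s = 0; for (i = 1; i <= NF; i++) s = s+$i; print s }'

This prints 10.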

11. Print the sum of fields in all lines.

awk '{ for (i = 1; i <= NF; i++) s = s+$i }; END { print s+0 }'

This one-liner is basically the same as #10, except that it prints the sum of the fields across all lines. Notice how it does not initialize variable 's' to 0. That is not necessary, as variables come into existence dynamically. Also notice how it calls "print s+0" and not just "print s". This is necessary in case the input contains no fields at all. If there are no fields, "s" never comes into existence and is undefined. Printing an undefined value prints nothing (i.e., just the ORS). Adding a 0 performs a mathematical operation, and undef+0 = 0, so it prints "0".
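You can see the difference on empty input (illustration only): the first command prints just a blank line, the second prints 0.

printf '' | awk 'END { print s }'
printf '' | awk 'END { print s+0 }'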

12. Replace every field by its absolute value.

awk '{ for (i = 1; i <= NF; i++) if ($i < 0) $i = -$i; print }'

This one-liner uses two other features of the C language, namely the if (...) { ... } statement and omission of curly braces. It loops over all fields in a line and checks if any of the fields is less than 0. If a field is less than 0, it negates the field to make it positive. Fields can be addressed indirectly by a variable. For example, i = 5; $i = "hello" sets field number 5 to the string "hello".

Here is the same one-liner rewritten with curly braces for clarity. The 'print' statement gets executed after all the fields in the line have been replaced by their absolute values.

awk '{
  for (i = 1; i <= NF; i++) {
    if ($i < 0) {
      $i = -$i;
    }
  }
  print
}'
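Here is a tiny demonstration of the indirect field reference mentioned above (the echo input is only an example):

echo 'a b c d e' | awk '{ i = 5; $i = "hello"; print }'

This prints "a b c d hello".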

13. Count the total number of fields (words) in a file.

awk '{ total = total + NF }; END { print total+0 }'

This one-liner matches all the lines and keeps adding the number of fields in each line. The number of fields seen so far is kept in a variable named 'total'. Once the input has been processed, special pattern 'END { ... }' is executed, which prints the total number of fields. See 11th one-liner for explanation of why we "print total+0" in the END block.

14. Print the total number of lines containing word "Beth".

awk '/Beth/ { n++ }; END { print n+0 }'

This one-liner has two pattern-action statements. The first one is '/Beth/ { n++ }'. A pattern between two slashes is a regular expression. It matches all lines containing pattern "Beth" (not necessarily the word "Beth", it could as well be "Bethe" or "theBeth333"). When a line matches, variable 'n' gets incremented by one. The second pattern-action statement is 'END { print n+0 }'. It is executed when the file has been processed. Note the '+0' in 'print n+0' statement. It forces '0' to be printed in case there were no matches ('n' was undefined). Had we not put '+0' there, an empty line would have been printed.
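If you really need to match the word "Beth" and not just the substring, GNU Awk's word-boundary operators can be used (a gawk-only variation, not part of the original one-liner):

awk '/\<Beth\>/ { n++ }; END { print n+0 }'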

15. Find the line containing the largest (numeric) first field.

awk '$1 > max { max=$1; maxline=$0 }; END { print max, maxline }'

This one-liner keeps track of the largest number in the first field (in variable 'max') and the corresponding line (in variable 'maxline'). Once it has looped over all lines, it prints them out. Warning: this one-liner does not work if all the values are negative.

Here is the fix:

awk 'NR == 1 { max = $1; maxline = $0; next; } $1 > max { max=$1; maxline=$0 }; END { print max, maxline }'

16. Print the number of fields in each line, followed by the line.

awk '{ print NF ":" $0 }'

This one-liner just prints out the predefined variable NF - Number of Fields, which contains the number of fields in the line, followed by a colon and the line itself.

17. Print the last field of each line.

awk '{ print $NF }'

Fields in Awk need not be referenced by constants. For example, code like 'f = 3; print $f' would print out the 3rd field. This one-liner prints the field whose number is the value of NF. $NF is the last field in the line.
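The same idea works for any computed field number. For example, $(NF-1) is the second-to-last field (my own illustration; the NF > 1 guard skips lines that have fewer than two fields):

awk 'NF > 1 { print $(NF-1) }'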

18. Print the last field of the last line.

awk '{ field = $NF }; END { print field }'

This one-liner keeps track of the last field in variable 'field'. Once it has looped over all the lines, variable 'field' contains the last field of the last line, and it just prints it out.

Here is a better version of the same one-liner. It's more common, idiomatic and efficient:

awk 'END { print $NF }'

19. Print every line with more than 4 fields.

awk 'NF > 4'

This one-liner omits the action statement. As I noted in one-liner #1, a missing action statement is equivalent to '{ print }'.

20. Print every line where the value of the last field is greater than 4.

awk '$NF > 4'

This one-liner is similar to #17. It references the last field via the NF variable. If the last field is greater than 4, the line gets printed.

Awk one-liners explained e-book

I have written my first e-book called "Awk One-Liners Explained". I improved the explanations of the one-liners in this article series, added new one-liners and added three new chapters - introduction to awk one-liners, summary of awk special variables and idiomatic awk. Please take a look:

Have fun!

That's it for Part I of the article. The second part will be on "Text conversion and substitution."

Have fun learning Awk! It's a fun language to know. :)

Comments

miguel Permalink
September 28, 2008, 11:11

Nice post. In the comment for item 17, shouldn't it say 'f = 3; print $f' instead of 'f = 3; print $3' ?

Johnny White Permalink
September 28, 2008, 13:42

Amazing! Scientists just never cease to amaze me!

September 28, 2008, 18:08

Hehe,

Awesome, if the list was a bit longer it would be a must have for me. Can you extend it a bit? :)

Cheers,

September 28, 2008, 18:27

Interesting list.

Here's a useful awk one-liner to kill a hanging Firefox process, including all its parents that are part of Firefox (but not the shell or any "higher" ancestors). This is a real-life example which I wrote and used recently:

UNIX one-liner to kill a hanging Firefox process

- Vasudev

Go_Obama Permalink
September 28, 2008, 21:16

Thanks! Many of these are exactly what I have been looking for.

Question re #6
So say I want a new file from the concatenation of
foo.log0,
foo.log1,
foo.log2 (increasing date)

and number all lines of all the log files together.

Is this the right command?

awk '{ print NR "\t" $0 }' foo.log* > foo.txt

or should I do:

awk '{ print NR "\t" $0 }' foo.log0 foo.log1 foo.log2 > foo.txt

thanks

goofy_barney Permalink
September 29, 2008, 02:21

#10 correction.

replace

s=s+$i

with

s=s+i

ditto #11

September 29, 2008, 02:46

Awesome list Peteris! These examples are great!

September 29, 2008, 02:57

Miguel, thanks, I fixed that mistake.

Eduard, this is a 3 part article. This is just part one. Parts two and three coming soon.

Goofy_barny, no, it's "s=s+$i". It sums the values in each field.

Go_Obama, I also support Obama :) As for your question, use awk '{ print NR "\t" $0 }' foo.log0 foo.log1 foo.log2 > foo.txt. The other example would put foo.log11 before foo.log2.

October 02, 2008, 17:40

Nice explanation of all the awk one liners, thanks peter.

//Jadu, unstableme.blogspot.com

Sydney Permalink
October 07, 2008, 05:55

Hello, I have been playing with AWK a little and have this line.

df -h | grep / | grep % | awk '$(NF-1) >= "20%" {print $NF,": ", $(NF-1)}'

Everything works great except I have a volume with 3% usage. AWK in my usage of it only appears to evaluate the first number of the “20%”.

I have looked around the net some for a resolution to this and read some documentation. I know that it is something simple I am overlooking, but you folks look like you know what you are doing and are active so I ask; What usage of AWK should I implement to make 1 !> 10.

Thanks in advance.
–Sydney

October 07, 2008, 06:20

Sydney, thanks for your question.

I see a couple of mistakes here. First of all you are using grep twice! Awk can do what grep does itself with the /.../ regex pattern matching.

Here is what I came up with (works in GNU Awk only!):

df -h | awk '/dev/ { if (strtonum($5) >= 20) { print $NF ": " $(NF-1) } }'

And this one works in all Awk's:

df -h | awk '/dev/ {
 if (match($5, /^[0-9]+/)) {
  usage = substr($5, RSTART, RLENGTH)
  if (usage >= 20) {
   print $NF ": " $(NF-1)
  }
 }
}'

Hope it helps.

Sydney Permalink
October 07, 2008, 07:23

Thanks so much Peteris,

I am only ever using GNU AWK on Redhat or Ubuntu maybe Debian rarely. So I mutated the first into:

df -h | awk '/\// { if (strtonum($(NF-1)) >= 20) { print $NF ": " $(NF-1) } }'

I am guessing that strtonum function that you suggested takes it out of the string world. I was trying to compare a string to a number.

The RegEx /\// found all mounts.

df output is not constant in Red Hat with long mount names

/shrug

Who knew? So I changed the code to work backwards from the last field.

Thanks a bunch this was an interesting experiment for me that I may turn around and tweak for production.

seven Permalink
August 04, 2010, 06:14

Hi,

Please let me know the command to list files having a specific number of rows. I want to list all the files having only 3 rows.

Regards,

October 07, 2008, 07:31

You're welcome, Sydney.

October 09, 2008, 16:35

Nice clearly written examples. I've bookmarked it for future reference :)

Thanks!

November 08, 2008, 17:45

Great info, I am familiar with Awk but have only used it for very simple field delimiting. I can't wait to read the rest of this series and also your series on Sed.

WangPeng Permalink
November 10, 2008, 15:02

Thank you!
but where is the second part? I can't find it.

Augusto Permalink
February 21, 2009, 01:11

Hi. I've been trying to understand example 8

awk 'NF { $0=++a " :" $0 }; { print }'

but I can't, can anybody explain a little bit more?

Peter, amazing site, great articles. Congrats

February 21, 2009, 01:28

Hey Augusto. Now that I look at one-liner #8, it looks pretty ridiculous. But I didn't write it.

If I wrote it, here is how it would look:

awk '{ print ++a, $0 }'

At every line increment variable a, and output it together with the line itself.

The explanation of the original one liner is this:

Every line gets read into variable $0. The one-liner modifies this $0. It prepends the contents of variable 'a' to the beginning of $0. But before 'a' is prepended, it gets incremented by one by the ++ unary operator.

kyaw Permalink
April 12, 2011, 06:10

Hi
If the code is awk '{ print ++a, $0 }', the line numbering includes blank lines. But with awk 'NF {$0=++a ": " $0}; {print}', a line number is added to non-blank lines.

NF {$0=++a ": " $0} means there must be at least one field for the number in variable "a" to be incremented; otherwise no action is taken for the blank line. At the same time $0 (the whole record) becomes [a ": " original line], which is then printed by the print statement after the ;.

kyaw Permalink
April 12, 2011, 06:12

Correction:
If the code is awk '{ print ++a, $0 }', the line numbering includes blank lines. But with awk 'NF {$0=++a ": " $0}; {print}', a line number is added only to non-blank lines. Blank lines will not get a line number.

NF {$0=++a ": " $0} means there must be at least one field for the number in variable "a" to be incremented; otherwise no action is taken for the blank line. At the same time $0 (the whole record) becomes [a ": " original line], which is then printed by the print statement after the ;.

Paul Permalink
July 29, 2009, 15:31

For Sydney 10/8/2008, who writes:
AWK in my usage of it only appears to evaluate the first number of the “20%”. so I ask; What usage of AWK should I implement to make 1 compare lower than 10.

Unix awk does not have strtonum but you don't need it, nor substr. The problem is that variables are treated both as string and numeric by the comparison operators. Trick is to typecast to numeric using (0 + expression), or to string using ("" expression), to force the right comparison. For a variable $3 like 17%, use x = 0 + $3 to assign the 17 part to x.

knud Permalink
March 25, 2010, 02:22

Your explanation to #17 is needed in #10-#12. I couldn't figure out how the $i value was assigned.

Jimmy Permalink
August 18, 2010, 01:34

Hi, wondering if I could have some help - sorry for the double post.

I have a column of numbers sorted in ascending order. I am trying to remove the last 5% of the records then count avg sum etc

The total records are : 99183
I only want to sum the first : 94222 (discarding the outliers 5%)

Where the value 94222 appears in the command line is where I want to use the variable NFIVE -

but if I put the variable in, awk counts all the records and sums no records.

Desired output:

cat <file> |tr -s '=' ' '|sort -k5n | awk '{NFIVE=NR*.95}; {if (NR<94222) TOTAL+=$5} END{printf("COUNT:%d, TOTAL:%d,MEAN:%d\n",NFIVE,TOTAL,TOTAL/NFIVE)}'

OUTPUT: (and correct values)
COUNT:94222, TOTAL:19079403, MEAN:202

Incorrect values I get if using NFIVE
EG:

cat <FILE> |tr -s '=' ' '|sort -k5n | awk '{NFIVE=NR*.95}; {if (NR<NFIVE) TOTAL+=$5} END{printf("COUNT:%d, TOTAL:%d, MEAN:%d\n",NFIVE,TOTAL,TOTAL/NFIVE)}'

COUNT:94222, TOTAL:0, MEAN:0

Thanks for any assistance

January 07, 2011, 18:51

How do you print in awk w/out new line char?
e.g. : chkconfig --list | awk '{print $1}'
service1
service2
service3

INSTEAD I want:
service1, service2, service3

Kyaw Permalink
March 14, 2011, 03:55

chkconfig --list | awk 'BEGIN { ORS=" ,"}; {print $1}'

Surya Permalink
January 26, 2012, 17:38

Awesome series!
I recently started playing with sed and awk and this series helped me a lot. Thanks a bunch Peteris :)

pandeeswaran Permalink
February 05, 2012, 15:57

A simple awk trick for triple-spacing a file:

awk '{ORS="\n\n\n"}1'

zzo38 Permalink
March 10, 2012, 00:19

Here is a code to unlit bird-style programs:

awk 'sub(/^>/," ")||($0=" ")'

daniel Permalink
March 14, 2012, 05:04

Great articles!
However I tried this in my workspace:

find . -name *.java|xargs awk 'END {print NR}'

hoping to get the total line number of all java files...

The output I got was:

334810
290871
272952
243138
247081

I don't know why it got multiple numbers

and when I tried wc -l, the total line number was:246911...
any idea?

Gaurav Permalink
May 26, 2012, 07:23

Hi daniel, use this, it will give you the correct count:

find . -name *.java| awk 'END {print NR}'

when you are using xargs, it is taking the count of all the content present inside a particular file; that's why it is giving the count of the content present in the files

April 09, 2012, 21:37

Hah! Nice stuff, this awk. Thx to them examples, here's what i came up with to monitor what frequency my quad-core CPU currently runs at: cpu-freq-average.sh (lol)
> #!/bin/bash
> awk '{sum+=$1}; END {printf "∅%.2fGHz\n", sum/NR/10**6}' \
> <(cat /sys/devices/system/cpu/cpu*/cpufreq/cpuinfo_cur_freq)
integrates nicely into a tmux status bar - and works for any number of CPUs :D

April 09, 2012, 21:41

oha, code block0rz^^ sorry

#!/bin/bash
awk '{sum+=$1}; END {printf "∅%.2fGHz\n", sum/NR/10**6}' \
 <(cat /sys/devices/system/cpu/cpu*/cpufreq/cpuinfo_cur_freq)

October 11, 2012, 21:47

Hi,

Thought you might be interested in seeing my expanded version of your 'greater than' example #20 which I'm using for practical work purposes:

Print every record from an invoice where the value in field 7 (date pre-formatted by sed as YYYYmmdd) falls on or after April 1, 2011 and is before October 1, 2011.

awk '$7 >= 20110401 && $7 < 20111001'

Cool, I did not realize you could do this with awk! Your example has saved me looping through thousands of sales records and comparing date stamps against the range I needed!

John

Jason Permalink
January 25, 2013, 22:41

Hi guys, can someone help me?
I have to write an awk script that counts the number of single a's in a file, and counts the number of lines that contain that a...

I wrote only this, which gives me the total number of a's, but I don't know how to find the total number of lines containing this character:
awk '{for(i=0;i<NF;i++) if ($i=="a") n++} {print "Tot # of a is "n}' filename

John Eric Permalink
May 09, 2013, 19:41

I'm looking for a way to output FIND in such a way to print a file with fields: "date time size <tab> file_path". I'm new to linux and my 6TB RAID is still in Windows NTFS. Filenames have spaces and classical music files can have very long names.

This is what I have so far, but awk prints the output of find on 2 lines, instead of 1.

find /mnt/Drive-D/ -type f -exec ls -ld --time-style=long-iso {} \; | awk '{print $6,$7,$5,"\t",$1=$2=$3=$4=$5=$6=$7=""; print $0}'
