awk introduction – CLI text processing with GNU awk


2023-08-28 01:02:36

This chapter gives an overview of awk syntax and a few examples to show what kind of problems you can solve using awk. These features will be covered in depth in later chapters, but you shouldn't skip this chapter.

awk provides filtering capabilities like those supported by the grep and sed commands. As a programming language, it has many more nifty features as well. Like many command line utilities, awk can accept input from both stdin and files.

# sample stdin data
$ printf 'gate\napple\nwhat\nkite\n'
gate
apple
what
kite

# same as: grep 'at' and sed -n '/at/p'
# filter lines containing 'at'
$ printf 'gate\napple\nwhat\nkite\n' | awk '/at/'
gate
what

# same as: grep -v 'e' and sed -n '/e/!p'
# filter lines NOT containing 'e'
$ printf 'gate\napple\nwhat\nkite\n' | awk '!/e/'
what

By default, awk automatically loops over the input content line by line. You can then use programming instructions to process those lines. As awk is mostly used from the command line, many shortcuts are available to reduce the amount of typing needed.

In the above examples, a regular expression (defined by the pattern between a pair of forward slashes) has been used to filter the input. Regular expressions (regexp) will be covered in detail in the next chapter. Only string values without any special regexp characters are used in this chapter. The full syntax is string ~ /regexp/ to check if the given string matches the regexp and string !~ /regexp/ to check if it doesn't match. When the string isn't specified, the test is performed against the special variable $0, which has the contents of the input line. The correct term would be input record, but that's a discussion for a later chapter.

Also, in the above examples, only the filtering condition was given. By default, when the condition evaluates to true, the contents of $0 are printed. Thus:

  • awk '/regexp/' is a shortcut for awk '$0 ~ /regexp/{print $0}'
  • awk '!/regexp/' is a shortcut for awk '$0 !~ /regexp/{print $0}'
# same as: awk '/at/'
$ printf 'gate\napple\nwhat\nkite\n' | awk '$0 ~ /at/{print $0}'
gate
what

# same as: awk '!/e/'
$ printf 'gate\napple\nwhat\nkite\n' | awk '$0 !~ /e/{print $0}'
what

In the above examples, {} is used to specify a block of code to be executed when the condition that precedes the block evaluates to true. Multiple statements can be given, separated by the ; character. You'll see such examples and learn more about awk syntax later.
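As a rough illustration (reusing the same sample data as above, not an example from the original text), here's a block with two statements separated by ;:

# print a marker line followed by the matching line itself
$ printf 'gate\napple\nwhat\nkite\n' | awk '/at/{print "found:"; print $0}'
found:
gate
found:
what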

In a conditional expression, non-zero numeric values and non-empty string values are evaluated as true. Idiomatically, 1 is used to denote a true condition in one-liners as a shortcut to print the contents of $0.

# same as: printf 'gate\napple\nwhat\nkite\n' | cat
# same as: awk '{print $0}'
$ printf 'gate\napple\nwhat\nkite\n' | awk '1'
gate
apple
what
kite

awk has three functions to cover search and replace requirements. Two of them are shown below. The sub function replaces only the first match, while the gsub function replaces all matching occurrences. By default, these functions operate on $0 when the input string isn't provided. Both sub and gsub modify the input source on successful substitution.

# for each input line, replace only the first ':' with '-'
# same as: sed 's/:/-/'
$ printf '1:2:3:4\na:b:c:d\n' | awk '{sub(/:/, "-")} 1'
1-2:3:4
a-b:c:d

# for each input line, replace all ':' with '-'
# same as: sed 's/:/-/g'
$ printf '1:2:3:4\na:b:c:d\n' | awk '{gsub(/:/, "-")} 1'
1-2-3-4
a-b-c-d

The first argument to the sub and gsub functions is the regexp to be matched against the input content. The second argument is the replacement string. String literals are specified within double quotes. In the above examples, sub and gsub are used inside a block since they aren't meant to be used as a conditional expression. The 1 after the block is treated as a conditional expression as it is used outside a block. You can also use the variations presented below to get the same results:

  • awk '{sub(/:/, "-")} 1' is the same as awk '{sub(/:/, "-"); print $0}', as demonstrated below
  • You can also just use print instead of print $0 since $0 is the default string
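Here's a quick run of the explicit print variation (a sketch based on the earlier input, not shown in the original):

# same as: awk '{sub(/:/, "-")} 1'
$ printf '1:2:3:4\na:b:c:d\n' | awk '{sub(/:/, "-"); print $0}'
1-2:3:4
a-b:c:d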

info You might wonder why to use or learn grep and sed when you can achieve the same results with awk. It depends on the problem you are trying to solve. Simple line filtering will be faster with grep compared to sed or awk because grep is optimized for such cases. Similarly, sed will be faster than awk for substitution cases. Also, not all features easily translate among these tools. For example, grep -o requires a lot more steps to code with sed or awk. Only grep offers recursive search. And so on. See also unix.stackexchange: When to use grep, sed, awk, perl, etc.

As mentioned before, awk is primarily used for field based processing. Consider the sample input file shown below with fields separated by a single space character.

info The example_files directory has all the files used in the examples.

$ cat table.txt
brown bread mat hair 42
blue cake mug shirt -7
yellow banana window shoes 3.14

Here are some examples that are based on a specific field rather than the entire line. By default, awk splits the input line based on spaces and the field contents can be accessed using $N where N is the field number required. A special variable NF is updated with the total number of fields for each input line. There are many more details and nuances to cover regarding the default field splitting, but for now this is enough to proceed.

# print the second field of each input line
$ awk '{print $2}' table.txt
bread
cake
banana

# print lines only if the last field is a negative number
# recall that the default action is to print the contents of $0
$ awk '$NF<0' table.txt
blue cake mug shirt -7

# replace 'b' with 'B' only for the first field
$ awk '{gsub(/b/, "B", $1)} 1' table.txt
Brown bread mat hair 42
Blue cake mug shirt -7
yellow banana window shoes 3.14
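If you'd like to see the NF variable in action before moving on, here's a small sketch (not part of the original examples) using the same file:

# print the number of fields and the last field for each line
$ awk '{print NF ": " $NF}' table.txt
5: 42
5: -7
5: 3.14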

The examples in the previous sections have used a few different ways to construct a typical awk one-liner. If you haven't yet grasped the syntax, this generic structure might help:

awk 'cond1{action1} cond2{action2} ... condN{actionN}'

When a condition isn't provided, the action is always executed. Within a block, you can provide multiple statements separated by the semicolon character. If an action isn't provided, then by default, the contents of the $0 variable are printed if the condition evaluates to true. When the action isn't present, you can use a semicolon to terminate a condition and start another condX{actionX} snippet.
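For instance, here's a sketch (not from the original) with two condition-action pairs operating on the table.txt file seen earlier:

# print the first field of lines containing 'at'
# and the last field of lines containing 'e'
$ awk '/at/{print $1} /e/{print $NF}' table.txt
brown
42
-7
3.14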

Note that multiple blocks are just syntactical sugar. They help to avoid the explicit use of the if control structure for most one-liners. The below snippet shows the same code with and without the if structure.

$ awk '{
         if($NF < 0) print $0
       }' table.txt
blue cake mug shirt -7

# same as: awk '$NF<0' table.txt

You can use a BEGIN{} block when you need to execute something before the input is read and an END{} block to execute something after all of the input has been processed.

$ seq 2 | awk 'BEGIN{print "---"} 1; END{print "%%%"}'
---
1
2
%%%

There are some more types of blocks that can be used; you'll see them in coming chapters. See gawk manual: Operators for details about operators and gawk manual: Truth Values and Conditions for conditional expressions.

Some examples so far have already used string and numeric literals. As mentioned earlier, awk tries to provide a concise way to construct a solution from the command line. The data type of a value is determined based on the syntax used. String literals are represented within double quotes. Numbers can be integers or floating-point. Scientific notation is allowed as well. See gawk manual: Constant Expressions for more details.

# BEGIN{} is also useful to write an awk program without any external input
$ awk 'BEGIN{print "hello"}'
hello

$ awk 'BEGIN{print 42}'
42
$ awk 'BEGIN{print 3.14}'
3.14
$ awk 'BEGIN{print 34.23e4}'
342300

You can also save these literals in variables for later use. Some variables are predefined, NF for example.

$ awk 'BEGIN{a=5; b=2.5; print a+b}'
7.5

# strings placed next to each other are concatenated
$ awk 'BEGIN{s1="con"; s2="cat"; print s1 s2}'
concat

If an uninitialized variable is used, it will act as an empty string in string context and 0 in numeric context. You can force a string to behave as a number by simply using it in an expression with numeric values. You can also use unary + or - operators. If the string doesn't start with a valid number (ignoring any leading whitespace), it will be treated as 0. Similarly, concatenating a string to a number will automatically change the number to a string. See gawk manual: How awk Converts Between Strings and Numbers for more details.

# same as: awk 'BEGIN{sum=0} {sum += $NF} END{print sum}'
$ awk '{sum += $NF} END{print sum}' table.txt
38.14

$ awk 'BEGIN{n1="5.0"; n2=5; if(n1==n2) print "equal"}'
$ awk 'BEGIN{n1="5.0"; n2=5; if(+n1==n2) print "equal"}'
equal
$ awk 'BEGIN{n1="5.0"; n2=5; if(n1==n2".0") print "equal"}'
equal

$ awk 'BEGIN{print 5 + "abc 2 xyz"}'
5
$ awk 'BEGIN{print 5 + " t 2 xyz"}'
7
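To observe the uninitialized variable behavior mentioned above directly, here's a tiny sketch (not in the original):

# 'x' is uninitialized: 0 in numeric context, empty string in string context
$ awk 'BEGIN{print x + 1; print "[" x "]"}'
1
[]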

Arrays in awk are associative, meaning they are key-value pairs. The keys can be numbers or strings, but numbers get converted to strings internally. They can be multi-dimensional as well. There will be plenty of array examples in later chapters in relevant context. See gawk manual: Arrays for full details and gotchas.

# assigning an array and accessing an element based on string keys
$ awk 'BEGIN{student["id"] = 101; student["name"] = "Joe";
       print student["name"]}'
Joe

# checking if a key exists
$ awk 'BEGIN{student["id"] = 101; student["name"] = "Joe";
       if("id" in student) print "Key found"}'
Key found
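The earlier note that numeric keys are converted to strings internally can be observed with a sketch like this (an assumed illustration, not from the original):

# the numeric key 1 is stored as the string "1"
$ awk 'BEGIN{a[1] = "apple"; if("1" in a) print "string key matched"}'
string key matched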

In my early days of getting used to the Linux command line, I was intimidated by sed and awk examples and didn't even try to learn them. Hopefully, this gentler introduction works for you and the various syntactical magic has been explained adequately. Try to experiment with the given examples, for example change the field numbers to something other than the number used. Be curious, like what happens if a field number is negative or a floating-point number. Read the manual. Practice a lot. And so on.

The next chapter is dedicated solely to regular expressions. The features introduced in this chapter will be used in the examples, so make sure you are comfortable with awk syntax before proceeding. Solving the exercises that follow will help test your understanding.

I wrote a TUI app to help you solve some of the exercises from this book interactively. See the AwkExercises repo for installation steps and app_guide.md for instructions on using the app.


Here's a sample screenshot:

AwkExercises example

info All the exercises are also collated together in one place at Exercises.md. For solutions, see Exercise_solutions.md.

info The exercises directory has all the files used in this section.

1) For the input file addr.txt, display all lines containing is.

$ cat addr.txt
Hello World
How are you
This game is good
Today is sunny
12345
You are funny

$ awk ##### add your solution here
This game is good
Today is sunny

2) For the input file addr.txt, display the first field of lines not containing y. Consider space as the field separator for this file.

$ awk ##### add your solution here
Hello
This
12345

3) For the input file addr.txt, display all lines containing no more than 2 fields.

$ awk ##### add your solution here
Hello World
12345

4) For the input file addr.txt, display all lines containing is in the second field.

$ awk ##### add your solution here
Today is sunny

5) For each line of the input file addr.txt, replace the first occurrence of o with 0.

$ awk ##### add your solution here
Hell0 World
H0w are you
This game is g0od
T0day is sunny
12345
Y0u are funny

6) For the input file table.txt, calculate and display the product of numbers in the last field of each line. Consider space as the field separator for this file.

$ cat table.txt
brown bread mat hair 42
blue cake mug shirt -7
yellow banana window shoes 3.14

$ awk ##### add your solution here
-923.16

7) Append . to all the input lines for the given stdin data.

$ printf 'last\nappend\nstop\ntail\n' | awk ##### add your solution here
last.
append.
stop.
tail.

8) Replace all occurrences of 0xA0 with 0x50 and 0xFF with 0x7F for the given input file.

$ cat hex.txt
start address: 0xA0, func1 address: 0xA0
end address: 0xFF, func2 address: 0xB0

$ awk ##### add your solution here
start address: 0x50, func1 address: 0x50
end address: 0x7F, func2 address: 0xB0
