Much of a System Administrator's work involves dealing with text. The Unix philosophy summarizes this well: write programs that handle text streams, because that is a universal interface. Linux is inspired by the Unix operating system and firmly adopts its philosophy, so a System Administrator can expect a Linux distribution to ship many text manipulation tools for gathering data chunks or file statistics for troubleshooting or analysis purposes. This guide is a continuation of our LPIC-1 series.
In this guide, we are going to look at:
- How to process text using streams, redirection and pipes
- How to process text using filters
- How to process text using regular expressions
Processing Text Using Streams, Redirection and Pipes
Redirection
Text manipulation programs read and write text through three standard streams:
- Standard input (stdin)
- Standard output (stdout)
- Standard error (stderr)
Redirection operators
>:
Redirect STDOUT to specified file. If file exists, overwrite it. If it does not exist, create it.
>>:
Redirect STDOUT to specified file. If file exists, append to it. If it does not exist, create it.
2>:
Redirect STDERR to specified file. If file exists, overwrite it. If it does not exist, create it.
2>>:
Redirect STDERR to specified file. If file exists, append to it. If it does not exist, create it.
&>:
Redirect STDOUT and STDERR to specified file. If file exists, overwrite it. If it does not exist, create it.
&>>:
Redirect STDOUT and STDERR to specified file. If file exists, append to it. If it does not exist, create it.
<:
Redirect STDIN from specified file into command.
<>:
Open the specified file on STDIN for both reading and writing.
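As a quick sketch of the combined operators, the following captures both streams of a command in one file (note that &> and &>> are Bash extensions rather than POSIX syntax; the file name all.log is illustrative):

```shell
# One command writes to STDOUT, the other fails and writes to STDERR;
# &> captures both streams in all.log (overwriting), &>> would append.
{ echo "normal output"; cat /no/such/file; } &> all.log
cat all.log   # shows both the echoed line and the cat error message
```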
Redirecting Standard Output
It is important to know that Linux treats every object as a file. This includes the output process, such as displaying a text file on the screen. File objects are identified using a file descriptor, an integer that classifies a process's open files. Standard output (STDOUT) has a file descriptor of 1.
The echo command displays text to STDOUT on the terminal screen:
$ echo "The Linux philosophy is 'Laugh in the face of danger'. Oops. Wrong One. 'Do it yourself'. Yes, that's it."
The Linux philosophy is 'Laugh in the face of danger'. Oops. Wrong One. 'Do it yourself'. Yes, that's it.
We can redirect the output above to a file using the redirection operator (>) on the command line:
echo "The Linux philosophy is 'Laugh in the face of danger'. Oops. Wrong One. 'Do it yourself'. Yes, that's it." > LinuxLove.txt
Now, let's check what is in the LinuxLove.txt file using the cat command:
$ cat LinuxLove.txt
The Linux philosophy is 'Laugh in the face of danger'. Oops. Wrong One. 'Do it yourself'. Yes, that's it.
We find that the LinuxLove.txt file contains the output that was sent via the > operator.
NOTE: If you use the > redirection operator and send the output to a file that already exists, that file's current data will be deleted. Use caution when employing this operator.
Use the >> operator to append data to a preexisting file:
echo "And also Linux is only free if your time has no value." >> LinuxLove.txt
Employ the cat command to check the LinuxLove.txt file:
$ cat LinuxLove.txt
The Linux philosophy is 'Laugh in the face of danger'. Oops. Wrong One. 'Do it yourself'. Yes, that's it.
And also Linux is only free if your time has no value.
We find that the text has been appended to the LinuxLove.txt file.
Redirecting Standard Error
Standard error is abbreviated STDERR. It has a file descriptor of 2. The basic redirection operator to send STDERR to a file is the 2> operator. To append to a file, use the 2>> operator.
Let’s run the following command and check the output;
$ grep -d skip hosts: /etc/*
grep: /etc/brlapi.key: Permission denied
grep: /etc/gshadow: Permission denied
grep: /etc/gshadow-: Permission denied
/etc/nsswitch.conf:hosts: files mdns4_minimal [NOTFOUND=return] dns mymachines
grep: /etc/shadow: Permission denied
grep: /etc/shadow-: Permission denied
grep: /etc/sudoers: Permission denied
Since the user does not have superuser privileges, we get a lot of "Permission denied" error messages when we run the command.
Now, let’s redirect the errors to a file called Errors.txt;
$ grep -d skip hosts: /etc/* 2> Errors.txt
/etc/nsswitch.conf:hosts: files mdns4_minimal [NOTFOUND=return] dns mymachines
In the above output, with the errors redirected away, it's much easier to read the matching hosts: entry.
We can review the error messages using the cat command:
$ cat Errors.txt
grep: /etc/brlapi.key: Permission denied
grep: /etc/gshadow: Permission denied
grep: /etc/gshadow-: Permission denied
grep: /etc/shadow: Permission denied
grep: /etc/shadow-: Permission denied
grep: /etc/sudoers: Permission denied
If you don't want to retain the error messages, you can redirect STDERR to the /dev/null (black hole) file:
$ grep -d skip hosts: /etc/* 2> /dev/null
/etc/nsswitch.conf:hosts: files mdns4_minimal [NOTFOUND=return] dns mymachines
Redirecting Standard Input
Standard input (STDIN) has a file descriptor of 0. STDIN comes into the Linux system through the keyboard or other input devices. The basic redirection operator is the < symbol.
Let's use the cat command to take input from the LinuxLove.txt file:
$ cat < LinuxLove.txt
The Linux philosophy is 'Laugh in the face of danger'. Oops. Wrong One. 'Do it yourself'. Yes, that's it.
And also Linux is only free if your time has no value.
The cat command receives input from the LinuxLove.txt file; therefore, we see the contents of the file.
Pipes
The pipe is a simple redirection operator represented by the ASCII character 124 (|), which is called the vertical bar, vertical slash, or vertical line. Unlike file redirects, with pipes data flows from left to right on the command line and the target is another process. You can redirect STDOUT, STDIN, and STDERR between multiple commands all on one command line.
Using Pipes
Syntax:
COMMAND1 | COMMAND2 [| COMMANDN]…
According to the syntax, the first command, COMMAND1, is executed and its STDOUT is redirected as STDIN into the second command, COMMAND2.
$ cat /etc/passwd | grep frank
frank:x:1000:1000:frank_bett,,,:/home/frank:/usr/bin/zsh
In the above output, the contents of the /etc/passwd file produced by cat /etc/passwd were piped to the command grep frank, which then selects only the lines containing the term frank.
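Per the syntax above, more than two commands can be chained. A minimal sketch of a three-stage pipeline (the sample lines are illustrative):

```shell
# Generate three lines, keep only those matching "alpha", then count
# the matches; data flows left to right through each | stage.
printf 'alpha\nbeta\nalpha\n' | grep 'alpha' | wc -l
```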
Streams
The stream editor, sed, modifies text that is passed to it via a file or output from a pipeline. The stream editor can be used to find matching occurrences of strings using regular expressions, as well as to edit files using predefined patterns.
Using Sed
Syntax:
sed [OPTIONS] [SCRIPT]… [FILENAME]
The sed utility can be used to modify STDIN text:
$ echo "I love Linux." | sed 's/Linux/Ubuntu/'
I love Ubuntu.
The sed utility's s (substitute) command specifies that if the first text string, Linux, is found, it is changed to Ubuntu in the output.
The sed utility can also be used to delete text from a file's output.
Let's check the contents of the LinuxLove.txt file using the cat command:
$ cat LinuxLove.txt
The Linux philosophy is 'Laugh in the face of danger'. Oops. Wrong One. 'Do it yourself'. Yes, that's it.
And also Linux is only free if your time has no value.
Now, let's use sed to delete some text in the above file.
$ sed '/value/d' LinuxLove.txt
The Linux philosophy is 'Laugh in the face of danger'. Oops. Wrong One. 'Do it yourself'. Yes, that's it.
We notice that the LinuxLove.txt line that contains the word value is not displayed to STDOUT. It was "deleted" in the output, but it still exists within the text file.
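If you do want to change the file itself, GNU sed offers the -i (in-place) option. A small sketch using an illustrative demo.txt file:

```shell
# Create a two-line file, then delete the matching line in place;
# -i rewrites the file rather than printing the result to STDOUT.
printf 'keep this line\ndrop this value\n' > demo.txt
sed -i '/value/d' demo.txt
cat demo.txt   # only "keep this line" remains in the file itself
```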
Processing Text Using Filters
At times, you need to employ tools that allow you to gather data chunks or file statistics for troubleshooting or analysis purposes.
Text-filtering commands are classified into five categories:
- File-Combining Commands
- File-Transforming Commands
- File-Formatting Commands
- File-Viewing Commands
- File-Summarizing Commands
File-Combining Commands
Viewing, comparing, and putting together short text files on your screen is very important. The file-combining commands accomplish these tasks as follows.
cat Command
Short for concatenate, the cat command is used to combine or read (view) plain text files.
Syntax:
cat [OPTION]… [FILE]…
Using cat to view a file:
$ cat LinuxLove.txt
The Linux philosophy is 'Laugh in the face of danger'. Oops. Wrong One. 'Do it yourself'. Yes, that's it.
And also Linux is only free if your time has no value.
Example 2:
$ cat Errors.txt
grep: /etc/brlapi.key: Permission denied
grep: /etc/gshadow: Permission denied
grep: /etc/gshadow-: Permission denied
grep: /etc/shadow: Permission denied
grep: /etc/shadow-: Permission denied
grep: /etc/sudoers: Permission denied
Using the cat command to concatenate files:
$ cat LinuxLove.txt Errors.txt
The Linux philosophy is 'Laugh in the face of danger'. Oops. Wrong One. 'Do it yourself'. Yes, that's it.
And also Linux is only free if your time has no value.
I love Linux.
grep: /etc/brlapi.key: Permission denied
grep: /etc/gshadow: Permission denied
grep: /etc/gshadow-: Permission denied
grep: /etc/shadow: Permission denied
grep: /etc/shadow-: Permission denied
grep: /etc/sudoers: Permission denied
In the above output, we have combined LinuxLove.txt with Errors.txt.
bzcat Command
Used for processing or reading files compressed using the bzip2 method.
Let's compress the LinuxLove.txt file using the bzip2 command:
bzip2 -z LinuxLove.txt
Using the ls -lh command, check our compressed file:
$ ls -lh
total 8.0K
-rw-rw-r-- 1 frank frank 231 Apr 8 12:31 Errors.txt
-rw-rw-r-- 1 frank frank 165 Apr 8 12:29 LinuxLove.txt.bz2
Now, we can read the LinuxLove.txt.bz2 file using the bzcat command:
$ bzcat LinuxLove.txt.bz2
The Linux philosophy is 'Laugh in the face of danger'. Oops. Wrong One. 'Do it yourself'. Yes, that's it.
And also Linux is only free if your time has no value.
I love Linux.
zcat Command
Used for processing or reading files compressed using the gzip method.
Let's compress the LinuxLove.txt file using the gzip command:
gzip LinuxLove.txt
Using the ls -lh command, check our compressed file:
$ ls -lh
total 8.0K
-rw-rw-r-- 1 frank frank 231 Apr 8 12:31 Errors.txt
-rw-rw-r-- 1 frank frank 166 Apr 8 12:29 LinuxLove.txt.gz
Now, we can read the LinuxLove.txt.gz file using the zcat command:
$ zcat LinuxLove.txt.gz
The Linux philosophy is 'Laugh in the face of danger'. Oops. Wrong One. 'Do it yourself'. Yes, that's it.
And also Linux is only free if your time has no value.
I love Linux.
xzcat Command
Used for processing or reading files compressed using the xz method.
Let's compress the LinuxLove.txt file using the xz command:
xz LinuxLove.txt
Using the ls -lh command, check our compressed file:
$ ls -lh
total 8.0K
-rw-rw-r-- 1 frank frank 231 Apr 8 12:31 Errors.txt
-rw-rw-r-- 1 frank frank 216 Apr 8 12:29 LinuxLove.txt.xz
Now, we can read the LinuxLove.txt.xz file using the xzcat command:
$ xzcat LinuxLove.txt.xz
The Linux philosophy is 'Laugh in the face of danger'. Oops. Wrong One. 'Do it yourself'. Yes, that's it.
And also Linux is only free if your time has no value.
I love Linux.
paste Command
Used to join or display two files side by side. The paste command glues the files together, but the result will not necessarily be pretty.
Using the cat command, let's check the contents of these files:
Example 1: Debian.txt
$ cat Debian.txt
Ubuntu
Debian
Example 2: Redhat.txt
$ cat Redhat.txt
Centos
Fedora
Now let's use the paste command to join the above files:
$ paste Debian.txt Redhat.txt
Ubuntu Centos
Debian Fedora
File-Transforming Commands
Looking at a file’s data in different ways is helpful not only in troubleshooting but in testing as well. We’ll take a look at a few helpful file-transforming commands in this section.
od Command
The octal dump (od) command is used to display a binary file in octal, decimal, or hexadecimal notation.
Syntax:
od [OPTION]… [FILE]…
od displays a file's text in octal by default. First, let's check the contents of LinuxLove.txt using the cat command:
$ cat LinuxLove.txt
The Linux philosophy is 'Laugh in the face of danger'. Oops. Wrong One. 'Do it yourself'. Yes, that's it.
And also Linux is only free if your time has no value.
I love Linux.
Using the od command to display a file's text in octal:
$ od LinuxLove.txt
0000000 064124 020145 064514 072556 020170 064160 066151 071557
0000020 070157 074550 064440 020163 046047 072541 064147 064440
0000040 020156 064164 020145 060546 062543 067440 020146 060544
0000060 063556 071145 027047 047440 070157 027163 053440 067562
0000100 063556 047440 062556 020056 042047 020157 072151 074440
0000120 072557 071562 066145 023546 020056 062531 026163 072040
0000140 060550 023564 020163 072151 005056 067101 020144 066141
0000160 067563 046040 067151 074165 064440 020163 067157 074554
0000200 063040 062562 020145 063151 074440 072557 020162 064564
0000220 062555 064040 071541 067040 020157 060566 072554 027145
0000240 044412 066040 073157 020145 064514 072556 027170 000012
0000257
split Command
This command can split larger files into smaller ones depending on the criteria set by the command’s options.
Syntax:
split [OPTION]… [INPUT [PREFIX]]
Check the contents of the Errors.txt file with the cat command.
$ cat Errors.txt
grep: /etc/brlapi.key: Permission denied
grep: /etc/gshadow: Permission denied
grep: /etc/gshadow-: Permission denied
grep: /etc/shadow: Permission denied
grep: /etc/shadow-: Permission denied
grep: /etc/sudoers: Permission denied
Using the split command with the -l option to split the Errors.txt file by line count:
$ split -l 3 Errors.txt split
Using the ls -lh command, let's check the files obtained from the split command:
$ ls -lh split*
-rw-rw-r-- 1 frank frank 118 Apr 8 16:32 splitaa
-rw-rw-r-- 1 frank frank 113 Apr 8 16:32 splitab
Check the contents of each file using the cat command:
splitaa file
$ cat splitaa
grep: /etc/brlapi.key: Permission denied
grep: /etc/gshadow: Permission denied
grep: /etc/gshadow-: Permission denied
splitab file
$ cat splitab
grep: /etc/shadow: Permission denied
grep: /etc/shadow-: Permission denied
grep: /etc/sudoers: Permission denied
To split a file by its line count, employ the -l (lowercase L) option and provide the number of text file lines to attempt to put into each new file. In the above example, the original file has six text lines, so the first new file (splitaa) gets the first three lines of the original file and the second new file (splitab) gets the last three.
File-Formatting Commands
Often to understand the data within text files, you need to reformat the data in some way. There are a couple of simple utilities you can use to do this.
sort Command
The sort command organizes the output of a listing alphabetically, reverse alphabetically, or in random order.
Syntax:
sort [OPTION]… [FILE]…
Use the cat command to check the contents of the cities.txt file:
$ cat cities.txt
Berlin
London
New York
Nairobi
Moscow
Tokyo
Use the sort command to sort the cities.txt file contents in order:
$ sort cities.txt
Berlin
London
Moscow
Nairobi
New York
Tokyo
nl Command
The number lines (nl) command recreates a file's output with each line prepended by its line number.
Syntax:
nl [OPTION]… [FILE]…
Use the nl command to number the lines of the cities.txt file (non-blank lines only):
$ nl cities.txt
1 Berlin
2 London
3 New York
4 Nairobi
5 Moscow
6 Tokyo
If you would like all of a file's lines to be numbered, including blank ones, then you'll need to employ the -ba switch.
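A quick sketch of the difference, using an illustrative demo.txt file that contains a blank line:

```shell
# nl numbers non-blank lines only; nl -ba numbers every line,
# blank lines included.
printf 'first\n\nthird\n' > demo.txt
nl demo.txt
nl -ba demo.txt
```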
File-Viewing Commands
Viewing files on the CLI is a daily activity in Linux. To accomplish these tasks, use the following commands.
less Command
A pager is used to read through a large text file, viewing one page of text at a time and moving through the text at your own pace. The most commonly used pagers are the more and less utilities, but the more flexible of the two is less.
Using the less pager:
$ less /etc/passwd
usbmux:x:107:46:usbmux daemon,,,:/var/lib/usbmux:/usr/sbin/nologin
rtkit:x:108:114:RealtimeKit,,,:/proc:/usr/sbin/nologin
dnsmasq:x:109:65534:dnsmasq,,,:/var/lib/misc:/usr/sbin/nologin
cups-pk-helper:x:110:117:user for cups-pk-helper service,,,:/home/cups-pk-helper:/usr/sbin/nologin
speech-dispatcher:x:111:29:Speech Dispatcher,,,:/run/speech-dispatcher:/bin/false
avahi:x:112:118:Avahi mDNS daemon,,,:/var/run/avahi-daemon:/usr/sbin/nologin
kernoops:x:113:65534:Kernel Oops Tracking Daemon,,,:/:/usr/sbin/nologin
saned:x:114:120::/var/lib/saned:/usr/sbin/nologin
nm-openvpn:x:115:121:NetworkManager OpenVPN,,,:/var/lib/openvpn/chroot:/usr/sbin/nologin
hplip:x:116:7:HPLIP system user,,,:/run/hplip:/bin/false
whoopsie:x:117:122::/nonexistent:/bin/false
This command paginates the contents of a file and allows for navigation and search functionality. To exit the less pager, use the Q key.
head Command
The head command displays the first 10 lines of a file by default. With the -n switch, fewer or more lines can be displayed.
Syntax:
head [OPTION]… [FILE]…
Using the head command:
$ head /etc/passwd
root:x:0:0:root:/root:/bin/bash
daemon:x:1:1:daemon:/usr/sbin:/usr/sbin/nologin
bin:x:2:2:bin:/bin:/usr/sbin/nologin
sys:x:3:3:sys:/dev:/usr/sbin/nologin
sync:x:4:65534:sync:/bin:/bin/sync
games:x:5:60:games:/usr/games:/usr/sbin/nologin
man:x:6:12:man:/var/cache/man:/usr/sbin/nologin
lp:x:7:7:lp:/var/spool/lpd:/usr/sbin/nologin
mail:x:8:8:mail:/var/mail:/usr/sbin/nologin
news:x:9:9:news:/var/spool/news:/usr/sbin/nologin
Display fewer lines with the head command:
$ head -n 4 /etc/passwd
root:x:0:0:root:/root:/bin/bash
daemon:x:1:1:daemon:/usr/sbin:/usr/sbin/nologin
bin:x:2:2:bin:/bin:/usr/sbin/nologin
sys:x:3:3:sys:/dev:/usr/sbin/nologin
tail Command
The tail command displays the last 10 lines of a file by default. With the -n switch, fewer or more lines can be displayed. The -f option is used to follow the output of a text file as new data is being written to it.
Syntax:
tail [OPTION]… [FILE]…
Using the tail command:
$ tail /etc/passwd
systemd-resolve:x:998:998:systemd Resolver:/:/usr/sbin/nologin
systemd-timesync:x:997:997:systemd Time Synchronization:/:/usr/sbin/nologin
systemd-coredump:x:996:996:systemd Core Dumper:/:/usr/sbin/nologin
sshd:x:123:65534::/run/sshd:/usr/sbin/nologin
libvirt-qemu:x:64055:105:Libvirt Qemu,,,:/var/lib/libvirt:/usr/sbin/nologin
libvirt-dnsmasq:x:124:136:Libvirt Dnsmasq,,,:/var/lib/libvirt/dnsmasq:/usr/sbin/nologin
mysql:x:995:1001::/home/mysql:/bin/sh
pilot:x:1001:1002:,,,:/home/pilot:/bin/bash
_rpc:x:125:65534::/run/rpcbind:/usr/sbin/nologin
statd:x:126:65534::/var/lib/nfs:/usr/sbin/nologin
Display fewer lines with the tail command:
$ tail -n 3 /etc/passwd
pilot:x:1001:1002:,,,:/home/pilot:/bin/bash
_rpc:x:125:65534::/run/rpcbind:/usr/sbin/nologin
statd:x:126:65534::/var/lib/nfs:/usr/sbin/nologin
One of the most useful tail features is its ability to watch log files. Log files typically have new messages appended to the bottom, and watching those messages as they arrive is very handy. Use the -f (or --follow) switch on the tail command and provide the log filename to watch as the command's argument.
Watching a log file with the tail command:
$ sudo tail -f /var/log/auth.log
[sudo] password for frank:
Apr 8 21:09:01 frank CRON[1825356]: pam_unix(cron:session): session opened for user root by (uid=0)
Apr 8 21:09:01 frank CRON[1825356]: pam_unix(cron:session): session closed for user root
Apr 8 21:15:01 frank CRON[1840320]: pam_unix(cron:session): session opened for user root by (uid=0)
Apr 8 21:15:01 frank CRON[1840320]: pam_unix(cron:session): session closed for user root
Apr 8 21:17:01 frank CRON[1845302]: pam_unix(cron:session): session opened for user root by (uid=0)
Apr 8 21:17:01 frank CRON[1845302]: pam_unix(cron:session): session closed for user root
Apr 8 21:21:43 frank sudo: pam_unix(sudo:auth): Couldn't open /etc/securetty: No such file or directory
Apr 8 21:21:47 frank sudo: pam_unix(sudo:auth): Couldn't open /etc/securetty: No such file or directory
Apr 8 21:21:47 frank sudo: frank : TTY=pts/0 ; PWD=/home/frank/Documents ; USER=root ; COMMAND=/usr/bin/tail -f /var/log/auth.log
Apr 8 21:21:47 frank sudo: pam_unix(sudo:session): session opened for user root by (uid=0)
To end the monitoring session, use Ctrl+C.
File-Summarizing Commands
Summary information is handy to have when analyzing problems and understanding your files. Several utilities covered in this section will help you in summarizing activities.
wc Command
The wc command is short for word count, but depending on the parameters you use, it will count characters, words, and lines.
Syntax:
wc [OPTION]… [FILE]…
Using the wc command:
$ wc cities.txt
6 7 44 cities.txt
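The three numbers above are the line, word, and byte counts ("New York" counts as two words). Each can be requested individually; a sketch that recreates the same cities.txt so it is self-contained:

```shell
# Recreate the sample file, then count lines, words, and bytes
# separately with the -l, -w, and -c options.
printf 'Berlin\nLondon\nNew York\nNairobi\nMoscow\nTokyo\n' > cities.txt
wc -l cities.txt   # line count:  6
wc -w cities.txt   # word count:  7
wc -c cities.txt   # byte count: 44
```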
cut Command
The cut command prints columns of text files as fields based on a file's character delimiter. It helps to quickly extract small data sections from a large text file.
Syntax:
cut OPTION… [FILE]…
Let's use the head command to view the first four lines of the /etc/passwd file:
$ head -n 4 /etc/passwd
root:x:0:0:root:/root:/bin/bash
daemon:x:1:1:daemon:/usr/sbin:/usr/sbin/nologin
bin:x:2:2:bin:/bin:/usr/sbin/nologin
sys:x:3:3:sys:/dev:/usr/sbin/nologin
Using the cut command:
$ cut -d ":" -f 1,7 /etc/passwd
root:/bin/bash
daemon:/usr/sbin/nologin
bin:/usr/sbin/nologin
sys:/usr/sbin/nologin
This text file employs colons (:) to delimit the fields within each record. The cut command designates the colon delimiter using the -d option; notice the colon is encased in quotation marks to avoid unexpected results. The -f option specifies that only fields 1 (username) and 7 (shell) should be displayed.
uniq Command
The uniq command is used to list (and count) matching strings. The uniq utility will find repeated text lines only if they come right after one another.
Let's use the cat command to view the contents of the cities.txt file:
$ cat cities.txt
Berlin
Berlin
London
Nairobi
New York
Nairobi
Moscow
Tokyo
Tokyo
Using the uniq command:
$ uniq cities.txt
Berlin
London
Nairobi
New York
Nairobi
Moscow
Tokyo
The cat command output shows three sets of repeated lines in the cities.txt file: Berlin, Nairobi, and Tokyo. Because the uniq utility recognizes only repeated lines that directly follow one another, just the duplicate Berlin and Tokyo lines are removed from the display; the two Nairobi lines are both still shown.
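To collapse all duplicates regardless of position, sort the file first so that repeated lines become adjacent; uniq's -c option additionally prefixes each line with its count. A sketch that recreates the same cities.txt:

```shell
# Sorting groups every duplicate together, so uniq -c can collapse
# and count them in one pass.
printf 'Berlin\nBerlin\nLondon\nNairobi\nNew York\nNairobi\nMoscow\nTokyo\nTokyo\n' > cities.txt
sort cities.txt | uniq -c
```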
md5sum Command
It calculates the MD5 hash value of a file. It can also verify a file against an existing hash value to ensure the file's integrity.
Using the md5sum utility:
$ md5sum cities.txt
38316cae26b84cab5e397d952e0dc4ea cities.txt
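The verification workflow uses the -c option: save the hash to a file, then check the file against it later. A sketch with an illustrative sample.txt file:

```shell
# Record a checksum, then verify; md5sum -c reports "OK" while the
# file is unchanged and reports a failure if it is modified.
echo "hello" > sample.txt
md5sum sample.txt > sample.md5
md5sum -c sample.md5
```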
sha256sum Command
It calculates the SHA-256 hash value of a file. It can also verify a file against an existing hash value to ensure the file's integrity.
Using the sha256sum utility:
$ sha256sum cities.txt
cc2c2cc587c70535f37f836c35f12c7f12efb33c10f4c1ab17f27c7ac24f6651 cities.txt
sha512sum Command
It calculates the SHA-512 hash value of a file. It can also verify a file against an existing hash value to ensure the file's integrity.
Using the sha512sum utility:
$ sha512sum cities.txt
55e197c4fbd033faf0316bee49cd0240d0cc759a7a65d957e9076e8936c05fca0a5e98a1f2d305451d64f8d4d15ab84c273b77dd660834336d7edd97864249ec cities.txt
The sha512sum utility uses the SHA-512 algorithm, the strongest of the three for security purposes; it is typically employed to hash salted passwords in the /etc/shadow file on Linux.
Processing Text Using Regular Expressions
A regular expression is a pattern template you define for a utility such as grep, which then uses the pattern to filter text. Employing regular expressions along with text-filtering commands expands your mastery of the Linux command line.
grep Command
The grep command is powerful in its use of regular expressions, which help with filtering text files.
Syntax:
grep [OPTION] PATTERN [FILE…]
Using the grep command to search a file:
$ grep frank /etc/passwd
frank:x:1000:1000:frank_bett,,,:/home/frank:/usr/bin/zsh
The grep command returns each file record (line) that contains an instance of the PATTERN, which in this case was the word frank.
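The pattern above is a literal string, but grep also accepts real regular expressions. A sketch using anchors and alternation against an illustrative users.txt file (-E enables extended regular expressions):

```shell
# Match lines that begin with either "root" or "sys" followed by a
# colon; ^ anchors the match to the start of the line.
printf 'root:x:0:0\ndaemon:x:1:1\nsys:x:3:3\n' > users.txt
grep -E '^(root|sys):' users.txt
```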
Conclusion
This marks the end of our guide on processing text streams using filters on Linux. For every command used in this guide, refer to its man page for further usage options and understanding. Stay tuned for our next LPIC-101 guides.
You can also check: