Dive into the world of Linux shell scripting with this hands-on guide. If you’re comfortable using the command line on Unix or Linux but haven’t fully explored Bash, this book is for you. It’s designed for programmers familiar with languages like Python, JavaScript, or PHP who want to make the most of shell scripting.
This isn’t just another theory-heavy book—you’ll learn by doing. Each chapter builds on the last, taking you from shell basics to writing practical scripts that solve real-world problems. With nearly a hundred interactive labs, you’ll gain hands-on experience in automation, system administration, and troubleshooting.
While Bash is the primary focus, you'll also get a look at Z Shell and PowerShell, expanding your skills and adaptability. From mastering command redirection and pipelines to writing scripts that work across different Unix-like systems, this book equips you for real-world Linux challenges.
By the end, you'll be equipped to write efficient shell scripts that streamline your workflow and improve system automation.
The Ultimate Linux Shell Scripting Guide
Automate, Optimize, and Empower tasks with Linux Shell Scripting
Donald A. Tevault
The Ultimate Linux Shell Scripting Guide
Copyright © 2024 Packt Publishing
All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, without the prior written permission of the publisher, except in the case of brief quotations embedded in critical articles or reviews.
Every effort has been made in the preparation of this book to ensure the accuracy of the information presented. However, the information contained in this book is sold without warranty, either express or implied. Neither the author, nor Packt Publishing or its dealers and distributors, will be held liable for any damages caused or alleged to have been caused directly or indirectly by this book.
Packt Publishing has endeavored to provide trademark information about all of the companies and products mentioned in this book by the appropriate use of capitals. However, Packt Publishing cannot guarantee the accuracy of this information.
Senior Publishing Product Manager: Reshma Raman
Acquisition Editor – Peer Reviews: Gaurav Gavas
Project Editor: Meenakshi Vijay
Content Development Editor: Soham Amburle
Copy Editor: Safis Editing
Technical Editors: Aneri Patel and Kushal Sharma
Proofreader: Safis Editing
Indexer: Pratik Shirodkar
Presentation Designer: Rajesh Shirsath
Developer Relations Marketing Executive: Priyadarshini Sharma
First published: October 2024
Production reference: 3050126
Published by Packt Publishing Ltd.
Grosvenor House
11 St Paul’s Square
Birmingham
B3 1RB, UK.
ISBN 978-1-83546-357-4
www.packt.com
To my loving friends and family
– Donald A. Tevault
Donald A. Tevault, but you can call him Donnie, started with Linux in 2006 and has been working with it ever since. In that time, Donnie has created training documentation for Linux administration, bash scripting, and Nagios administration. He has served as the Linux consultant for an Internet of Things security firm, and operates the BeginLinux Guru channel on YouTube. Donnie’s other books include Mastering Linux Security and Hardening and Linux Service Management Made Easy with systemd.
I’d like to thank the team at Packt Publishing for guiding this book through to completion, and my tech reviewer Jason for his invaluable suggestions.
Jason Willson has been working in the tech industry for over 20 years, since his first job at the help desk at his alma mater, Grove City College. He was first introduced to Linux in 2007 at a startup in Boston, and has worked with it professionally and personally ever since. He’s used command-line and shell scripting techniques for a variety of tasks relating to data analysis, systems administration, and DevOps. He currently works as a DevOps Engineer at Carnegie Mellon University. In addition to reviewing this book, he has also reviewed another Packt title, Linux Command Line and Shell Scripting Techniques by Vedran Dakic.
I’d like to thank the incredible LinkedIn community for making this connection with Packt Publishing possible. I’d also like to thank all the coworkers, classmates, and mentors (personal, professional, and academic) who helped to shape me into who I am today. And last but not least, I’d like to thank my amazing wife Eva, who has been a constant support to me in reviewing this book despite such a hectic work schedule.
Read this book alongside other users, Linux experts, and the author himself.
Ask questions, provide solutions to other readers, chat with the author via Ask Me Anything sessions, and much more. Scan the QR code or visit the link to join the community.
https://packt.link/SecNet
Welcome to The Ultimate Linux Shell Scripting Guide! This book, which is ideal for both Linux beginners and more advanced Linux administrators, will guide you through the shell script creation process. We’ll begin with basic command-line usage and will progress through more advanced concepts in every succeeding chapter. You’ll see how to build scripts that can help you automate repetitive administrative tasks, as well as many other cool things. We’ll primarily concentrate on bash scripting throughout most of the book. Later, we’ll show you how to make your scripts portable so that they can run on legacy Unix systems that can’t run bash. After chapters on shell script debugging and shell script security, we’ll wrap up with introductions to the Z Shell and PowerShell.
This book is appropriate for anyone who needs to master the concepts of shell scripting. Linux beginners can benefit, because it can help them master the concepts that will be covered on the CompTIA Linux+/Linux Professional Institute exam. More advanced Linux administrators can benefit because it will show them the more advanced concepts that they need to build really useful, practical shell scripts.
Chapter 1, Getting Started with the Shell, this chapter covers the basics of operating system shells that can be found on Linux and Unix-like systems. The reader will need to know these principles in order to understand principles that will be presented in later chapters.
Chapter 2, Interpreting Commands, there are five things that an operating system shell will do for us. These include interpreting commands, setting variables, enabling pipelines, allowing input/output redirection, and allowing customization of the user’s working environment. In this chapter, we’ll look at how shells interpret a user’s commands.
Chapter 3, Understanding Variables and Pipelines, in this chapter, we’ll look at the next two things that an operating system shell does for us: allowing us to set variables and to use command pipelines. There’s not that much to say about either of these topics, which is why we’re combining them into one chapter.
Chapter 4, Understanding Input/Output Redirection, in this chapter, we’ll look at how to send the text output of a command to somewhere other than the terminal, which is the default output device. We’ll then look at how to make a command bring in text from somewhere other than the keyboard, which is the default input device.
Finally, we’ll look at how to send error messages to somewhere other than the terminal.
Chapter 5, Customizing the Environment, in this chapter, we’ll look at the various configuration files for the various shell environments. We’ll look at how to customize these configuration files, and how to set certain environmental options from the command-line.
Chapter 6, Text Stream Filters – Part 1, many times, an administrator will need to write a shell script that will retrieve text information from an external source, format it, and create a report. In this chapter, we’ll introduce the concept of text stream filters, which can help with this process. Also, knowing about these text stream filters can help you pass certain Linux certification exams, such as the LPI/Linux+ exam. We will then show you how to use several of these filters.
Chapter 7, Text Stream Filters – Part 2, in this chapter, we’ll continue our exploration of text stream filters.
Chapter 8, Basic Shell Script Construction, in this chapter, we’ll explain the basic structure of a shell script, and will use some of the text stream filters from the previous chapters to create simple scripts. We’ll also look at some basic programming constructs that are common to all programming languages, and show you how to use them.
Chapter 9, Filtering Text with grep, sed, and Regular Expressions, in this chapter, you’ll learn about the concept of regular expressions, and how to use them with grep and sed to filter or manipulate text. These techniques can not only help you find certain text, but can also help automate the creation of reports and the editing of multiple text files at once.
Chapter 10, Understanding Functions, functions are an important part of every programming language, because they make it easy for a programmer to reuse a block of code in numerous programs, or in numerous places within one single program. The programmer can pass parameters to a function, have the function operate on those parameters, and pass back the results to the main program.
Chapter 11, Performing Mathematical Operations, the various operating system shells all have means of performing mathematical operations either from the command-line, or from within a shell script. In this chapter, we’ll look at how to perform operations with both integer and floating point math.
Chapter 12, Automating Scripts with here Documents and expect, although it’s easy to have a shell script pull data out of a separate text file, it’s sometimes handier to store the data within the shell script itself. We’ll do that using a “here” document. In this chapter, you’ll learn how to create and use “here” documents. You’ll also see how to automate certain scripts with the expect utility.
Chapter 13, Scripting with ImageMagick, ImageMagick is a text-mode program that is used to edit, manipulate, and view graphical image files. In this chapter, you’ll learn how to automate the processing of images by using ImageMagick commands within shell scripts.
Chapter 14, Using awk–Part 1, this chapter covers awk, which is a tool that can extract specific text from text files, and automate the creation of reports and databases. Since awk is a full-blown programming language in its own right, we won’t be covering it in depth here. Instead, we’ll give you enough information so that you can create awk “one-liners” that can be used within shell scripts.
Chapter 15, Using awk–Part 2, this is a continuation of the previous chapter, in which we’ll cover the more advanced concepts of scripting with awk.
Chapter 16, Creating User Interfaces with yad, dialog, and xdialog, so far, we’ve only looked at shell scripts that run strictly from the command-line. And indeed, that’s how most people use them, and is what most people think about when they think about shell scripts. But, it’s also possible to create shell scripts that offer a user interface. In this chapter, we’ll use yad to create graphical user interfaces, and dialog to create ncurses-style interfaces.
Chapter 17, Using Shell Script Options with getopts, often, an administrator will need to pass both arguments and options to a shell script. Passing arguments, the objects upon which a script will operate, is easy. To also pass options, which modify how the script will operate, requires another type of operator. In this chapter, you’ll learn how to use getopts to pass options to a script.
Chapter 18, Shell Scripting for Security Professionals, in this chapter, you’ll learn how to either create shell scripts or search for existing shell scripts that can help security administrators perform their jobs. We’ll also look at how to modify or improve existing shell scripts to meet specific needs of security administrators.
Chapter 19, Shell Script Portability, large organizations, such as large government agencies or large corporations, might have a diverse mix of Linux, Unix, and Unix-like machines. Sometimes, it’s handy to write shell scripts that can automatically detect the type of system on which they’re running, and run the appropriate code for each type of system. In this chapter, we’ll look at several methods for enhancing script portability.
Chapter 20, Shell Script Security, scripting errors can inadvertently expose sensitive data, or allow someone to perform unauthorized activities on a system. In this chapter, we’ll look at ways to help the reader write shell scripts that are as secure as they can possibly be.
Chapter 21, Debugging Shell Scripts, shell scripts can have bugs, the same as with any other programming language. Sometimes, the bugs are easy to find, and sometimes they’re not. In this chapter, we’ll look at various methods that can help a busy administrator debug shell scripts that aren’t working properly.
Chapter 22, Introduction to Z Shell Scripting, the Z Shell, or zsh, is an alternate shell that can be used in place of bash. It’s mainly used in the same manner as bash, but it also has enhancements that bash doesn’t have. In this chapter, we’ll look at these enhancements, and also at some scripting tricks that you can’t do with bash.
Chapter 23, Using PowerShell on Linux, PowerShell was created by Microsoft for use on Windows operating systems back in 2006. In 2016, Microsoft announced that they had open-sourced PowerShell, and were making it available for Linux and macOS, as well as for Windows. In this chapter, we’ll look at how PowerShell can be beneficial for Linux administrators, how to install it, and how to use it.
Since the book begins with the very basics of Linux and Unix command-line usage, the reader really just needs to be comfortable with the idea of setting up VirtualBox and installing Linux, FreeBSD, and OpenIndiana virtual machines.
VirtualBox is a free download that you can get from here: https://www.virtualbox.org/
To run VirtualBox, you’ll need a machine with a CPU that is capable of virtualization. Most modern CPUs have that capability, with the exception of certain Intel Core i3 and Core i5 models. (That’s because they lack the hardware acceleration that’s required for virtualization.) Also, you’ll have to ensure that virtualization is enabled in your computer’s BIOS.
For the demos, we’ll be using Fedora, Debian, Ubuntu, FreeBSD, and OpenIndiana virtual machines. Here’s where you can download the installation images:
Fedora: https://fedoraproject.org/
Debian: https://www.debian.org/
Ubuntu: https://ubuntu.com/
FreeBSD: https://www.freebsd.org/
OpenIndiana: https://openindiana.org/
In all cases, you’ll need to create a normal user account that has full sudo privileges. That happens automatically with Ubuntu and OpenIndiana during installation. With Debian and Fedora, that will happen automatically if you omit creating a root user password during installation.
For FreeBSD, things are a bit different. That’s because the FreeBSD installer will have you create a password for the root user, and sudo won’t be installed. So, here’s the procedure for installing FreeBSD.
1. When you get to the installer section that has you create your own user account, you’ll see:
Login group is your_username. Invite your_username into other groups.
Respond by typing wheel, in order to add yourself to the wheel group.
2. After the installation has completed, log into the root user account, using the password that you created during installation.
3. Install the sudo package by doing: pkg install sudo
4. Configure sudo so that members of the wheel group have full sudo privileges. Begin by entering the command: visudo
5. Scroll down to where you see this line: # %wheel ALL=(ALL:ALL) ALL
6. Remove the # and the leading blank space from in front of this line.
7. Save the file and exit.
8. Log out from the root user's account, and log back in with your own account.
9. When you need to perform an administrative command, you can now use sudo, as you would on any Linux distro.
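For reference, the visudo edit described above amounts to uncommenting a single line. Here is a sketch of the relevant sudoers fragment (when sudo is installed from pkg on FreeBSD, this file typically lives at /usr/local/etc/sudoers; treat that path as an assumption and always edit it through visudo rather than directly):

```
# Before the edit (disabled by the leading # and space):
# %wheel ALL=(ALL:ALL) ALL

# After the edit (members of wheel may run any command via sudo):
%wheel ALL=(ALL:ALL) ALL
```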
Next, you’ll need to install bash on FreeBSD.
Since bash doesn’t come installed on FreeBSD by default, you’ll need to install it yourself. Here’s the procedure:
Install bash with this command:
sudo pkg install bash
Create a symbolic link to the bash executable, like this:
sudo ln -s /usr/local/bin/bash /bin/bash
The code bundle for the book is hosted on GitHub at https://github.com/PacktPublishing/The-Ultimate-Linux-Shell-Scripting-Guide.git. We also have other code bundles from our rich catalog of books and videos available at https://github.com/PacktPublishing/. Check them out!
We also provide a PDF file that has color images of the screenshots/diagrams used in this book. You can download it here: https://packt.link/gbp/9781835463574.
There are a number of text conventions used throughout this book.
CodeInText: Indicates code words in text, database table names, folder names, filenames, file extensions, pathnames, dummy URLs, user input, and Twitter handles. For example: “Add the new functions to the /etc/bashrc file.”
donnie@opensuse:~> git clone https://github.com/PacktPublishing/The-Ultimate-Linux-Shell-Scripting-Guide.git
Bold: Indicates a new term, an important word, or words that you see on the screen. For instance, words in menus or dialog boxes appear in the text like this. For example: “First, let’s see how many processes are in either the Running state or the Zombie state.”
Warnings or important notes appear like this.
Tips and tricks appear like this.
Feedback from our readers is always welcome.
General feedback: Email [email protected] and mention the book’s title in the subject of your message. If you have questions about any aspect of this book, please email us at [email protected].
Errata: Although we have taken every care to ensure the accuracy of our content, mistakes do happen. If you have found a mistake in this book, we would be grateful if you reported this to us. Please visit http://www.packtpub.com/submit-errata, click Submit Errata, and fill in the form.
Piracy: If you come across any illegal copies of our works in any form on the internet, we would be grateful if you would provide us with the location address or website name. Please contact us at [email protected] with a link to the material.
If you are interested in becoming an author: If there is a topic that you have expertise in and you are interested in either writing or contributing to a book, please visit http://authors.packtpub.com.
Once you’ve read The Ultimate Linux Shell Scripting Guide, First Edition, we’d love to hear your thoughts! Please click here to go straight to the Amazon review page for this book and share your feedback.
Your review is important to us and the tech community and will help us make sure we’re delivering excellent quality content.
This book comes with free benefits to support your learning. Activate them now for instant access (see the “How to Unlock” section for instructions).
Here’s a quick overview of what you can instantly unlock with your purchase:
PDF and ePub Copies
Next-Gen Web-Based Reader
Access a DRM-free PDF copy of this book to read anywhere, on any device.
Multi-device progress sync: Pick up where you left off, on any device.
Use a DRM-free ePub version with your favorite e-reader.
Highlighting and notetaking: Capture ideas and turn reading into lasting knowledge.
Bookmarking: Save and revisit key sections whenever you need them.
Dark mode: Reduce eye strain by switching to dark or sepia themes.
Scan the QR code (or go to packtpub.com/unlock). Search for this book by name, confirm the edition, and then follow the steps on the page.
Note: Keep your invoice handy. Purchases made directly from Packt don’t require one.
CloudPro is a weekly newsletter for cloud professionals who want to stay current on the fast-evolving world of cloud computing, DevOps, and infrastructure engineering.
Every issue delivers focused, high-signal content on topics like:
AWS, GCP & multi-cloud architecture
Containers, Kubernetes & orchestration
Infrastructure as Code (IaC) with Terraform, Pulumi, etc.
Platform engineering & automation workflows
Observability, performance tuning, and reliability best practices
Whether you’re a cloud engineer, SRE, DevOps practitioner, or platform lead, CloudPro helps you stay on top of what matters, without the noise.
Scan the QR code to join for free and get weekly insights straight to your inbox:
https://packt.link/cloudpro
Before we can talk about shell scripting, we need to know what a shell is and what kinds of shells are available for Linux, Unix, and Unix-like operating systems. We’ll also talk about other important topics that will help get you started in the wide, wonderful world of shell scripting.
Topics in this chapter include:
Understanding shells
Finding help with shell commands
Using a text editor
Understanding compiled versus interpreted programming
Understanding root and sudo privileges
If you’re ready, let’s get started on this important journey. And, always remember to have some fun along the way.
Your purchase includes a free PDF copy of this book along with other exclusive benefits. Check the Free Benefits with Your Book section in the Preface to unlock them instantly and maximize your learning experience.
So, you’re scratching your head and saying, “What is a shell, and why should I care?” Well, a shell is a program that acts as an intermediary between the user and the operating system kernel. A user types commands into the shell, which passes them to the kernel for processing. The output is then presented to the user via the computer terminal, which can also be referred to as the screen. The most common shell on Linux systems is bash, but the Z shell (zsh) has been gaining popularity in recent years. (I’ll explain why in Chapter 22, Introduction to Z Shell Scripting.) You’ll find bash as the default shell on most Linux distros and certain Unix-like distros such as OpenIndiana, and zsh as the default on Kali Linux.
If you are brand new to the wild, wonderful world of Linux and its Unix or Unix-like cousins, you might be wondering what a distro is. Well, unlike Windows and macOS, which are proprietary and controlled by a single company, Linux and its cousins are primarily open source software, which means that anyone can take the source code and create their own implementations, or distributions. Red Hat Enterprise Linux, Fedora, and Ubuntu are examples of Linux distributions, and OpenIndiana and FreeBSD are examples of Unix-like distributions. But, we hard-core geeks rarely utter the word distribution, and instead just say distro, for short.
Also, the reason that I differentiate between Unix and Unix-like distros has to do with legal reasons that date back to the 1980s. This involves a rather complicated mess that I would rather not go into here. Suffice it to say that the creators of distros such as FreeBSD are not allowed to refer to their creations as Unix, even though they are mostly functionally equivalent. But, they can say that their creations are Unix-like.
The newest versions of macOS also have zsh set as the default shell. Fortunately, much of what you’ll learn about bash also works on zsh. The main difference is that zsh has a few cool features that bash doesn’t have. (Again, I’ll explain all about that in Chapter 22.) PowerShell, which originally was only available for Microsoft Windows operating systems, has also been available for Linux and macOS since 2016. PowerShell is a whole different animal, but you might find it quite useful, as you should see when we get to Chapter 23, Using PowerShell on Linux.
It’s common to hear people refer to bash as the bash shell. But, bash is short for Bourne Again Shell. So, when you say bash shell, you’re really saying Bourne Again Shell Shell, which is a bit awkward. This is the same as when people talk about going to the ATM machine to withdraw some money. What they’re really saying is that they’re going to the Automatic Teller Machine Machine, which is also awkward.
And, don’t even get me started on the people who talk about hot water heaters. I mean, if the water is already hot, why heat it?
On the other hand, if you find that you still need to say bash shell so that people will know what you’re talking about, I’ll understand and won’t condemn you for it. In fact, you might even see me do that on occasion.
The coolest thing about modern operating system shells is that they’re much more than just an interface tool. They’re also full-blown programming environments with many of the same programming constructs as more complex programming languages, such as Pascal, C, or Java. Systems administrators can make their jobs much easier by using shell scripts to automate complex, repetitive tasks.
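To give a small, concrete taste of those programming constructs before we dive in (this snippet and its names are illustrative, not an example from the book itself), here is a variable and a loop, two features that shells share with languages like Pascal, C, or Java:

```shell
#!/bin/bash
# A variable holds a value; a for loop repeats a block of commands.
greeting="Hello"
for name in Alice Bob; do
    echo "$greeting, $name!"
done
```

Running this prints one greeting line per name, which hints at how a script can repeat an administrative task over a list of servers, files, or users.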
When you log into a text-mode Linux or Unix server, you’ll be presented with a black screen and some text, which looks like this:
Figure 1.1: Plain bash on a text-mode Debian Linux machine
This is the unadorned, plain-jane shell. Machines with desktop environments installed will interface with the shell via a terminal emulator, which will look something like this:
Figure 1.2: A terminal emulator that interfaces with bash on an OpenIndiana machine
The name of the terminal emulator will differ from one desktop environment to the next, but all do the same job. The advantage of using a terminal emulator is that you’ll have the luxury of using scroll bars, customizing the display, and using copy-and-paste for the command-line.
In any case, you can see which shell you’re using by typing:
donnie@fedora:~$ echo $SHELL
/bin/bash
donnie@fedora:~$
In this case, you see that you’re using bash.
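One caveat worth knowing: the SHELL variable records your login shell, not necessarily the shell you’re running at this moment (for example, if you start zsh from within a bash session, SHELL still says bash). A quick sketch for checking the live process, assuming a ps that supports the -p and -o options, as on Linux and most Unix-likes:

```shell
# The login shell recorded for your account:
echo "$SHELL"

# The shell process you're actually running in right now:
ps -p $$ -o comm=
```

Here, $$ is the process ID of the current shell, so ps reports the name of the shell that is really executing your commands.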
No matter how much of an expert you think you are, there will still be times when you’ll need to look up some bit of information. With Linux, Unix, and Unix-like operating systems, there are several options for that.
Manual pages, or man pages for short, have been built into Unix-like operating systems since almost forever. To use a man page, just enter man, followed by the name of the command, configuration file, or system component for which you seek information. For example, you could find out how to use the ls command like this:
man ls
Most of the time, the man command will open a man page in the less pager. (Some Unix implementations might use the more pager instead, but I haven’t found any recent ones that do.) Either way, you’ll be able to scroll through the man page or perform keyword searches within the page to find the information that you seek.
The man pages are divided into sections that each correspond to a different category. On most Unix-like and Linux systems, there are eight main categories, more commonly referred to as sections, which are as follows:
Section number
Purpose
1
This section contains information about commands that can be used by any unprivileged user.
2
This section contains information about system calls, which are mainly of interest to software developers.
3
In this section, you’ll find information about library functions, which will also mainly be of interest to software developers.
4
If you’ve ever wanted to find information about the device files in the /dev/ directory, this is the place to look. This section also contains information about device drivers.
5
Here you’ll find information about the various configuration and system files on your system.
6
This is for information about games and screensavers. There’s normally not much here.
7
This is for information about miscellaneous things that don’t fit neatly into any of the other categories.
8
This is for information about administrative commands and system daemons.
Table 1.1: Describing the man page sections
You’ll see the subdirectories that contain these man page files in the /usr/share/man/ directory. You also might see some subdirectories with names like man0p, man5p, or man8x. These subdirectories contain certain special-purpose man pages, which will differ on different Linux distros.
A lot of times, you won’t need to think about these sections, because the man command will pull up the proper man page for you. Other times, you will need to pay attention to these sections, because many key words for which you’ll search can be found in multiple sections. For example, here on the Fedora workstation that I’m using to write this, there are two man pages for printf. There are two ways to find them. First, you can use the man -aw command, like this:
[donnie@fedora ~]$ man -aw printf
/usr/share/man/man1/printf.1.gz
/usr/share/man/man3/printf.3.gz
[donnie@fedora ~]$
You can also use the whatis command, like this:
[donnie@fedora ~]$ whatis printf
printf (1)           - format and print data
printf (3)           - formatted output conversion
[donnie@fedora ~]$
Note that whatis is a synonym for man -f. You’ll get the same results with either command, but my own preference is to use whatis.
So, we have a printf man page in Section 1, which means that we have a normal user command that’s called printf. We also see a printf man page in Section 3, which means that there’s a library function that’s called printf. If you enter man printf, you’ll see the man page from Section 1. You’ll see that in the first line of the man page, which will look like this:
PRINTF(1)                     User Commands                     PRINTF(1)
If you instead want to see the man page from Section 3, you’ll need to specify that in your command, like this:
man 3 printf
To broaden your search for all man pages that contain printf in either the title or the description of the man page, even if it’s embedded into another text string, use either apropos or man -k, like this:
[donnie@fedora ~]$ apropos printf
asprintf (3)          - print to allocated string
BIO_printf (3ossl)    - formatted output to a BIO
BIO_snprintf (3ossl)  - formatted output to a BIO
BIO_vprintf (3ossl)   - formatted output to a BIO
BIO_vsnprintf (3ossl) - formatted output to a BIO
curl_mprintf (3)      - formatted output conversion
dprintf (3)           - formatted output conversion
tpm2_print (1)        - Prints TPM data structures
fprintf (3)           - formatted output conversion
fwprintf (3)          - formatted wide-character output conversion
printf (1)            - format and print data
printf (3)            - formatted output conversion
. . .
[donnie@fedora ~]$
Again, either command will give you the same output, but my own preference has always been to use apropos.
Most of the time, your Linux system does a good job of keeping the man page index updated. Once in a while though, you’ll need to do it manually, like this:
[donnie@fedora ~]$ sudo mandb
[sudo] password for donnie:
Purging old database entries in /usr/share/man...
Processing manual pages under /usr/share/man...
Purging old database entries in /usr/share/man/ca...
Processing manual pages under /usr/share/man/ca...
. . .
. . .
Processing manual pages under /usr/local/share/man...
0 man subdirectories contained newer manual pages.
0 manual pages were added.
0 stray cats were added.
0 old database entries were purged.
[donnie@fedora ~]$
Okay, that about does it for the man page system. Let’s talk about the info system.
The info page system is newer, and was invented by Richard M. Stallman as part of the GNU Project. The unique part about it is that each info page contains hyperlinks that can lead you to additional pages of information. For example, to obtain information about the info system, enter info info. This info page contains a menu, which looks something like this:
* Menu:
* Stand-alone Info::       What is Info?
* Invoking Info::          Options you can pass on the command line.
* Cursor Commands::        Commands which move the cursor within a node.
. . .
* Variables::              How to change the default behavior of Info.
* Colors and Styles::      Customize the colors used by Info.
* Custom Key Bindings::    How to define your own key-to-command bindings.
* Index::                  Global index.
Each underlined item you see is a hyperlink to another page. With your cursor keys, move the cursor to the hyperlink that you want to see, and hit the Enter key. To see an info page for a specific command, such as ls, just do this:
info ls

If you need help with navigating through the info pages, just hit the H key to bring up a navigation menu.
And, that’s about it for the info pages. Let’s talk about on-line documentation.
The Linux Documentation Project has been around since almost forever, and is an invaluable resource. The best part about it is the Guides section, where you’ll find free-of-charge, full-length books about Linux and bash that you can download in a variety of formats. They’re all quite old, with the newest one having been last updated in 2014. For the Bash Guide for Beginners book and the Advanced Bash-Scripting book that you’ll find there, that doesn’t matter. The concepts in those two books are eternal, and haven’t really changed over the years. To see these books, go to https://tldp.org/guides.html.
If all else fails, just use your favorite search engine to find what you need to know about either scripting in general, or scripting on a particular operating system. You’ll find plenty of help, such as blog posts, YouTube videos, and official documentation. There are plenty of Linux-specific websites that offer help on various things, and it’s quite simple to find them.
Next, let’s talk about text editors.
To create your shell scripts, you’ll need a text editor that’s designed for Linux and Unix systems. You have plenty of choices, and which one you choose will depend upon several criteria:
Are you editing on a text-mode machine or on a desktop machine?
What features do you need?
What is your own personal preference?

Text-mode text editors can be used on machines that don’t have a graphical user interface installed. The two most common text-mode text editors are nano and vim. The nano editor is installed by default on pretty much every Linux distro, and is quite easy to use. To use it, just type nano, followed by the name of the file that you want to either edit or create. At the bottom of the screen, you’ll see the list of available commands. To invoke a command, press the CTRL key, followed by the letter key that corresponds to the desired command.
The downside of using nano is that it doesn’t have the full range of features that you might want in a programmers’ text editor. You can see here that the implementation of nano on my Fedora workstation has color-coding for the syntax, but it doesn’t automatically format the code.
Figure 1.3: The nano text editor on my Fedora workstation
Note that on other Linux distros, nano might not even have color-coding.
My favorite text-mode editor is vim, which has features that would make almost any programmer happy. Not only does it have color-coded syntax highlighting, but it also automatically formats your code with proper indentations, as you see here:
Figure 1.4: The vim text editor on my Fedora workstation
In reality, indentation isn’t needed for bash scripting, because bash scripts work fine without it. However, the indentation does make code easier for humans to read, and having an editor that will apply proper indentation automatically is quite handy. Additionally, vim comes with a powerful search-and-replace feature, allows you to split the screen so that you can work on two files at once, and can be customized with a fairly wide selection of plug-ins. Even though it’s a text-mode editor, you can use the right-click menu from your mouse to copy and paste text if you’re remotely logged in to your server from a desktop machine or if you’re editing a local file on your desktop machine.
The older vi text editor is normally installed on most Linux distros by default, but vim often isn’t. On some distros, the vim command will work, even if vim isn’t actually installed. That’s because the vim command on them might be pointing to either vim-minimal or even to the old vi. At any rate, to install full-fledged vim on any Red Hat-type of distro, such as RHEL, Fedora, AlmaLinux, or Rocky Linux, just do:
sudo dnf install vim-enhanced

To install vim on Debian or Ubuntu, do:
sudo apt install vim

As much as I like vim, I do have to tell you that some users are a bit put off from using it, because they believe that it’s too hard to learn. That’s because the original version of vi was created back in the Stone Age of Computing, before computer keyboards had cursor keys, backspace keys, or delete keys. The old vi commands that you used to have to use instead of these keys have been carried over to the modern implementations of vim.
So, most vim tutorials that you’ll find will still try to teach you all of those old keyboard commands.
Figure 1.5: This photo of me was taken during the Stone Age of Computing, before computer keyboards had cursor keys, backspace keys, or delete keys.
However, on the current versions of vim that you’ll install on Linux and modern Unix-like distros such as FreeBSD and OpenIndiana, the cursor keys, backspace key, and delete key all work as they do on any other text editor. So, it’s no longer necessary to learn all of those keyboard commands that you would have had to learn years ago. I mean, you’ll still need to learn a few basic keyboard commands, but not as many as you had to before.
If you’re using a desktop machine, you can still use either nano or vim if you desire. But, there’s also a wide range of GUI-type editors available if you’d rather use one of them. Some sort of no-frills text editor, such as gedit or leafpad, is probably already installed on your desktop system. Some slightly fancier programmer’s editors, such as geany, kwrite, and bluefish, are available in the normal repositories of most Linux distros and some Unix-like distros. Your best bet is to play around with different editors to see what you like. Here’s an example of kwrite with color-coded syntax highlighting enabled:
Figure 1.6: The Kwrite text editor.
If you’re a Windows user, you’ll never want to create or edit a shell script on your Windows machine with a Windows text editor such as Notepad or Wordpad, and then transfer the script to your Linux machine. That’s because Windows text editors insert an invisible carriage return character at the end of each line. You can’t see them, but your Linux shell can, and will refuse to run the script. Having said that, you might at times encounter scripts that someone else created with a Windows text editor, and you’ll need to know how to fix them so that they’ll run on your Linux or Unix machine. That’s easy to do, and we’ll look at that in Chapter 7, Text Stream Filters-Part 2.
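As a preview of that fix, here’s one common approach, sketched with hypothetical filenames: strip the carriage returns with the tr utility or with GNU sed.

```shell
# Simulate a script that was created in a Windows text editor, with an
# invisible carriage return (\r) at the end of each line.
# (The filenames here are just for illustration.)
printf 'echo hello\r\necho goodbye\r\n' > winscript.sh

# Option 1: delete every carriage return, writing the result to a new file
tr -d '\r' < winscript.sh > fixedscript.sh

# Option 2: with GNU sed, strip the trailing carriage returns in place
sed -i 's/\r$//' winscript.sh
```

After either command, the cleaned-up script will run normally under bash.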
That’s about it for our overview of text editors for Linux. Let’s move on and talk about compiled versus interpreted programming languages.
Compiled programming consists of writing program code in a text editor, and then using a compiler to convert the text file into an executable binary file. Once that’s done, users of the program won’t be able to easily view the source code of the program. With interpreted programming, the program runs directly from a text file, without having to compile it first.
Compiled programming languages, such as C, C++, or Fortran, are good for when you need maximum performance from your programs. However, they can be fairly hard to learn, especially when it comes to the lower-level functions such as working with files. Interpreted languages might not offer quite as high a level of performance, but they are generally quite flexible, and generally easier to learn. Interpreted languages in general also offer a higher degree of portability between different operating systems. Shell scripting falls into the category of interpreted languages.
Here are some reasons why you might consider using an interpreted language:
When you are looking for a simple solution.
When you need a solution that is portable. If you pay attention to portability concerns, you can write one script that will work on different Linux distros, as well as on Unix/Unix-like systems. That can come in handy if you’re working in a large corporation with a large network of mixed operating systems. (You might even find some larger corporations that are still running legacy Unix systems, such as AIX, HP-UX, or SunOS, alongside more modern implementations of Linux, BSD, or macOS.)

And, here are some reasons why you might consider using a compiled language:
When the tasks require intensive use of system resources. This is especially true when speed is extremely important.
When you are using math operations that require heavy number crunching.
When you need complex applications.
When your application has many sub-components with dependencies.
When you want to create proprietary applications, and prevent users from viewing the application source code.

When you think about it, pretty much every example of productivity, server, gaming, or scientific software falls into one or more of these categories, which means that they really should be built with compiled languages for best performance.
Okay, let’s now talk about sudo.
Some of the things you’ll do in this course will require you to have administrative privileges. While it’s possible and convenient to just log into the root command prompt, that’s something that I like to discourage as much as possible. For best security, and to get used to what you’d be doing in an enterprise setting, your best bet is to use sudo.
Modern Linux distros allow you to add yourself to an administrators’ group as you install the operating system. (That’s the wheel group on Red Hat-type systems, and the sudo group on Debian/Ubuntu-type systems.) To run a command that requires administrative privileges, just do something like this:
sudo nft list ruleset

You’ll then be asked to enter the password for your own user account, rather than the one for the root user account.
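If you’re not sure whether your account is in the right group, you can check with the groups or id commands. A quick sketch:

```shell
# Show the groups that your user account belongs to. On a Red Hat-type
# system, look for wheel; on a Debian/Ubuntu-type system, look for sudo.
groups

# The id command shows the same group memberships, along with the
# numeric user and group IDs
id
```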
That’s about all we need to say about this topic, so let’s summarize and move on to the next chapter.
In this chapter, I’ve laid a bit of the groundwork for what’s to come in the following chapters. We looked at what an operating system shell is, and why we would use one. Then, we looked at the various ways to find help, did a high-level overview of Linux text editors, and wrapped up with a discussion of compiled versus interpreted programming and a brief mention of why we want to use sudo to run administrative commands.
In the next chapter, we’ll begin looking at the various things that an operating system shell does for us. I’ll see you there.
Read this book alongside other users, Linux experts, and the author himself.
Ask questions, provide solutions to other readers, chat with the author via Ask Me Anything sessions, and much more. Scan the QR code or visit the link to join the community.
https://packt.link/SecNet
To fulfill its job as the interface between the user and the operating system kernel, a shell has to perform five different functions. These functions include interpreting commands, setting variables, enabling input/output redirection, enabling pipelines, and allowing customization of a user’s working environment. In this chapter, we’ll look at how bash and zsh interpret commands. As an added bonus, much of what we’ll cover in the next few chapters will also help you prepare for certain Linux certification exams, such as the Linux Professional Institute or CompTIA Linux+ exams.
Topics in this chapter include:
Understanding the structure of a command
Executing multiple commands at once
Running commands recursively
Understanding the command history
Escaping and quoting

To follow along, you can use pretty much any Linux distro that you desire, as long as it’s running with either bash or zsh. Your best bet is to use a virtual machine instead of your production workstation, in case you accidentally delete or change something that you shouldn’t.
A handy thing to know for both real-life and any certification exams that you may take, is the structure of a command. Commands can consist of up to three parts, and there’s a certain order for the parts. Here are the parts and the order in which you’ll normally place them:
The command itself
Command options
Command arguments

If you plan to take a Linux certification exam, you’ll definitely want to remember this ordering rule. Later on though, we’ll see that some commands don’t always follow this rule.
There are two general types of option switches:
Single-letter options: For most commands, a single-letter option is preceded by a single dash. Most of the time, two or more single-letter options can be combined with a single dash.
Whole-word options: For most commands, a whole-word option is preceded by two dashes. Two or more whole-word options must be listed separately, because they can’t be combined with a single pair of dashes.

To show you what we mean, check out this hands-on lab.
In this lab, we’ll be working with the humble ls utility. Options and arguments are optional for this utility, so we’ll get to see the different configurations for the command in this hands-on practice.
Let’s issue the naked ls command in order to see the files and directories that are in our current directory.

[donnie@fedora ~]$ ls
4-2_Building_an_Alpine_Container.bak   Public
4-2_Building_an_Alpine_Container.pptx  pwsafe.key
addresses.txt                          python_container
alma9_default.txt                      rad-bfgminer
alma9_future.txt                       ramfetch
alma_link.txt                          read.me.first
. . .
. . .
pCloudDrive                            yad-form.sh
Pictures
[donnie@fedora ~]$

Now, let’s add a single-letter option. We’ll use the -l option to show the files and directories with some of their characteristics.

[donnie@fedora ~]$ ls -l
total 40257473
-rw-r--r--. 1 donnie donnie 754207 Apr  5 16:13 4-2_Building_an_Alpine_Container.bak
-rw-r--r--. 1 donnie donnie 761796 Apr  8 14:49 4-2_Building_an_Alpine_Container.pptx
-rw-r--r--. 1 donnie donnie    137 Apr  2 15:05 addresses.txt
-rw-r--r--. 1 donnie donnie   1438 Nov  2  2022 alma9_default.txt
. . .
. . .
-rwxr--r--. 1 donnie donnie    263 May 16 15:42 yad-form.sh
[donnie@fedora ~]$

Use the ls command with the -a option to see any hidden files or directories. (Hidden files or directories have names that begin with a period.)

[donnie@fedora ~]$ ls -a
.                                      .pcloud
..                                     pCloudDrive
4-2_Building_an_Alpine_Container.bak   Pictures
4-2_Building_an_Alpine_Container.pptx  .pki
addresses.txt                          .podman-desktop
alma9_default.txt                      .profile
. . .
. . .
.mozilla                               .Xauthority
Music                                  .xscreensaver
NetRexx                                .xsession-errors
nikto                                  yad-form.sh
[donnie@fedora ~]$

Next, let’s combine the two options, so that we can see the characteristics of both the hidden and unhidden files and directories:

[donnie@fedora ~]$ ls -la
total 40257561
drwx------. 1 donnie donnie  2820 Jul 25 13:53 .
drwxr-xr-x. 1 root   root      12 Aug  9  2022 ..
-rw-r--r--. 1 donnie donnie   137 Apr  2 15:05 addresses.txt
-rw-------. 1 donnie donnie 15804 Jul 24 17:53 .bash_history
-rw-r--r--. 1 donnie donnie    18 Jan 19  2022 .bash_logout
-rw-r--r--. 1 donnie donnie   194 Apr  3 12:11 .bash_profile
-rw-r--r--. 1 donnie donnie   513 Apr  3 12:11 .bashrc
. . .
. . .
-rw-r--r--. 1 donnie donnie  9041 Feb  4 12:57 .xscreensaver
-rw-------. 1 donnie donnie     0 Jul 25 13:53 .xsession-errors
-rwxr--r--. 1 donnie donnie   263 May 16 15:42 yad-form.sh
[donnie@fedora ~]$

In the preceding examples, the donnie donnie part indicates that the files and directories belong to user donnie and are associated with the donnie group. In this next example, we’re using a whole-word option, --author, preceded by two dashes, to view some extra information. Let’s use this --author switch and the -l switch together to see who authored these files:
[donnie@fedora ~]$ ls -l --author
total 40257473
-rw-r--r--. 1 donnie donnie donnie  137 Apr  2 15:05 addresses.txt
-rw-r--r--. 1 donnie donnie donnie 1438 Nov  2  2022 alma9_default.txt
-rw-r--r--. 1 donnie donnie donnie 1297 Nov  2  2022 alma9_future.txt
. . .
. . .
-rwxr--r--. 1 donnie donnie donnie  263 May 16 15:42 yad-form.sh
[donnie@fedora ~]$

So, it appears that that Donnie character also created the files in the first place. (Oh, that’s me, isn’t it?)
An argument is an object upon which a command will operate. For the ls command, an argument would be the name of a file or directory. For example, let’s say that we want to see the details of just a certain file. We can do something like this:
[donnie@fedora ~]$ ls -l yad-form.sh
-rwxr--r--. 1 donnie donnie 263 May 16 15:42 yad-form.sh
[donnie@fedora ~]$

We can use the * wildcard to see details of all files of a certain type, like so:
[donnie@fedora ~]$ ls -l *.sh
-rwxr--r--. 1 donnie donnie 116 May 16 15:04 root.sh
-rwxr--r--. 1 donnie donnie 263 May 16 15:42 yad-form.sh
[donnie@fedora ~]$

If you’re not familiar with the concept of wildcards, think of them as a way to perform pattern-matching. In the above example, the * wildcard is used to match zero or more characters. For this reason, the ls -l *.sh command allows us to see all files with the .sh filename extension. You can also use this wildcard in other ways. For example, to see all filenames and directory names that begin with the letter w, just do this:
donnie@opensuse:~> ls -ld w*
drwxrwxr-x 1 donnie users    22 Mar  5  2022 windows
-rw-r--r-- 1 donnie users 82180 Dec  7  2019 wingding.ttf
drwxr-xr-x 1 donnie users   138 Mar 11  2023 wownero-x86_64-linux-gnu-v0.11
donnie@opensuse:~>

For more information about wildcards, check out the reference in the Further Reading section.
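Besides *, the shell also supports the ? wildcard, which matches exactly one character, and square brackets, which match any single character from a set. Here’s a quick sketch, using hypothetical filenames:

```shell
# Create a few empty scratch files to experiment with
touch file1.txt file2.txt file10.txt

# ? matches exactly one character, so this matches file1.txt and
# file2.txt, but not file10.txt
ls file?.txt

# [12] matches a single character that is either 1 or 2
ls file[12].txt
```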
You’re not always limited to specifying just one argument. In this example, we’re looking at three different files:
[donnie@fedora ~]$ ls -l missing_stuff.txt yad-form.sh Dylan-My_Back_Pages-tab.odt
-rw-r--r--. 1 donnie donnie 29502 Mar  7 18:30 Dylan-My_Back_Pages-tab.odt
-rw-r--r--. 1 donnie donnie   394 Dec  7  2022 missing_stuff.txt
-rwxr--r--. 1 donnie donnie   263 May 16 15:42 yad-form.sh
[donnie@fedora ~]$

Use the -ld option to view the characteristics of a directory without viewing the contents of the directory, like so:
[donnie@fedora ~]$ ls -ld Downloads/
drwxr-xr-x. 1 donnie donnie 8100 Aug  4 12:37 Downloads/
[donnie@fedora ~]$

Although you can actually change the order in which options and arguments appear in many commands, it’s bad practice to do so. To avoid confusion and to prepare yourself for any Linux certification exams that you might take, just follow the ordering rule that I’ve presented here. That is, the command itself, then the command options, and lastly, the command arguments.
That about does it for the command structure part. Let’s move on to see how to execute multiple commands at once.
From either the command-line or from within shell scripts, it’s handy to know how to combine multiple commands into one single command. In this section, I’ll demonstrate three ways to do that, which are:
Running commands interactively
Using command sequences
Using the find utility

This is a form of shell-script programming, except that you’re just executing all commands from the command-line, instead of actually writing, saving, and executing a script. Here, you are creating a for loop – with each command of the loop on its own separate line – to perform a directory listing three times.
[donnie@fedora ~]$ for var in arg1 arg2 arg3
> do
> echo $var
> ls
> done
. . .
. . .
[donnie@fedora ~]$

At the end of each line, you’ll hit the Enter key. But, nothing will happen until you type the done command on the final line. The for loop will then run three times, once for each of the three listed arguments. Each time that it runs, the value of an argument gets assigned to the var variable, and the echo command prints the currently-assigned value. The output will look something like this:
arg1
4-2_Building_an_Alpine_Container.bak   Public
4-2_Building_an_Alpine_Container.pptx  pwsafe.key
arg2
4-2_Building_an_Alpine_Container.bak   Public
4-2_Building_an_Alpine_Container.pptx  pwsafe.key
arg3
4-2_Building_an_Alpine_Container.bak   Public
4-2_Building_an_Alpine_Container.pptx  pwsafe.key

Next, hit the up arrow key on your keyboard, and you’ll see the for loop that you just executed. If you try this with bash, you’ll see that the individual commands are separated by semi-colons, like so:
[donnie@fedora ~]$ for var in arg1 arg2 arg3; do echo $var; ls; doneOn zsh, pressing the up arrow key will cause the command components to appear on their own separate lines, as you see here:
donnie@opensuse:~> for var in arg1 arg2 arg3
do
echo $var
ls
done

Either way, the for loop will run again when you hit the Enter key.
If you’re still a bit unclear about how for loops work, have no fear. We’ll look at them in greater detail once we start actually creating shell scripts.
Command sequences are another type of programming structure that you’ll find very useful. Here, I’m demonstrating how to use them from the command-line so that you can grasp the basic concepts. In the upcoming chapters, I’ll show you examples of how to use them in shell scripts.
You can also use the semi-colon to separate stand-alone commands that you want to execute from the same command entry. If you wanted to cd to a certain directory and then look at its contents, you could enter each command on its own line. Or, you could enter them both on the same line. This process is called command chaining, which looks like this:
[donnie@fedora ~]$ cd /var ; ls
account  cache  db     ftp    kerberos  local  log   nis  preserve  spool  yp
adm      crash  empty  games  lib       lock   mail  opt  run       tmp
[donnie@fedora var]$

[donnie@fedora ~]$ cd /far ; ls
bash: cd: /far: No such file or directory
4-2_Building_an_Alpine_Container.bak   Public
4-2_Building_an_Alpine_Container.pptx  pwsafe.key
addresses.txt                          python_container
alma9_default.txt                      rad-bfgminer
. . .
. . .
[donnie@fedora ~]$

In the second command entry, the first command failed because I tried to cd into a non-existent directory. But, the second command still executed, which listed the files in my home directory.
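Before moving on, it’s worth knowing how the shell decides whether a command ran successfully: every command finishes with an exit status, which is 0 for success and non-zero for failure, and which you can inspect through the $? variable. Here’s a quick sketch:

```shell
# A successful command sets the exit status to 0 . . .
cd /var
echo $?     # prints 0

# . . . while a failed command sets it to something non-zero
cd /far 2>/dev/null
echo $?     # prints a non-zero value (1 on bash)
```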
You can also instruct bash or zsh to only execute the second command if the first command successfully completes. Just separate the commands with && instead of with a semi-colon, like this:
[donnie@fedora ~]$ cd /var && ls
account  cache  db     ftp    kerberos  local  log   nis  preserve  spool  yp
adm      crash  empty  games  lib       lock   mail  opt  run       tmp
[donnie@fedora var]$

What if the first command doesn’t run successfully? Note here that the second command doesn’t execute:
[donnie@fedora ~]$ cd /far && ls
bash: cd: /far: No such file or directory
[donnie@fedora ~]$

If you want bash or zsh to execute the second command only if the first command doesn’t run successfully, just separate the commands with ||. (This is a pair of pipe characters, which you’ll find on the same key as the backslash.) To illustrate, let’s again make a slight typo while trying to change directories.
[donnie@fedora ~]$ ce /var || echo "This command didn't work."
bash: ce: command not found
This command didn't work.
[donnie@fedora ~]$

[donnie@fedora ~]$ cd /var || echo "This command didn't work."
[donnie@fedora var]$

For a more practical example, try changing to a directory, creating it if it doesn’t exist, and then changing to it after it’s been successfully created.
[donnie@fedora ~]$ cd mydirectory || mkdir mydirectory && cd mydirectory
bash: cd: mydirectory: No such file or directory
[donnie@fedora mydirectory]$

You’ll still get an error message saying that the directory you tried to access doesn’t exist. But, look at the command prompt, and you’ll see that the directory has been created, and that you’re now in it.
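One caution here: bash evaluates && and || from left to right with equal precedence, so cmd1 || cmd2 && cmd3 really means (cmd1 || cmd2) && cmd3. If you want to make your intent explicit, you can group the mkdir and the second cd with curly braces, so that they only run when the first cd fails. Here’s a sketch, with a hypothetical directory name:

```shell
# The braces group the two commands, so they run as a unit only if the
# first cd fails. Note the required spaces around the braces and the
# semi-colon before the closing brace.
cd mydirectory 2>/dev/null || { mkdir mydirectory && cd mydirectory; }
pwd
```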
We’ll now take a short intermission from our discussion of running multiple commands in order to introduce the find utility, which is truly the Cool-Mac Daddy of all search utilities. After this introduction, I’ll use find to show you more ways to run multiple commands at once.
Also, it would behoove us to mention that find isn’t just good for command-line searches. It’s also excellent for use within shell scripts, as you’ll see much later.
If you’re as old as I am, you might remember the Windows XP search pooch, which pranced around on your screen every time you did a file search from the Windows XP graphical search utility. It was cute, but it didn’t add to your search power. With the Linux find utility, you can perform powerful searches on just about any criterion you can think of, and then--from the same command-line entry--invoke another utility to do whatever you need to do with the search results. I won’t try to discuss every option there is for find, since there are so many. Rather, I’ll give you an overview of what you can do with find, and let you read its man page for the rest. (Just enter man find at the command-line to read about all of its options.)
In order to perform the most basic of searches, you’ll need to specify two things:
The search path: You can perform a search in either a specific path, or the entire filesystem. Since find is inherently recursive, the search will automatically extend to all of the subdirectories that are beneath the directory that you specify. (Of course, you can also add command switches that limit the depth of the search.)
What you’re searching for: There are a lot of ways that you can specify this. You can search for files of a specific name, and decide whether to make the search case-sensitive. You can also use wildcards, or search for files with certain characteristics or that are of a certain age. Or, you can combine multiple criteria for even more specific searches. The main thing that limits you is your own imagination.

So now, let’s say that you want to search the entire filesystem for all files whose names end in .conf. You’ll want to use either the -name or the -iname switch in front of the file description that you want to search for. Otherwise, you’ll get a jumbled-up mess of every directory listing that you’ve searched, with the information you’re looking for mixed in. For case-sensitive searches, use -name, and for case-insensitive searches, use -iname. In this case, we’ll use -iname, since we want to make the search case-insensitive.
I know, I’ve told you previously that most whole-word option switches are preceded by a pair of dashes. The find utility is an exception to the rule, because its whole-word option switches are preceded by only a single dash.
Also, be aware that searching through an entire filesystem on a production server with very large drives can take a long time. It’s sometimes necessary to do that, but it’s best to confine your searches to specific directories whenever possible.
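As a sketch of that advice in action, you can point find at a specific directory and use the -maxdepth switch (a GNU find option) to keep it from descending too far into the subdirectories:

```shell
# Search only /etc itself, without descending into its subdirectories,
# for files whose names end in .conf (case-insensitive)
find /etc -maxdepth 1 -iname '*.conf'

# Search /etc and its subdirectories, but only two levels deep
find /etc -maxdepth 2 -iname '*.conf'
```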
If you include a wildcard character with a search criterion, you’ll need to enclose that search criterion in quotes. That will keep the shell from interpreting the wildcard character as an ambiguous file reference. For example, to perform a case-insensitive search through the current working directory and all of its subdirectories for all files with names ending with a .conf filename extension, I would do this:
[donnie@fedora ~]$ find -iname '*.conf'
./.cache/containers/short-name-aliases.conf
./.config/lxsession/LXDE/desktop.conf
./.config/pcmanfm/LXDE/desktop-items-0.conf
./.config/pcmanfm/LXDE/pcmanfm.conf
./.config/lxterminal/lxterminal.conf
./.config/Trolltech.conf
. . .
. . .
./tor-browser/Browser/TorBrowser/Data/fontconfig/fonts.conf
./rad-bfgminer/example.conf
./rad-bfgminer/knc-asic/RPi_system/raspi-blacklist.conf
./something.CONF
[donnie@fedora ~]$

By using the -iname option, I was able to find files with names that ended in either .conf or .CONF. If I had used the -name option instead, I would only have found files with names that end in .conf.
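You can see the difference between -name and -iname for yourself with a couple of scratch files; the filenames here are hypothetical:

```shell
# Work in a scratch directory with two files that differ only in the
# case of their filename extensions
cd "$(mktemp -d)"
touch test1.conf test2.CONF

# -name is case-sensitive, so this finds only test1.conf
find . -name '*.conf'

# -iname is case-insensitive, so this finds both files
find . -iname '*.conf'
```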
Normally, you would specify the search path as the first component of the find command. In the GNU implementation of find that’s included on Linux-based operating systems, omitting the search path will cause find
