

Part II Shell Scripting Basics


 


**************************************
Chapter 11 Basic Script Building
**************************************



Section : Using Multiple Commands
===================================



If you want to run two commands together, you can enter them on the same prompt line, separated with a semicolon:
$ date ; who


Section : Creating a Script File
===================================

When creating a shell script file, you must specify the shell you are using in the first line of the file.
Here’s the format for this:


#!/bin/bash


In a normal shell script, the pound sign (#) marks a comment line.
However, the first line of a shell script file is
a special case, and the pound sign followed by the exclamation point tells the shell what shell to run the script under.


After indicating the shell, commands are entered onto each line of the file, followed by a carriage return. 
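For example, a minimal script (call it test1, matching the commands shown below) could simply run two commands in sequence, as a sketch:

#!/bin/bash
# This script displays the date and who's logged on
date
who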




$ ./test1
$ ls -l test1
$ chmod u+x test1



Section : Displaying Messages
=================================



$ echo This is a test
$ echo "This is a test to see if you're paying attention"
$ echo 'Rich says "scripting is easy".'


The echo command uses either double or single quotes to delineate text strings.
If you use them within your string, you need to use one type of quote within the text and the other type to delineate the string.




The -n option lets echo display a text string on the same line as the output of the command that follows it:

echo -n "The time and date are: "
date




Section  :  Using Variables
==============================



Environment variables:
----------------------


echo "User info for userid: $USER"
echo UID: $UID

You reference an environment variable by preceding its name with a dollar sign.




$ echo "The cost of the item is \$15"
To display an actual dollar sign, you must precede it with a backslash character.

User variables
-----------------


1. Can be any text string of up to 20 letters, digits, or an underscore character.
2. Case sensitive.
3. No spaces can appear between the variable, the equal sign, and the value.
4. User variables can be referenced using the dollar sign.
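A short sketch putting these rules together (the variable names are just illustrations):

#!/bin/bash
# assigning and referencing user variables
days=10
guest="Katie"
echo "$guest checked in $days days ago"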


Command substitution
---------------------
Command substitution allows you to assign the output of a shell command to a variable.


There are two ways to assign the output of a command to a variable:


■ The backtick character (`)
testing=`date`




■ The $() format
testing=$(date)




#!/bin/bash
# copy the /usr/bin directory listing to a log file 
today=$(date +%y%m%d)
ls /usr/bin -al > log.$today


The +%y%m%d format instructs the date command to display the date as a two-digit year, month, and day:
$ date +%y%m%d 
140131
$



Command substitution creates what’s called a subshell to run the enclosed command.
Because of that, any variables you create in the script aren’t available to the subshell command.

Subshells are also created if you run a command from the command prompt using the ./ path,
but they aren’t cre- ated if you just run the command without a path.




Section : Redirecting Input and Output
==========================================



Output redirection
--------------------
sending output from a command to a file:
command > outputfile


$ date > test6
$ date >> test6





Input redirection
-------------------
takes the content of a file and redirects it to a command:
  command < inputfile


$ wc < test6


The wc command provides a count of text in the data. By default, it produces three values:
■ The number of lines in the text
■ The number of words in the text
■ The number of bytes in the text




Inline input redirection:
  Another method of input redirection, this one allows you to specify the data for input redirection on the command line instead of in a file.
  1. You must specify a text marker that delineates the beginning and end of the data.
  2. You can use any string value for the text marker.
 
command << marker 
data
marker


$ wc << EOF
> test string 1
> test string 2
> test string 3 
> EOF
3 9 42 
$








Section : Pipes
===================

$ rpm -qa > rpm.list
$ sort < rpm.list


That was useful, but instead of redirecting the output of a command to a file,
you can redirect the output to another command. This process is called piping.




command1 | command2
The pipe is put between the commands to redirect the output from one to the other.
The Linux system actually runs both commands at the same time, linking them together internally in the system.
As the first command produces output, it's sent immediately to the second command.
No intermediate files or buffer areas are used to transfer the data.


$ rpm -qa | sort
$ rpm -qa | sort | more

you can use one of the text paging commands (such as less or more) to force the output to stop at every screen of data.


$ rpm -qa | sort > rpm.list
$ more rpm.list





Section : Performing Math
============================

There are two different ways to perform mathematical operations in your shell scripts.


The expr command:
------------------
The expr command recognizes a few different mathematical and string operators


$ expr 5 * 2
expr: syntax error 

$


$ expr 5 \* 2 
10

$
To solve this problem, you need to use the shell escape character (the backslash) to identify any characters that may be misinterpreted by the shell before being passed to the expr command.




Using brackets:
----------------
Using brackets makes shell math much easier than with the expr command.


In bash, when assigning a mathematical value to a variable,
you can enclose the mathematical equation
using a dollar sign and square brackets ($[ operation ]).


#!/bin/bash
var1=100
var2=50
var3=45
var4=$[$var1 * ($var2 - $var3)] 
echo The final result is $var4



A  floating-point solution:
------------------------------
uses the built-in bash calculator, called bc.


You can access the bash calculator from the shell prompt using the bc command:
$ bc


To exit the bash calculator, you must enter quit.
The floating-point arithmetic is controlled by a built-in variable called scale (its default value is 0).


$ bc -q
var1=10
var1 * 4
40
var2 = var1 / 5
print var2
2
quit
$



Yes, you can use the command substitution character to run a bc command and assign the output to a variable!
The basic format to use is this:

variable=$(echo "options; expression" | bc)

The options portion allows you to set variables. If you need to set more than one variable, separate them with semicolons.
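For example, a sketch that sets the scale variable and evaluates a single expression (the numbers are arbitrary):

var1=$(echo "scale=4; 3.44 / 5" | bc)
echo The answer is $var1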




The bc command recognizes input redirection, allowing you to redirect a file to the bc command for processing.


The best method is to use inline input redirection, which allows you to redirect data directly from the command line. 
In the shell script, you assign the output to a variable:


variable=$(bc << EOF 
options
statements 
expressions
EOF
)



The EOF text string indicates the beginning and end of the inline redirection data.




$ cat test12
#!/bin/bash
var1=10.46
var2=43.67
var3=33.2
var4=71
var5=$(bc << EOF 
scale = 4
a1 = ( $var1 * $var2) 
b1 = ($var3 * $var4) 
a1 + b1
EOF
)
echo The final answer for this mess is $var5 
$





Section : Exiting the Script
================================

Every command that runs in the shell uses an exit status to indicate to the shell that it’s finished processing.


Checking the exit status:
-------------------------
Linux provides the $? special variable that holds the exit status value from the last command that executed.
You must view or use the $? variable immediately after the command you want to check.


$ echo $?


The exit command:
------------------
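The exit command lets a script end with a specific exit status code (an integer from 0 to 255) instead of the exit status of the last command. A minimal sketch (the values are arbitrary):

#!/bin/bash
# returning a custom exit status
var1=10
var2=30
var3=$[ $var1 + $var2 ]
echo The answer is $var3
exit 5

After running the script, echo $? displays 5.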


 
**************************************

Chapter  12 Using Structured Commands

**************************************


Section :  Working with the if-then Statement

===================================
if command
then
   commands
fi

if command ; then
   commands
fi
 

The bash shell if statement runs the command defined on the if line.
If the exit status of the command (see Chapter 11) is zero (the command completed successfully),
the commands listed under the then section are executed.
If the exit status of the command is anything else,
the then commands aren't executed,
and the bash shell moves on to the next command in the script.
 




$ cat test3.sh
#!/bin/bash
# testing multiple commands in the then section #
testuser=Christine
#
if grep $testuser /etc/passwd
then
echo "This is my first command"
echo "This is my second command"
echo "I can even put in other commands besides echo:" 
ls -a /home/$testuser/.b*
fi 
$



Section : Exploring the if-then-else Statement

=================================================
if command ; then 
commands #When the command in the if statement line returns with a zero exit status code
else
commands
fi




Section : Nesting ifs

========================
if command1 ; then
commands 
elif command2 ; then
more commands 
fi


Section: Trying the test Command

==================================
test condition

The test command provides a way to test different conditions in an if-then statement. 
If the condition listed in the test command evaluates to TRUE, the test command exits with a zero exit status code.
If the condition is FALSE, the test command exits with a non-zero exit status code.




The bash shell provides an alternative way of testing a condition without declaring the test command in an if-then statement:
The square brackets define the test condition.
 Be careful; you must have a space after the first bracket and a space before the last bracket, or you’ll get an error message.


if [space condition space]
 then
commands
fi



The test command and test conditions can evaluate three classes of conditions:
■ Numeric comparisons
if [ $value1 -gt 5 ]
......

■ String comparisons
The test String Comparisons:
str1 = str2    Checks if str1 is the same as str2
str1 != str2   Checks if str1 is not the same as str2
str1 < str2    Checks if str1 is less than str2
str1 > str2    Checks if str1 is greater than str2
-n str1        Checks if str1 has a length greater than zero
-z str1        Checks if str1 has a length of zero


if [ $USER = $testuser ]
.....

if [ $val1 \> $val2 ]
.....

Test comparisons use standard ASCII ordering, using each character’s ASCII numeric value to determine the sort order. 
The sort command uses the sorting order defined for the system locale language settings. 
For the English language, the locale settings specify that lowercase letters appear before uppercase letters in sorted order.



The -n and -z comparisons are handy when trying to evaluate whether a variable contains data:


if [ -n $val1 ]
......






■ File comparisons
The test File Comparisons:
 
-d file          Checks if file exists and is a directory
-e file          Checks if file exists
-f file          Checks if file exists and is a file
-r file          Checks if file exists and is readable
-s file          Checks if file exists and is not empty
-w file          Checks if file exists and is writable
-x file          Checks if file exists and is executable
-O file          Checks if file exists and is owned by the current user
-G file          Checks if file exists and the default group is the same as the current user
file1 -nt file2  Checks if file1 is newer than file2
file1 -ot file2  Checks if file1 is older than file2

if [ -d $jump_directory ]
......


Section : Considering Compound Testing

========================================
■ [ condition1 ] && [ condition2 ] 


if [ -d $HOME ] && [ -w $HOME/testing ]
......




■ [ condition1 ] || [ condition2 ]
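For example, paralleling the && test above, a sketch where only one of the conditions needs to succeed:

if [ -d $HOME ] || [ -w $HOME/testing ]
then
   echo "At least one of the conditions was met"
fi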




Section: Working with Advanced if-then Features

================================================


Using double parentheses:
-------------------------
The double parentheses command allows you to incorporate advanced mathematical formulas in your comparisons.

(( expression ))


The test command allows for only simple arithmetic operations;
the double parentheses command provides more mathematical symbols.
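A sketch using the exponentiation operator (**), one of the symbols that (( )) supports but test does not (the values are arbitrary):

#!/bin/bash
val1=10
if (( $val1 ** 2 > 90 ))
then
   (( val2 = $val1 ** 2 ))
   echo "The square of $val1 is $val2"
fi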




Using double brackets:
------------------------
The double bracket command provides advanced features for string comparisons:
[[ expression ]]

if [[ $USER == r* ]] #define a regular expression
......


Section : Considering the case Command

=========================================
The case command checks multiple values of a single variable in a list-oriented format:


case variable in
pattern1 | pattern2) commands1;;
pattern3) commands2;;
*) default commands;;
esac


case $USER in
rich | barbara)
echo "Welcome, $USER"
echo "Please enjoy your visit";; 

testing)
echo "Special testing account";;

jessica)
echo "Do not forget to log off when you're done";; 

*)
echo "Sorry, you are not allowed here";; 


esac






**************************************

Chapter  13 More Structured Commands

**************************************

Section :   The for Command

===================================
Here's the basic format of the bash shell for command.
The first iteration uses the first item in the list, the next iteration the second item, and so on until all the items in the list have been used.


for var in list
do
   commands
done

 
Reading values in a list
------------------------
for test in Alabama Alaska Arizona Arkansas California Colorado 
do
echo The next state is $test
done 



Reading complex values in a list
---------------------------------
■ Use the escape character (the backslash) to escape the single quotation mark.
■ Use double quotation marks to define the values that use single quotation marks.


for test in I don\'t know if "this'll" work 
do
echo "word:$test" 
done




for test in Nevada "New Hampshire" "New Mexico" "New York" 
do
echo "Now going to $test" 
done


Reading a list from a variable
--------------------------------
#!/bin/bash
# using a variable to hold the list
list="Alabama Alaska Arizona Arkansas Colorado"
list=$list" Connecticut"


for state in $list 
do
echo "Have you ever visited $state?" 
done




Reading values from a command
------------------------------
Notice that the states file includes each state on a separate line.
The for command iterates through the output of the cat command,
but because the default field separators include the space character, the for command still takes each word as a separate value.




#!/bin/bash
# reading values from a file
file="states"
for state in $(cat $file) 
do
echo "Visit beautiful $state" 
done




Changing the  field separator
-----------------------------
the special environment variable IFS, called the internal field separator. 
The IFS environment variable defines a list of characters the bash shell uses as field separators.
By default, the bash shell considers the following characters as field separators:
■ A space
■ A tab
■ A newline


To solve this problem, you can temporarily change the IFS environment variable values in your shell script to restrict the characters the bash shell recognizes as field separators.


IFS=$'\n'


This technique can be coded like this:
IFS_OLD=$IFS
IFS=$'\n'
#<use the new IFS value in code>
IFS=$IFS_OLD


Suppose you want to iterate through values in a file that are separated by a colon (such as in the /etc/passwd file):
IFS=:


If you want to specify more than one IFS character:
This assignment uses the newline, colon, semicolon, and double quotation mark characters as field separators. 
IFS=$'\n:;"'




Reading a directory using wildcards
---------------------------------------
it's perfectly legal to have directory and filenames that contain spaces.
To accommodate that, you should enclose the $file variable in double quotation marks.


#!/bin/bash
# iterate through all the files in a directory
for file in ~/* 
#for file in /home/rich/.b* /home/rich/badtest
do
if [ -d "$file" ] 
then
echo "$file is dir"
elif [ -f "$file" ] 
then
echo "$file is file"
fi 
done



Section : The C-Style for Command

====================================
This section shows you how to use a C-style for command in a bash shell script.


Here’s the basic format of the C-style bash for loop:


for (( variable assignment ; condition ; iteration process ))

#!/bin/bash
# testing the C-style for loop
for (( i=1; i <= 10; i++ )) 
do
echo "The next number is $i" 
done




#!/bin/bash
# multiple variables
for (( a=1, b=10; a <= 10; a++, b-- )) 
do
echo "$a - $b"
done






Section : The while Command

================================


Basic while format:
----------------------
Here's the format of the while command:


while test command 
do
      other commands
done


var1=10
while [ $var1 -gt 0 ]
do
echo $var1
var1=$[ $var1 - 1 ]
done



Using multiple test commands:
-------------------------------
The while command allows you to define multiple test commands on the while statement line.
Only the exit status of the last test command is used to determine when the loop stops.


#!/bin/bash
# testing a multicommand while loop
var1=10
while echo $var1
      [ $var1 -ge 0 ]
do
   echo "This is inside the loop"
   var1=$[ $var1 - 1 ]
done



Section : The until Command

===============================
As long as the exit status of the test command is non-zero, the bash shell executes the commands listed in the loop. 
When the test command returns a zero exit status, the loop stops.
As you would expect, the format of the until command is:


you can have more than one test command in the until command statement. 
Only the exit status of the last command determines if the bash shell executes the other commands defined.


until test commands 
do
        other commands
done


#!/bin/bash
# using the until command
var1=100
until [ $var1 -eq 0 ] 
do
echo $var1
var1=$[ $var1 - 25 ] 
done







Section : Nesting Loops

==========================
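A minimal sketch of one for loop nested inside another (the loop limits are arbitrary):

#!/bin/bash
# nesting one loop inside another
for (( a = 1; a <= 3; a++ ))
do
   echo "Starting loop $a:"
   for (( b = 1; b <= 3; b++ ))
   do
      echo "   Inside loop: $b"
   done
done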


Section : Looping on File Data

===============================
This requires combining two of the techniques covered:
■ Using nested loops
■ Changing the IFS environment variable
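For example, a sketch that reads /etc/passwd one line at a time and then iterates through the colon-separated fields of each line:

#!/bin/bash
# looping through the fields of each line in /etc/passwd
IFS_OLD=$IFS
IFS=$'\n'
for entry in $(cat /etc/passwd)
do
   echo "Values in $entry:"
   IFS=:
   for value in $entry
   do
      echo "   $value"
   done
   IFS=$'\n'
done
IFS=$IFS_OLD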




Section : Controlling the Loop

==================================
■ The break command
■ The continue command




Breaking out of an inner loop
------------------------------
the break command automatically terminates the innermost loop you’re in
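A short sketch: the loop stops as soon as the counter reaches 5.

#!/bin/bash
# breaking out of a for loop
for var1 in 1 2 3 4 5 6 7 8 9 10
do
   if [ $var1 -eq 5 ]
   then
      break
   fi
   echo "Iteration number: $var1"
done
echo "The for loop is completed"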






Breaking out of an outer loop
-------------------------------

break n
where n indicates the level of the loop to break out of. 
By default, n is 1, indicating to break out of the current loop.
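A sketch of break 2 ending the outer loop from inside the inner loop (the loop limits are arbitrary):

#!/bin/bash
# breaking out of an outer loop
for (( a = 1; a < 4; a++ ))
do
   echo "Outer loop: $a"
   for (( b = 1; b < 100; b++ ))
   do
      if [ $b -gt 4 ]
      then
         break 2
      fi
      echo "   Inner loop: $b"
   done
done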




continue n
where n defines the loop level to continue




Section : Processing the Output of a Loop

===============================================
you can either pipe or redirect the output of a loop within your shell script.
 You do this by adding the processing command to the end of the done command:


for file in /home/rich/* 
do
...
done > output.txt





for state in "North Dakota" Connecticut Illinois Alabama Tennessee 
do
echo "$state is the next place to go" 
done | sort





Section : example

====================


Creating multiple user accounts
---------------------------------


#!/bin/bash
# process new user accounts
input="users.csv"
while IFS=',' read -r userid name 
do
echo "adding $userid"
useradd -c "$name" -m $userid 
done < "$input"
$


**************************************

Chapter  14 Handling User Input

**************************************


Section :    Passing Parameters

===================================
Reading parameters:
--------------------


The bash shell assigns special variables, called positional parameters, to all of the command line parameters entered.     
The positional parameter variables are standard numbers, 
with $0 being the script’s name,
 $1 being the first parameter, 
 $2 being the second parameter, and so on, up to $9 for the ninth parameter.


$ cat test2.sh
#!/bin/bash
# testing two command line parameters #
total=$[ $1 * $2 ]
echo The first parameter is $1.
echo The second parameter is $2.
echo The total value is $total.
$




 After the ninth variable, you must use braces around the variable number, such as ${10}.


$ cat test4.sh
#!/bin/bash
# handling lots of parameters
#
total=$[ ${10} * ${11} ]




Reading the script name:
--------------------------


The basename command returns just the script’s name without the path:


# Using basename with the $0 parameter #
name=$(basename $0)
echo
echo The script name is: $name
#



#!/bin/bash
# Testing a Multi-function script #
name=$(basename $0)
#
if [ $name = "addem" ]
then
total=$[ $1 + $2 ] #
elif [ $name = "multem" ] 
then
total=$[ $1 * $2 ]
fi
#




$ cp test6.sh addem
$ chmod u+x addem
$ ln -s test6.sh multem
$ ls -l *em




Testing parameters:
--------------------
$ cat test7.sh
#!/bin/bash
# testing parameters before use #
if [ -n "$1" ]
then
echo Hello $1, glad to meet you. 
else
echo "Sorry, you did not identify yourself. " 
fi
$


Section : Using Special Parameter Variables

=============================================


Counting parameters
--------------------
The special $# variable contains the number of command line parameters included when the script was run
#!/bin/bash
# getting the number of parameters
#
echo There were $# parameters supplied. 
$


#!/bin/bash
# Testing parameters #
if [ $# -ne 2 ]
.....



$ cat test10.sh
#!/bin/bash
# Grabbing the last parameter
#
params=$#
echo


echo The last parameter is ${!#}   # note: ${!#}, not ${$#}
echo
#
     






Grabbing all the data
-----------------------
The $* variable takes all the parameters supplied on the command line as a single word.
the $* variable treats them all as one parameter.

The $@ variable, on the other hand, takes all the parameters supplied on the command line as separate words in the same string.
It allows you to iterate through the values, separating out each parameter supplied.
  This is most often accomplished using the for command.
  
#!/bin/bash
# testing $* and $@ #
echo
count=1
#
for param in "$*" 
do
echo "\$* Parameter #$count = $param"
count=$[ $count + 1 ] 
done
#
echo
count=1
#
for param in "$@" 
do
echo "\$@ Parameter #$count = $param"
count=$[ $count + 1 ] 
done





Section :  Being Shifty

=========================
When you use the shift command, it moves each parameter variable one position to the left by default.


#!/bin/bash
# demonstrating  the shift command 
echo
count=1
while [ -n "$1" ]
 do
   echo "Parameter #$count = $1" 
   count=$[ $count + 1 ]
shift
done




$ cat test14.sh
#!/bin/bash
# demonstrating a multi-position shift
#
echo
echo "The original parameters: $*"
shift 2
echo "Here's the new first parameter: $1" 
$
$ ./test14.sh 1 2 3 4 5
The original parameters: 1 2 3 4 5 
Here's the new first parameter: 3 
$


$




 Section : Working with Options

 ================================
 
 Finding your options:
 ----------------------
Processing simple options


#!/bin/bash
# extracting command line options as parameters #
echo
while [ -n "$1" ]
do
case "$1" in
-a) echo "Found the -a option" ;;
-b) echo "Found the -b option" ;;
-c) echo "Found the -c option" ;;
*) echo "$1 is not an option" ;;
 esac
shift 
done


$
$ ./test15.sh -a -b -c -d


Using the getopt command
----------------------------


getopt optstring parameters


$ getopt ab:cd -a -b test1 -cd test2 test3
 -a -b test1 -c -d -- test2 test3
$




Using getopt in your scripts
----------------------------
One of the set command options is the double dash (--).
The double dash instructs set to replace the command line parameter variables with the values on the set command's command line.


set -- $(getopt -q ab:cd "$@")


#!/bin/bash
# Extract command line options & values with getopt #
set -- $(getopt -q ab:cd "$@")
......
#


Advancing to getopts
---------------------


getopts optstring variable


To suppress error messages, start the optstring with a colon.
The OPTARG environment variable contains the value to be used if an option requires a parameter value. 
The OPTIND environment variable contains the value of the current location within the parameter list where getopts left off.
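A sketch of a typical getopts loop (the option letters mirror the earlier getopt examples; the leading colon in the optstring suppresses error messages):

#!/bin/bash
# processing options with getopts
while getopts :ab:c opt
do
   case "$opt" in
      a) echo "Found the -a option" ;;
      b) echo "Found the -b option with parameter value $OPTARG" ;;
      c) echo "Found the -c option" ;;
      *) echo "Unknown option: $opt" ;;
   esac
done

After the loop, shift $[ $OPTIND - 1 ] moves past the processed options so any remaining parameters start at $1.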




Section : Standardizing Options
==================================




Section : Getting User Input

==================================


Reading basics
---------------
the read command places the data into a variable.


The read command assigns all data entered at the prompt to a single variable,
or you can specify multiple variables.
You can also specify no variables on the read command line.
If you do that, the read command places any data it receives in the special environment variable REPLY:


#!/bin/bash
# testing the read command
#
echo -n "Enter your name: "
read name
echo "Hello $name, welcome to my program. " 
#


#!/bin/bash
# testing the read -p option
#
read -p "Please enter your age: " age 
days=$[ $age * 365 ]
echo "That makes you over $days days old! " 
#


Timing out
------------
The -t option specifies the number of seconds for the read command to wait for input.
When the timer expires, the read command returns a non-zero exit status.


if read -t 5 -p "Please enter your name: " name
......

The -n option sets the read command to count the input characters.
When the preset number of characters has been entered, read automatically exits, assigning the entered data to the variable.


#!/bin/bash
# getting just one character of input
#
read -n1 -p "Do you want to continue [Y/N]? " answer 
case $answer in
Y | y) 
echo
echo "fine, continue on...";; 
N | n) 
echo
echo OK, goodbye
exit;; 
esac
echo "This is the end of the script" 
$


Reading with no display
-----------------------
The -s option prevents the data entered in the read command from being displayed on the monitor;
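For example, a minimal sketch:

#!/bin/bash
# hiding input data from the monitor
read -s -p "Enter your password: " pass
echo
echo "Is your password really $pass?"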


Reading from a file
----------------------
Each call to the read command reads a single line of text from the file.
 When no more lines are left in the file, the read command exits with a non-zero exit status
The most common method is to pipe the result of the cat command of the file directly to a while command that contains the read command.
#!/bin/bash
count=1
cat test | while read line
do
   echo "Line $count: $line"
   count=$[ $count + 1 ]
done



**************************************

Chapter  15 Presenting Data

**************************************


Section :   Understanding Input and Output

===================================


the cat command with data entered from STDIN: 
$ cat


the cat command to accept input from another file 
$ cat < testfile


Redirecting errors only:
$ ls -al badfile 2> test4


Redirecting errors and data:
$ ls -al test test2 test3 badtest 2> test6 1> test7


You can use this technique to separate normal script output from any error messages that occur in the script.


the &> symbol:
redirect both STDERR and STDOUT output to the same output file.

$ ls -al test test2 test3 badtest &> test7






Section : Redirecting Output in Scripts

=======================================
There are two methods for redirecting output in the script:


■ Temporarily redirecting each line
you can redirect an individual output line to STDERR.
You just need to use the output redirection symbol to redirect the output to the STDERR file descriptor.
When you redirect to a file descriptor, you must precede the file descriptor number with an ampersand (&):

$ echo "This is an error message" >&2

$ ./test8 2> test9


■ Permanently redirecting all commands in the script


you can tell the shell to redirect a specific file descriptor for the duration of the script by using the exec command:

 
#!/bin/bash
# redirecting all output to a file 
exec 1>testout


echo "This is a test of redirecting all output"
echo "from a script to another file."
echo "without having to redirect every individual line"




#!/bin/bash
# redirecting output to different locations


exec 2>testerror


echo "This is the start of the script"
echo "now redirecting all output to another location"


exec 1>testout


echo "This output should go to the testout file"




Section : Redirecting Input in Scripts

=========================================
$ cat test12
#!/bin/bash
# redirecting file input


exec 0< testfile 


count=1
while read line
do
echo "Line #$count: $line" 
count=$[ $count + 1 ]
done






Section : Creating Your Own Redirection

==========================================


Creating output  file descriptors
---------------------------------
#!/bin/bash
# using an alternative file descriptor
exec 3>test13out
echo "This should display on the monitor"
echo "and this should be stored in the file" >&3




Redirecting  file descriptors
----------------------------
#!/bin/bash
# storing STDOUT, then coming back to it
exec 3>&1
exec 1>test14out



echo "This should store in the output file" echo "along with this line."


exec 1>&3


echo "Now things should be back to normal" 
$


Creating input  file descriptors:
---------------------------------
#!/bin/bash
# redirecting input file descriptors


exec 6<&0


exec 0< testfile


count=1
while read line
do
echo "Line #$count: $line"
count=$[ $count + 1 ] 
done


exec 0<&6


read -p "Are you done now? " answer




Creating a read/write  file descriptor
-------------------------------------
#!/bin/bash
# testing input/output file descriptor


exec 3<> testfile


read line <&3
echo "Read: $line"
echo "This is a test line" >&3 







Closing  file descriptors
--------------------------
The shell automatically closes file descriptors when the script exits.
To close a file descriptor manually, redirect it to the special symbol &-.
exec 3>&-

#!/bin/bash
# testing closing file descriptors
exec 3> test17file
echo "This is a test line of data" >&3 
exec 3>&-
echo "This won't work" >&3
$




Section : Listing Open File Descriptors

=========================================
The lsof command lists all the open file descriptors on the entire Linux system. 


$ /usr/sbin/lsof -a -p $$ -d 0,1,2
-p allows you to specify a process ID (PID).
-d allows you to specify the file descriptor numbers to display.
The special environment variable $$ is set by the shell to the current PID.
The -a option performs a Boolean AND of the results of the other two options.
  

Section : Suppressing Command Output
=====================================
you can redirect the output to a special file called the null file (/dev/null).
Any data the shell writes to the null file is not saved; it is simply discarded.


$ ls -al > /dev/null
$ cat /dev/null 
$



 it is often used by programmers to quickly remove data from an existing file without having to remove the file and re-create it:
$ cat /dev/null > testfile
$ cat testfile
$




Section : Using Temporary Files

=================================
Linux uses the /tmp directory for files that don't need to be kept indefinitely.
Most Linux distributions configure the system to automatically remove any files in the /tmp directory at bootup.


The mktemp command allows you to easily create a unique temporary file in the /tmp folder.


Creating a local temporary  file
--------------------------------
To create a temporary file in a local directory with the mktemp command, you just need to specify a filename template. 
The template consists of any text filename, plus six X’s appended to the end of the filename:


$ mktemp testing.XXXXXX
$ ls -al testing*
-rw------- 1 rich rich 0 Oct 17 21:30 testing.UfIi13 
$




Save that filename in a variable, so you can refer to it later on in the script:


#!/bin/bash
# creating and using a temp file
tempfile=$(mktemp test19.XXXXXX)
exec 3>$tempfile
echo "This script writes to temp file $tempfile"
echo "This is the first line" >&3




Creating a temporary  file in /tmp
----------------------------------
The -t option forces mktemp to create the file in the temporary directory of the system.


$ mktemp -t test.XXXXXX




#!/bin/bash
# creating a temp file in /tmp
tempfile=$(mktemp -t tmp.XXXXXX)
echo "This is a test file." > $tempfile
echo "This is the second line of the test." >> $tempfile


Creating a temporary directory
-----------------------------
The -d option tells the mktemp command to create a temporary directory instead of a file.


#!/bin/bash
# using a temporary directory
tempdir=$(mktemp -d dir.XXXXXX)
 cd $tempdir


tempfile1=$(mktemp temp.XXXXXX)






Section : Logging Messages

==========================
The tee command is a handy way to send output both to the standard output and to a log file.

tee filename

Because tee redirects data from STDIN, you can use it with the pipe command to redirect output from any command:


$ date | tee testfile
$ date | tee -a testfile   # the -a option appends the data to the file


Section : Example

=====================
#!/bin/bash
# read file and create INSERT statements for MySQL
outfile='members.sql'
IFS=','
while read lname fname address city state zip 
do
cat >> $outfile << EOF
INSERT INTO members (lname,fname,address,city,state,zip) VALUES ('$lname', '$fname', '$address', '$city', '$state', '$zip');
EOF

done < ${1}
$


The output redirection appends the cat command output to the file specified by $outfile.
The input to the cat command is redirected from the standard input to use the data stored inside the script.






**************************************

Chapter  16 Script Control

**************************************


Section :  Handling Signals  

===================================
 The Ctrl+C key combination sends a SIGINT signal, 
  which simply stops the current process running in the shell.
 The Ctrl+Z key combination generates a SIGTSTP signal,
  stopping any processes running in the shell. 


view the stopped jobs using the ps command:
$ ps -l


  1. If you really want to exit the shell with a stopped job still active, just type the exit command again.
The shell exits, terminating the stopped job.
  2.Alternately, now that you know the PID of the stopped job, you can use the kill command to send a SIGKILL signal to terminate it:
  $ kill -9 2456
 


Trapping signals
--------------------
The trap command allows you to specify which Linux signals your shell script can watch for and intercept from the shell. 


trap commands signals

#!/bin/bash
# Testing signal trapping
trap "echo ' Sorry! I have trapped Ctrl-C'" SIGINT
#
echo This is a test script
#

count=1
while [ $count -le 10 ]
do
   echo "Loop #$count"
   sleep 1
   count=$[ $count + 1 ]
done
#
echo "This is the end of the test script" 
#




Trapping a script exit
-----------------------
#!/bin/bash
trap "echo Goodbye..." EXIT


count=1
while [ $count -le 5 ] 
do
echo "Loop #$count"
  sleep 1
count=$[ $count + 1 ]
done
#


Modifying or removing a trap
-----------------------------
#!/bin/bash
# Modifying a set trap
#
trap "echo ' Sorry... Ctrl-C is trapped.'" SIGINT
#
count=1
while [ $count -le 5 ]
do
   echo "Loop #$count"
   sleep 1
count=$[ $count + 1 ]
done
#modify a trap
trap "echo ' I modified the trap!'" SIGINT 

#


# Remove the trap
trap -- SIGINT





Section :  Running Scripts in Background Mode

===============================================


In background mode, a process runs without being associated with a STDIN, STDOUT, and STDERR on a terminal session


Running in the background
--------------------------
To run a shell script in background mode from the command line interface, 
just place an ampersand symbol (&) after the command:


$ ./test4.sh &

Running multiple background jobs
---------------------------------
Each of the background processes is tied to the terminal session (pts/0).
If the terminal session exits, the background processes also exit.




Section : Running Scripts without a Hang-Up

===========================================
The nohup command:
lets you start a shell script from a terminal session and keep the script running in background mode until it finishes, even if you exit the terminal session.


$ nohup ./test1.sh &


Because the nohup command disassociates the process from the terminal, the process loses the STDOUT and STDERR output links.
To accommodate any output, nohup automatically redirects STDOUT and STDERR messages to a file called nohup.out.


Section : Controlling the Job

================================


The function of starting, stopping, killing, and resuming jobs is called job control. 


Viewing jobs
------------
The jobs command allows you to view the current jobs being handled by the shell:


$ jobs
[1]+ Stopped ./test10.sh
[2]- Running ./test10.sh > test10.out & 
$


view the various jobs’ PIDs by adding the -l parameter (lowercase L)


$ jobs -l
[1]+ 1897 Stopped ./test10.sh
[2]- 1917 Running ./test10.sh > test10.out & 
$


The job with the plus sign 
is considered the default job. It would be the job referenced by any job control commands if a job number wasn’t specified in the command line.
The job with the minus sign 
is the job that would become the default job when the current default job finishes processing. 






Using the kill command
-----------------------
Sending the kill command's default signal (SIGTERM) to a job's PID causes the job to terminate.


$ kill 1955
$
[3]+ Terminated
$




Restarting stopped jobs:
-------------------------
Ctrl+C terminates a job; Ctrl+Z stops (pauses) it, leaving it available to be restarted.


To restart a job <in background mode>, use the bg command:


$ ./test11.sh
^Z
$ bg
# This restarts the default job (the one marked with the plus sign).
# If you have additional jobs, you need to use the job number along with the bg command:
.....

$ jobs
[1]+ Running
$
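Similarly, a stopped job can be restarted in foreground mode with the fg command; the job number works the same way as with bg (job 2 here is just an illustration):

$ fg 2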




Section : Being Nice

======================
By default, all processes started from the shell have the same scheduling priority on the Linux system.
The scheduling priority is an integer value, from -20 (the highest priority) to +19 (the lowest priority).


By default, the bash shell starts all processes with a scheduling priority of 0.


Using the nice command
-----------------------
The nice command allows you to set the scheduling priority of a command


$ nice -n 10 ./test4.sh > test4.out &


$ ps -p 4973 -o pid,ppid,ni,cmd


$ nice -10 ./test4.sh > test4.out &





Using the renice command
---------------------------
The renice command lets you change the priority of a command that's already running on the system.

You specify the PID of a running process to change its priority:
 
 $ renice -n 10 -p 5055
 $ ps -p 5055 -o pid,ppid,ni,cmd

 


Section : Running Like Clockwork

==================================


Scheduling a job using the at command
--------------------------------------
The at command allows you to specify a time when the Linux system will run a script.


The at daemon, atd, runs in the background and checks the job queue for jobs to run.


By default, the atd daemon checks the job directory (usually /var/spool/at) every 60 seconds.


By default, at jobs are submitted to the a job queue.
Any output destined to STDOUT or STDERR is mailed to the user via the mail system.
command format:


at [-f filename] time

time formats:
■ A standard hour and minute, such as 10:15
■ An AM/PM indicator, such as 10:15PM
■ A specific named time, such as now, noon, midnight, or teatime (4PM)
■ A standard date format, such as MMDDYY, MM/DD/YY, or DD.MM.YY
....
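For example, submitting a script to run immediately (the script name is just a placeholder):

$ at -f test13.sh now

The at command then displays the job number it assigned and the scheduled run time.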


Listing pending jobs
----------------------
The atq command allows you to view what jobs are pending on the system:


Removing jobs
---------------
the atrm command to remove a pending job:


Scheduling regular scripts
--------------------------
the cron program to allow you to schedule jobs that need to run on a regular basis. 
The cron program runs in the background and checks special tables, called cron tables, for jobs that are scheduled to run.


The format for the cron table is:
min hour dayofmonth month dayofweek command


if you want to run a command at 10:15 every day, you would use this cron table entry:


15 10 * * * command

00 12 * * * if [ `date +%d -d tomorrow` = 01 ] ; then command ; fi
This checks every day at 12 noon to see if it's the last day of the month (that is, whether tomorrow is the 01st), and if so, cron runs the command.


15 10 * * * /home/rich/test4.sh > test4out

Building the cron table
------------------------
Linux provides the crontab command for handling the cron table.
To list an existing cron table, use the -l parameter:
$ crontab -l
no crontab for rich 
$

To add entries to your cron table, use the -e parameter.


If anacron determines that a job has missed a scheduled running, it runs the job as soon as possible.


Starting scripts with a new shell
-----------------------------------

