Error handling: catching all bad exit statuses for called commands in a Bash script
We are creating a bash script for a build server. We want to make sure that every bash command we execute inside the script returns an exit status of 0; if it doesn't, execution should stop. Our solution so far is to do this:
    # some command
    if [ $? -ne 0 ]; then
        # handle error
    fi
after every command, which causes a problem: it makes the code quite long and doesn't seem elegant. We could use a bash function, perhaps, although working with $? can be a bit tricky, and we would still have to call the function after every command. Is there a better way? I've looked at the trap command, but it seems to be meant for signal handling within the bash script itself, not for the commands we call.
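For illustration, the kind of helper the question alludes to might look like the sketch below. The name check_status is hypothetical (not from any existing script), and it still has to be invoked immediately after every command so that $? is not overwritten:

    check_status() {
        local ret=$?                       # exit status of the command run just before the call
        if [ "$ret" -ne 0 ]; then
            echo >&2 "command failed with status $ret"
            exit "$ret"
        fi
    }

    some_command                           # placeholder for any build step
    check_status                           # must directly follow the command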
The robust, canonical way of doing this is:
    #!/bin/bash
    die() {
        local ret=$?
        echo >&2 "$*"
        exit "$ret"
    }

    ./configure || die "project can't be configured"
    make || die "project doesn't build"
    make install || die "installation failed"
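If repeating || die after every command still feels verbose, one possible variation (not part of the canonical answer above; run is a hypothetical wrapper name) is to route each command through a small wrapper that calls die on failure:

    # assumes the die() helper from the block above is already defined
    run() {
        "$@" || die "command failed: $*"
    }

    run ./configure
    run make
    run make install

This trades explicit || die repetition for a little indirection; pipelines and redirections still need to be handled outside the wrapper.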
The fragile, convenient way of doing this is set -e:
    #!/bin/bash
    set -e  # the script will exit in many (but not all) cases if a command fails
    ./configure
    make
    make install
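As a quick illustration of that behavior (a minimal sketch, not from the original answer), execution stops at the first simple command that returns a non-zero status:

    #!/bin/bash
    set -e
    echo "about to fail"
    false                     # returns 1, so the script exits here with status 1
    echo "never reached"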
Or equivalently (with a custom error message):
    #!/bin/bash
    # called when many (but not all) commands fail
    trap 'ret=$?; echo >&2 "failure on line $LINENO, exiting."; exit $ret' ERR
    ./configure
    make
    make install
For the latter two, the script will not exit if the command that fails is part of a conditional statement or of an && / || list.
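A minimal sketch of that caveat (the pattern and file name are just placeholders): failures that happen inside a condition or an && / || list do not trigger set -e, so the script keeps going:

    #!/bin/bash
    set -e
    # grep fails here, but it is the condition of an if statement,
    # so set -e does not abort the script
    if grep -q pattern /no/such/file; then
        echo "found"
    fi
    # the failure of false is "handled" by the || list, so the script continues
    false || echo "false failed, but execution continues"
    echo "still running"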
Also, while:
    backup() {
        set -e
        scp backup.tar.gz user@host:/backup
        rm backup.tar.gz
    }

    backup
will correctly avoid executing rm if the transfer fails, later inserting a feature like this:
    if backup; then
        mail -s "backup completed successfully" user@example.com
    fi
will make backup stop exiting on failure and accidentally delete the backup, because set -e is ignored when the function runs as part of an if condition.
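One way to keep the function safe regardless of how it is later called (a sketch, not part of the original answer) is to check each command explicitly inside the function instead of relying on set -e:

    backup() {
        # a bare "return" propagates the exit status of the failed scp,
        # whether or not the caller tests backup's result
        scp backup.tar.gz user@host:/backup || return
        rm backup.tar.gz
    }

    if backup; then
        mail -s "backup completed successfully" user@example.com
    fi

Unlike set -e inside the function, the explicit || return is honored even when backup is used as an if condition, so rm never runs after a failed transfer.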