How to pass bash script arguments to a subshell
2014-01
I have a wrapper script that does some work and then passes the original parameters on to another tool:
#!/bin/bash
# ...
other_tool -a -b "$@"
This works fine, unless the "other tool" is run in a subshell:
#!/bin/bash
# ...
bash -c "other_tool -a -b $@"
If I call my wrapper script like this:
wrapper.sh -x "blah blup"
then only the first original argument (-x) is handed to "other_tool". In reality, I do not create a subshell, but pass the original arguments to a shell on an Android phone, which shouldn't make any difference:
#!/bin/bash
# ...
adb sh -c "other_tool -a -b $@"
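The failure is easy to reproduce locally. A minimal sketch, with echo standing in for other_tool and set -- simulating the call wrapper.sh -x "blah blup":

```shell
#!/usr/bin/env bash
# Sketch: "$@" inside the double-quoted command splits into separate
# words, so only the first argument lands inside the -c string; the
# remaining arguments become $0, $1, ... of the subshell.
set -- -x "blah blup"   # simulate: wrapper.sh -x "blah blup"
bash -c "echo $@"       # effectively runs: bash -c 'echo -x' 'blah blup'
# prints: -x
```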
Bash's printf command has a feature that'll quote/escape/whatever a string, so as long as both the parent and subshell are actually bash, this should work:
#!/bin/bash
quoted_args="$(printf " %q" "$@")" # Note: this will have a leading space before the first arg
# echo "Quoted args:$quoted_args" # Uncomment this to see what it's doing
bash -c "other_tool -a -b$quoted_args"
Note that you can also do it in a single line: bash -c "other_tool -a -b$(printf " %q" "$@")"
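To see what the subshell actually receives, here is a quick sketch with printf '%s\n' standing in for other_tool; each %q-escaped argument survives the subshell's re-parse as a single word:

```shell
#!/usr/bin/env bash
# Sketch: printf %q escapes each argument so the subshell re-parses
# it back into the original word, even with spaces or quotes inside.
quoted_args="$(printf " %q" "a b" "c'd")"
bash -c "printf '%s\n' $quoted_args"
# prints: a b
#         c'd
```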
Change $@ to $*. I did a small local test and it works in my case.
#!/bin/sh
bash -c "echo $*"
bash -c "echo $@"
Saving as test.sh and making it executable gives
$ ./test.sh foo bar
foo bar
foo
There is a subtle difference between $* and $@, as you can see. See e.g. http://ss64.com/bash/syntax-parameters.html
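The difference shows up most clearly in the quoted forms; a minimal sketch:

```shell
#!/usr/bin/env bash
# Sketch: "$*" joins all positional parameters into one word (using
# the first IFS character); "$@" keeps each parameter as its own word.
set -- foo "bar baz"
printf '[%s]\n' "$*"   # one word:  [foo bar baz]
printf '[%s]\n' "$@"   # two words: [foo] [bar baz]
```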
For the follow-up question in the comments: you need to escape e.g. white-space "twice" to pass a string with a separator as a combined argument, e.g. with test.sh modified to a wc wrapper:
#!/bin/sh
bash -c "wc $*"
This works:
$ touch test\ file
$ ./test.sh -l "test\ file"
0 test file
but:
$ ./test.sh -l "test file"
wc: test: No such file or directory
wc: file: No such file or directory
0 total
It's failing because you're coercing an array (the positional parameters) into a string. "$@" is magical because it gives you each separate parameter as a properly quoted string. Adding additional text breaks the magic: "blah $@" is just a single string.
This may get you closer:
cmd="other_tool -a -b"
for parm in "$@"; do cmd+=" '$parm'"; done
adb sh -c "$cmd"
Of course, any parameter that contains a single quote will cause trouble.
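For illustration, the same loop with printf standing in for other_tool and a plain bash -c instead of adb; arguments with spaces come through intact:

```shell
#!/usr/bin/env bash
# Sketch: wrap each parameter in single quotes while building the
# command string, then let the subshell re-parse it.
cmd="printf '[%s]\n'"
for parm in "one two" three; do cmd+=" '$parm'"; done
bash -c "$cmd"
# prints: [one two]
#         [three]
```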
None of the solutions work well. Just pass x/\ \ \"b\"/aaaaa/\'xxx\ yyyy\'/zz\"offf\" as a parameter and they fail.
Here is a simple wrapper that handles every case. Note how it escapes each argument twice.
#!/usr/bin/env bash
declare -a ARGS
COUNT=$#
for ((INDEX=0; INDEX<COUNT; ++INDEX))
do
ARG="$(printf "%q" "$1")"
ARGS[INDEX]="$(printf "%q" "$ARG")"
shift
done
adb sh -c "ls -l ${ARGS[*]}"
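To see why two escape layers are needed: the argument string is parsed by a shell twice, and each parse strips exactly one printf %q layer. A sketch, using bash -c "eval ..." as a stand-in for the two rounds of parsing:

```shell
#!/usr/bin/env bash
# Sketch: each shell parse strips one %q layer, so an argument must
# be escaped once per parse it will go through.
arg='a b'
once="$(printf '%q' "$arg")"    # a\ b
twice="$(printf '%q' "$once")"  # a\\\ b

bash -c "eval printf %s, $once"   # escaped once:  prints a,b, (split!)
bash -c "eval printf %s, $twice"  # escaped twice: prints a b, (intact)
```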
I have a script that I want to accept any number of command line arguments.
So far I have
if [ -O $1 ] ; then
echo "you are the owner of $1"
else
echo "you are not the owner of $1"
fi
Obviously, if I wanted the script to accept only one argument this would work, but what would work for ANY number of arguments?
ex. ./script f f1 f2 f3
One possible way to do what you want involves $@, which is an array of all the arguments passed in.
for item in "$@"; do
if [ -O "$item" ]; then
echo "you are the owner of $item"
else
echo "you are not the owner of $item"
fi
done
Well, it depends on exactly what you want to do, but look into $*, $@, and shift.
"$@" does not solve his problem of "ANY number of arguments". there is a limit in how long a commandline can be (http://www.in-ulm.de/~mascheck/various/argmax/). a better way to read in "unlimited arguments" is via STDIN:
prg_which_creates_arguments | while read -r a; do
echo "do something with $a"
done
Just create the arguments and pipe them, one after another, to the code that does something with them.
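A self-contained sketch of this pattern, with printf generating the "arguments" in place of a real producer program:

```shell
#!/usr/bin/env bash
# Sketch: stream items over stdin instead of argv; read -r keeps
# backslashes in the input intact.
printf '%s\n' "first item" "second item" | while read -r a; do
echo "do something with $a"
done
# prints: do something with first item
#         do something with second item
```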