Hacker News

Neither of those is equivalent to variable binding, which is what most SQL libraries provide, precisely because they don't actually solve the problem: they're still doing string substitution. Putting a double quote in $1 in your "good" execute example will let you break out of what's expected, and then you're Bobby Tables.

Your Python example at the bottom is correct, in that each argument is passed as a separate element, so there's no way to break out through quoting characters. SQL binds work like that in most libraries, even if they don't look like it: the parser knows the placeholder is a single item and passes it along as such. You cannot escape it in the same way.
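To make the contrast concrete, here's a minimal sketch of parameter binding using Python's sqlite3 module (an in-memory database and a made-up `users` table, just for illustration). The driver sends the value separately from the SQL text, so quote characters in the input cannot change the statement's structure:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")

# The hostile-looking input is bound as a value, never parsed as SQL.
conn.execute("INSERT INTO users (name) VALUES (?)",
             ('Robert"; DROP TABLE users;--',))

# It comes back verbatim as data; the table is untouched.
row = conn.execute("SELECT name FROM users").fetchone()
print(row[0])  # Robert"; DROP TABLE users;--
```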




I don’t really follow. My “good” example and the code at the bottom are the same.

sh is smarter than just doing string interpolation, and "$1" is passed on as a single argument, no matter what:

  > run(["sh", "-c", 'echo "$1"', "--", 'a"'])
  a"
Whereas if it were simple string interpolation, you’d see this:

  > run(["sh", "-c", 'echo "a""'])
  --: 1: Syntax error: Unterminated quoted string
It’s the same special casing that gets "$@" right.
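A quick sketch of that behavior from Python (this assumes a POSIX sh is available): the shell substitutes "$1" after parsing the command string, not by pasting text, so an argument containing a double quote survives intact:

```python
import subprocess

# "$1" expands as a single word even though the argument contains
# a quote character and a space; no re-parsing of the value happens.
out = subprocess.run(
    ["sh", "-c", 'printf "%s\\n" "$1"', "--", 'a" b'],
    capture_output=True, text=True,
).stdout
print(out)  # a" b
```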


That requires you to quote the param in the string to ensure that params are grouped as expected. E.g.

    # cat pvars.sh
    #!/bin/bash
    echo "\$1=$1"
    echo "\$2=$2"
    echo "\$3=$3"
    # sh -c './pvars.sh "$1" $2 $3' -- 'a b' 'c d'
    $1=a b
    $2=c
    $3=d
The whole point of passing in an array and using something like exec (or system(), if provided, since it handles the fork and wait for you) is that you avoid the overhead of the shell starting up and parsing the command line at all, and it lets you define each param exactly as needed, since each param is its own array item. You don't need to worry about splitting on spaces, the shell splitting params on spaces, or quoting to group items. If you want the param to be:

    foo "bar baz" quux
as one singular parameter, you just make that the contents of that array item, since no parsing need be done at all.
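In Python that looks like the sketch below (it re-invokes the Python interpreter just so the example is self-contained and portable): the awkward string is one array element, so it arrives as exactly one argv entry with no quoting gymnastics:

```python
import subprocess
import sys

# One array item == one parameter; no shell, so no splitting or
# quote interpretation happens on the way through.
arg = 'foo "bar baz" quux'
out = subprocess.run(
    [sys.executable, "-c", "import sys; print(sys.argv[1])", arg],
    capture_output=True, text=True,
).stdout
print(out)  # foo "bar baz" quux
```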

If you have an array of params and you're jumping through hoops to make sure they're interpreted correctly by the shell when you execute a process, you're likely (depending on language and capabilities) wasting cycles and overcomplicating the code, when you could just call the program you actually want to execute directly and supply the params. Alternatively, if you have all the params as one long string and you want it parsed as a shell would, then execute the shell and pass that string as a param. E.g.

    # perl -E 'system("./pvars.sh","a b","c d");'
    $1=a b
    $2=c d
    $3=
    # perl -E 'system("./pvars.sh","a b","c","d");'
    $1=a b
    $2=c
    $3=d


Thanks for explaining. I feel like we’re talking past each other but it’s my mistake. I should have said it is only useful (not “particularly useful”) if one has compound statements like a pipe or multiple commands. Invoking sh just to run a single command is superfluous and you are right that reaching directly for execve() is better.


Ah, yes. If you want to take advantage of piping commands together through the shell as a subcommand of your program, then a way to make params behave more consistently regardless of content is useful.
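For that case, a hedged sketch in Python (assuming a POSIX sh with printf and grep on PATH): the pipeline lives in the command string, while the untrusted value rides along as a positional parameter after "--" instead of being spliced into the string:

```python
import subprocess

# A value with an awkward quote; it never touches the command text.
needle = 'a"b'

# The shell handles the pipe; "$1" delivers the value as one word.
out = subprocess.run(
    ["sh", "-c", 'printf "%s\\n" one "$1" three | grep -c .', "--", needle],
    capture_output=True, text=True,
).stdout
print(out)  # 3
```

grep -c . counts the non-empty lines, so all three printf arguments, including the quoted one, made it through the pipe as separate lines.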



