How would I go about using a variable in a vim shell command (one done with !) in a vimscript? For instance, something kind of like this: (just an example, not what I'm really trying to do)
function Ls(dir)
!ls a:dir
endfunction
Use the execute command. It evaluates everything after it as an expression that yields a string, and then runs that string as if it were a command you had typed in yourself.
function Ls(dir)
execute '!ls ' . a:dir
endfunction
This says, "Evaluate the expression '!ls ' . a:dir and then execute it." The variable a:dir is expanded, the dot concatenates the two strings into '!ls whatever' and then that is executed as if you had typed it.
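If the directory name can contain spaces or other characters the shell treats specially, one possible refinement (not part of the original answer) is to run it through Vim's shellescape() before concatenating:
function Ls(dir)
  execute '!ls ' . shellescape(a:dir)
endfunction
shellescape() quotes the value so the shell receives it as a single argument.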
Related
I tried to use Robot to do the following operation in KSH to remove the ".auto" suffix from files in a directory:
Write    for file in *.auto; do mv $file ${file%.*}; done
The ${file%.*} is KSH parameter expansion; however, Robot always treated it as a Robot variable and gave the error message: "Resolving variable '${file%.*}' failed: Variable '${file}' not found."
Is there any way to tell Robot that the ${file%.*} is not a Robot variable?
If a string has something the framework may interpret as inline variable usage, escape it with the \ character.
In your case, put it in front of the ${:
Write    for file in *.auto; do mv $file \${file%.*}; done
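In context, the test data might look like this (a sketch that assumes the Write keyword comes from SSHLibrary, as the question implies; adjust to whichever library you actually use):
*** Settings ***
Library    SSHLibrary

*** Test Cases ***
Strip Auto Suffix
    Write    for file in *.auto; do mv $file \${file%.*}; done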
One of my Yocto recipes needs to create a file containing a very specific line, something like:
${libdir}/something
To do this, I have the recipe task:
do_install() {
echo '${libdir}/something' >/path/to/my/file
}
Keeping in mind that I want that string exactly as shown, I can't figure out how to escape it to prevent bitbake from substituting in its own value of libdir.
I originally thought the echo command with single quotes would do the trick (as it does in the bash shell) but bitbake must be interpreting the line before passing it to the shell. I've also tried escaping it both with $$ and \$ to no avail.
I can find nothing in the bitbake doco about preventing variable expansion, just stuff to do with immediate, deferred and Python expansions.
What do I need to do to get that string into the file as is?
BitBake seems to have particular issues with preventing expansion from taking place. Regardless of whether you use single or double quotes, it appears that the variables will be expanded before the line is passed to the shell.
Hence, if you want them to not be expanded, you need to effectively hide them from BitBake, and this can be done with something like:
echo -e '\x24{libdir}/something' >/path/to/my/file
This uses the hexadecimal version of $ so that BitBake does not recognise it as a variable to be expanded.
You do need to ensure you're running the correct echo command however. Under some distros (like Ubuntu), it might run the sh-internal echo which does not recognise the -e option. In order to get around that, you may have to run the variant of echo that lives on the file system (and that does recognise that option):
/bin/echo -e '\x24{libdir}/something' >/path/to/my/file
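If you would rather not depend on an echo that understands -e at all, one alternative (not from the original answer) is printf with an octal escape, since \044 is the octal code for $ and printf handles octal escapes in its format string across common shells:
do_install() {
    # \044 is the octal escape for '$'; BitBake sees no ${...} here and leaves the line alone
    printf '\044{libdir}/something\n' >/path/to/my/file
}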
By default this task will be executed as a shell function via /bin/sh, but what that actually is depends on your system, since /bin/sh may be a symlink pointing to bash. The BitBake manual warns against relying on bashisms, though.
You could instead add this task to your recipe as a Python function:
python do_install () {
    with open('/path/to/your/file', 'a') as file:
        file.write('${libdir}/something')
}
'a' stands for append.
This should eliminate the problem with variable expansion.
There is no standard way to escape these sorts of expressions that I am aware of, other than to break up the expression; accordingly, this should work:
do_install() {
echo '$''{libdir}/something' >/path/to/my/file
}
The best solution is simply this:
bitbake_function() {
command $libdir/whatever
}
Bitbake will only expand ${libdir}; $libdir is passed through verbatim.
We don't have to worry about dollar signs that are not followed by {, and in this case, there is no need for libdir to be wrapped in braces.
The only time we run into a problem with just $foo is if we have something like ${foo}bar, where the braces are required as delimiters so that bar isn't taken as part of the variable name. In that situation there are other solutions, such as generating the shell syntax "$foo"bar instead. This is less cryptic than resorting to \x24.
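A minimal sketch of that idea, using foo and bar purely as placeholder names:
do_install() {
    # single quotes stop this recipe's shell from expanding $foo at do_install time,
    # and BitBake ignores it because there are no braces; the file ends up containing
    # "$foo"bar, which a later shell reads the same way as ${foo}bar
    echo '"$foo"bar' >/path/to/my/file
}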
If you need to use $ in a variable assignment, remember that BitBake won't evaluate $whatever, so you have to escape it for the underlying shell.
For instance, I set the gcc/ld rpath option to use the $ORIGIN keyword this way:
TARGET_LDFLAGS_append = " -Wl,-rpath-link=\\$$ORIGIN"
https://lists.yoctoproject.org/pipermail/yocto/2017-September/037820.html
You can define a variable to be a literal dollar sign.
DOLLAR = "$"
do_install() {
echo '${DOLLAR}{libdir}/something' >/path/to/my/file
}
no extra quoting required.
Is it possible to use a Linux environment variable inside a .sql file? I'm using the copy/select query to write to an output file, and I'd like to put that directory in a variable. So I want to do something like:
COPY (SELECT * FROM a)
TO $outputdir/a.csv
Outputdir would be set in my environment. Is this possible?
You can store the result of a shell command inside a psql variable like this:
\set afile `echo "$outputdir/a.csv"`
COPY (SELECT * FROM a) TO :'afile';
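One detail worth noting (not stated in the answer): the backtick form spawns a shell from inside psql, so outputdir has to be exported in the calling environment for that shell to see it, for example:
export outputdir=/path/to/output
psql <conn parameters> -f /path/to/yourscript.sql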
Another (better in my opinion) solution is to use only psql variables; see this answer of mine about psql variables, which is similar to your example. An example for your case would be:
\set outputdir '/path/to/output'
\set afile :outputdir '/a.csv'
COPY (SELECT * FROM a) TO :'afile';
Note that, in the example, you need to set the variable inside the script file, but you can skip the first line if you set it when you call psql:
psql --set=outputdir="$outputdir" <conn parameters> -f /path/to/yourscript.sql
This appears to work for your use case, provided you single-quote the output file name as I mentioned. It will also escape any double quotes contained within the SQL.
psql -c "$(eval echo '"' $(sed 's/"/\\"/g' envvars.sql) '"')"
Of course, note that if your file contains anything else the shell might treat as a variable, it will try to expand that too and your script will break, so you will need to backslash-escape any dollar signs you want preserved literally.
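For illustration, envvars.sql might contain something like this (a sketch; the single quotes survive into the SQL as string-literal quoting, while the outer shell supplies the value of $outputdir):
-- hypothetical envvars.sql: $outputdir is expanded by the shell wrapper above
COPY (SELECT * FROM a) TO '$outputdir/a.csv';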
See also the second snippet in the accepted answer to this question for a possibly more robust answer.
The accepted answer is correct for PostgreSQL running on Unix. Under Windows a different incantation is required to obtain the value of the environment variable from the CMD shell and to avoid the trailing carriage return that echo would append.
\set afile `set /p=%outputdir%/a.csv`
COPY (SELECT * FROM a) TO :'afile';
I've been working with TCL for some time now, and I have spent a long time trying to do the following (it seems easy and I think it should be, but I can't get it right):
I need to execute an external program by means of a tcl script. For that, I use the exec command. To use this external program, I need to pass in a variable number of files. If I called this program straight from a cmd window, it would be something like:
C:\>myprogram -i file1 -i file2 -i file3 (etc., etc.)
However, when trying to implement this in a dynamic/variable way through tcl I get into trouble. The way I do it is by storing in some variable myvar all the "-i filex" arguments I need (built in a loop), and then passing that as a parameter to the exec command. It would look something like:
exec myprogram $myvar
Doing that apparently creates some issues, because myprogram fails to "see" myvar. I'm guessing there is some sort of hidden terminator or some clash of argument types that means that, in the end, the exec command "sees" only myprogram.
So, my question is, does anyone know how to insert variable arguments into a call to exec?
You can use {*} or eval. See this question for example.
Specializing for your case:
Tcl 8.5 (and later):
exec myprogram {*}$myvar
Tcl 8.4 (and before):
eval [list exec myprogram] [lrange $myvar 0 end]
# Or...
eval [linsert $myvar 0 exec myprogram]
That's right, the old version is ugly (or non-obvious, or both). Because of that, people tended to write this instead:
eval exec myprogram $myvar
but that was slower than expected (OK, not so relevant when running an external program!) and has hazards when $myvar isn't a canonically-formatted list due to the way that eval works. It used to catch out even experienced Tcl programmers, and that's why we introduced new syntax in 8.5, which is specified to be surprise-free and is pretty short too.
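Putting that together with the loop described in the question, a sketch (file names invented) might look like:
# build the argument list pairwise: -i file1 -i file2 ...
set myvar {}
foreach f {file1 file2 file3} {
    lappend myvar -i $f
}
# Tcl 8.5+: expand the list into separate words before exec sees them
exec myprogram {*}$myvar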
In a Unix shell script, to call command foo on the original arguments to the script, you just write
foo $@
In Powershell, $args is an array of (typically) strings holding the current script's arguments. If you write
foo $args
will the same thing happen as in bash or other typical Unix shell script interpreters?
You want to use argument splatting, which is new in PowerShell 2.0. The variable used for splatting can be either an array (arguments are bound positionally) or a hashtable (keys map to parameter names and their values provide the arguments). If you don't declare any parameters, use @args. The problem with attempting to use $args is that it will be sent as an array to the first parameter rather than splatted across all the parameters. However, many folks do declare parameters. In that case you want to use @PSBoundParameters, e.g.:
foo @PSBoundParameters
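A sketch contrasting the two cases (foo is the question's placeholder command; the wrapper and parameter names are invented):
# no parameters declared: splat the automatic $args array positionally
function Invoke-Foo { foo @args }

# parameters declared: splat only what was actually bound, by name
function Invoke-FooNamed {
    param($Path, $Name)
    foo @PSBoundParameters
}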