I've been working with Tcl for some time now, and I have spent a long time trying to do the following (it seems easy, and I think it should be, but I can't get it right):
I need to execute an external program from a Tcl script. For that, I use the exec command. This external program takes a variable number of input files. If I called it straight from a cmd window, it would be something like:
C:\>myprogram -i file1 -i file2 -i file3 (etc., etc.)
However, when trying to implement this in a dynamic/variable way through Tcl, I get into trouble. What I do is store in a variable myvar all the "-i filex" arguments I need (built in a loop), and then pass that as a parameter to the exec command. It looks something like:
exec myprogram $myvar
Doing that apparently creates some issues, because myprogram fails to "see" myvar. I'm guessing there is some sort of hidden terminator, or some clash between argument types, that means exec ends up "seeing" only myprogram.
So, my question is, does anyone know how to insert variable arguments into a call to exec?
You can use {*} or eval. See this question for example.
Specializing for your case:
Tcl 8.5 (and later):
exec myprogram {*}$myvar
Tcl 8.4 (and before):
eval [list exec myprogram] [lrange $myvar 0 end]
# Or...
eval [linsert $myvar 0 exec myprogram]
That's right, the old version is ugly (or non-obvious, or both). Because of that, people tended to write this instead:
eval exec myprogram $myvar
but that was slower than expected (OK, not so relevant when running an external program!) and has hazards when $myvar isn't a canonically-formatted list due to the way that eval works. It used to catch out even experienced Tcl programmers, and that's why we introduced new syntax in 8.5, which is specified to be surprise-free and is pretty short too.
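To make the difference concrete, here is a small sketch (the loop, the file names and myprogram are just placeholders based on the question):
set myvar {}
foreach f {file1 "my file2" file3} {
    lappend myvar -i $f      ;# lappend keeps myvar a well-formed list
}
exec myprogram {*}$myvar     ;# 8.5+: each list element becomes its own argument

# The eval hazard shows up if myvar is built as a plain string instead:
set myvar ""
foreach f {file1 "my file2" file3} {
    append myvar " -i $f"
}
eval exec myprogram $myvar   ;# "my file2" gets re-split into two arguments here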
I've just switched to zsh and am now adapting an alias that printed some text (in color) along with a command.
I have been trying to use the $fg array variable, but there is a side effect: the whole command is printed before being executed.
The same occurs if I just test an echo with a color code in the terminal:
echo $fg_bold[blue] "test"
]2;echo "test" test #the test is in the right color
Why does the command print itself before doing what it's supposed to do? (To be precise, this doesn't happen when printing without any variable in the command.)
Do I have to set a specific zsh option, or use echo with a special parameter, to get rid of that?
Execute the command first (keep its output somewhere), and then issue echo. The easiest way I can think of to do that would be:
echo $fg[red] `ls`
Edit: OK, so your trouble is some trash before the actual output of echo. You have some funny configuration that is causing it.
What to do (other than inspecting your configuration):
start a shell with zsh -f (it will skip any configuration), and then re-try the echo command: autoload colors; colors; echo $fg_bold[red] foo (this should show you that the problem is in your configuration).
Most likely your configuration defines a precmd function that gets executed before every command (which is failing in some way). Try which precmd. If that is not defined, try echo $precmd_functions (precmd_functions is an array of functions that get executed before every command). Knowing which is the code being executed would help you search for it in your configuration (which I assume you just took from someone else).
If I had to guess, I'd say you are using oh-my-zsh without knowing exactly what you turned on (which is an endless source of troubles like this).
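Putting those checks together, a quick diagnostic session might look something like this (what, if anything, shows up depends entirely on your configuration):
zsh -f                       # clean shell, no configuration loaded
autoload colors; colors
echo $fg_bold[red] foo       # should print just "foo", in bold red

# back in your normal shell:
which precmd                 # is a precmd function defined?
print -l $precmd_functions   # any other pre-command hook functions?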
I can't replicate your issue, which I think indicates that it's either an option (that I've set) or a zsh version issue:
$ echo $fg_bold[red] test
test
Because I can't replicate it, I'm sure there's an option to stop it happening for you. I don't know what that option is (I'm using a heavily modified oh-my-zsh, and still haven't finished learning what all the zsh options do).
My suggestions:
You could try using print:
$ print $fg_bold[red] test
test
The print builtin has many more options than echo (see man zshbuiltins).
You should also:
Check what version of zsh you're using.
Check what options (setopt) are enabled.
Check your ~/.zshrc (and other loaded files) to see what, if any, options and functions are being run.
This question may suggest checking what TERM you're using, but reading your question it sounds like you're only seeing this behaviour (echoing of the command after entry) when you're using aliases...?
This question seems to be (very) stupid but I can't deal with it :(
When I tried this batch code:
if "%1" == "-i" (
set is = %2
echo. %is%
shift
)
When called with two (or more) arguments, it does NOT work: it actually prints a blank. The shift command is not executed either. When I watch the executed code (without the @echo off at the beginning), I can see that the set command is completed.
What's wrong with it?
Example of calling:
c:\script.bat -i test -d bla
You have two issues. By default, a group of statements in parentheses has variable expansion done all at once, that is, before your set command runs. Also, the syntax of your set is wrong: you don't want spaces around the =.
Add this to the top of your file:
setlocal ENABLEDELAYEDEXPANSION
and remove the spaces around = in set:
set is=%2
Finally, use delayed expansion:
echo. !is!
A possible third issue is that you may need two SHIFTs: one for -i, one for its argument.
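Putting those fixes together, a sketch of the corrected fragment (same names as in the question) might be:
@echo off
setlocal ENABLEDELAYEDEXPANSION
if "%1" == "-i" (
    set is=%2
    echo. !is!
    shift
    shift
)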
Update
Thanks to @dbenham for pointing out that it isn't a syntax error with set; it's just surprising behavior that deserves a little explanation. If you execute these commands:
set a=one
echo "%a%"
The result is:
"one"
That makes sense, but try:
set b = two
echo "%b%"
And you get:
"%b%"
What? This is what you would expect when the environment variable b is unset. But we just set it. Or did we? Try:
echo "%b %"
Displays:
" two"
For the Windows set command, unlike any other language or environment I'm aware of, the spaces are significant. The spaces before the = become part of the environment var name, spaces after become part of the value. This uncommon behavior is a common source of errors when writing Windows batch programs.
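As an aside (a common batch idiom, not something from the original question): quoting the whole assignment keeps stray spaces out of both the name and the value:
set "b=two"
echo "%b%"
which prints "two" as expected.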
I was originally trying to run an executable (tftpd32.exe) from Expect with the following command, but for some unknown reason it would hang the entire script:
exec c:/tftpd32.351/tftpd32.exe
So, I decided to call a batch file that will start the executable.
I tried to call the batch file with the following command, but got an error message stating Windows cannot find the file.
exec c:/tftpd32.351/start_tftp.bat
I also tried the following, but it does not start the executable:
spawn cmd.exe /c c:/tftpd32.351/start_tftp.bat
The batch file contains this, and it runs OK when I double-click on it:
start tftpd32.exe
Any help would be very much appreciated.
Thanks
The right way to run that program from Tcl is to do:
set tftpd "c:/tftpd32.351/tftpd32.exe"
exec {*}[auto_execok start] "" [file nativename $tftpd]
Note that you should always have that extra empty argument when using start (due to the weird way that start works: it takes an optional quoted string specifying the title of the window to create, and it tends to misinterpret the first quoted string as that title even if that leaves it with no mandatory arguments). You also need to use the native system name of the executable you want to run, hence the file nativename.
If you've got an older version of Tcl inside your expect program (8.4 or before) you'd do this instead:
set tftpd "c:/tftpd32.351/tftpd32.exe"
eval exec [auto_execok start] [list "" [file nativename $tftpd]]
The list command in that weird eval exec construction adds some necessary quoting that you'd have trouble generating otherwise. Use it exactly as above or you'll get very strange errors. (Or upgrade to something where you don't need nearly as much code gymnastics; the {*} syntax was added for a good reason!)
I have an MS SQL 2005 stored proc which takes an out parameter. How can I call this from a DOS batch file and get the value of the out param? I know I have to use sqlcmd, but I can't find anything in there by which I can pass an out param and access its value in a DOS batch file.
Thanks
vikram
I do this kind of thing all the time (like so) with standard T-SQL, but you might be able to do something like this with a stored procedure if you edit the stored procedure to return a one-line result set.
sqlcmd -b -S %COMPUTERNAME% -E -d %DBNAME% -Q "exec getXMLLocation;" -h-1 -o SearchResult.txt
set /p URI=<SearchResult.txt
@echo The XML file URI is: %URI%
In DOS, you will get any information returned on standard out, but you cannot easily manipulate it. Must this be DOS? Is PowerShell an option? You have more capabilities with PowerShell (heck, even WSH is a better option than DOS if you need to store this value and not just show it in the command prompt).
Adding this based on the comment that this must be DOS. Here are my thoughts:
First, I would use the :out macro statement to direct output to stdout:
:out stdout
Once you have output in stdout, you can use DOS commands to direct it to variables you have set up in DOS. stdout is handle 1 in DOS.
The one issue I can think of that might make this fail is other items cluttering up stdout; I would not want to parse through a lot of junk.
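If it has to stay in a batch file, one sketch (assuming the procedure, edited as above, returns exactly one row, and reusing the same getXMLLocation/%DBNAME% placeholders) is to capture the line directly with for /f instead of going through a temporary file:
for /f "delims=" %%a in ('sqlcmd -b -S %COMPUTERNAME% -E -d %DBNAME% -Q "exec getXMLLocation;" -h-1') do set "URI=%%a"
@echo The XML file URI is: %URI%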
It seems as if a script with #! prefix can have the interpreter name and ONLY one argument. Thus:
#!/bin/ls -l
works, but
#!/usr/bin/env ls -l
doesn't
Do you agree? Any thoughts?
Francesc
Different Unixes interpret #! differently. Here's a comprehensive-looking writeup: http://www.in-ulm.de/~mascheck/various/shebang/
It seems that the lowest common denominator across platforms is "the interpreter (which must not itself be a script) and no more than one argument".
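For example, on Linux everything after the interpreter path is passed as one single argument, so with a hypothetical script named wrapper the env form fails along these lines (the exact message varies by env version):
$ cat wrapper
#!/usr/bin/env ls -l
$ ./wrapper
/usr/bin/env: 'ls -l': No such file or directory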
Originally, we only had one shell on Unix. When you asked to run a command, the shell would attempt to invoke one of the exec() system calls on it. If the command was an executable, the exec would succeed and the command would run. If the exec() failed, the shell would not give up; instead it would try to interpret the command file as if it were a shell script.
Then unix got more shells and the situation became confused. Most folks would write scripts in one shell and type commands in another. And each shell had differing rules for feeding scripts to an interpreter.
This is when the "#! /" trick was invented. The idea was to let the kernel's exec() system calls succeed with shell scripts. When the kernel tries to exec() a file, it looks at the first 4 bytes, which represent an integer called a magic number. This tells the kernel whether it should try to run the file or not. So "#! /" was added to the magic numbers that the kernel knows, and the kernel was extended to actually be able to run shell scripts by itself. But some people kept leaving the space out, so the kernel was extended a bit again to allow "#!/" to work as a special 3-byte magic number.