I have a variable which is being used in two separate scripts: a C Shell one and a TCL one. Is there a way to define it just once and access it in both the scripts?
vars.sh
#!/usr/bin/env tcsh
set a=b
run.sh
#!/usr/bin/env tcsh
source vars.sh
echo $a
vars.tcl
#!/usr/bin/env tclsh
set a b
run.tcl
#!/usr/bin/env tclsh
source vars.tcl
puts $a
I do not like the idea of generating two separate files to store the same variables in two different formats. Is there a way to use a single vars file and have the variables available to both C Shell and Tcl?
The simplest method is to make the variables be environment variables, since those are inherited by a child process from their parent. (On the Tcl side, they're elements in the ::env global array, and on the C shell side, they can be read like any other variable but need to be set via setenv.)
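For example, a minimal sketch, where the variable name A, its value b, and the file names are placeholders:
parent.csh
#!/usr/bin/env tcsh
setenv A b
tclsh child.tcl
child.tcl
#!/usr/bin/env tclsh
puts $::env(A)    ;# prints "b"; the csh side reads it as $A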
Sharing a single configuration file is much harder, since the two languages use a different syntax. Provided you don't use anything complicated in the way of quoting, you can make Tcl parse the C shell format.
proc loadVariablesFromCshellFile {filename arrayName} {
    upvar 1 $arrayName array
    set f [open $filename]
    while {[gets $f line] >= 0} {
        # Matches lines like: set key=value, set key="value", set key='value'
        if {[regexp {^\s*set (\w+)=(["']?)(.*)\2\s*$} $line -> key quote value]} {
            set array($key) $value
        }
    }
    close $f
}
This isn't complete code, since it doesn't handle substitution of variables inside the value, but it is enough to get you started. (I also hope you're not using that feature; if you are, portability is going to be quite a bit harder to achieve.) Here's how you'd use it:
#!/usr/bin/env tclsh
proc loadVariablesFromCshellFile {filename arrayName} {
    upvar 1 $arrayName array
    set f [open $filename]
    while {[gets $f line] >= 0} {
        if {[regexp {^\s*set (\w+)=(["']?)(.*)\2\s*$} $line -> key quote value]} {
            set array($key) $value
        }
    }
    close $f
}
loadVariablesFromCshellFile vars.sh myvars
puts $myvars(a)
While it is entirely possible to load the values straight into scalar globals on the Tcl side, I really don't recommend it: it pollutes the global variable space from a source outside the Tcl program, which is well known to be poor practice.
Related
I need to write a script that runs a program in the terminal with some user-defined variables, such as:
./program VAR1 VAR2 VAR3
However, my script must be in zsh, since it contains some loops that work well in zsh. This is the code:
#!/usr/bin/env zsh
set VAR1=$1
set VAR2=$2
set VAR3=$3
for ((i = 0; i < 41; i++)); do
    ./program.csh $VAR1 $VAR2 $VAR3
done
echo program runs!
I know that this would work in CSH but I don't know how to convert this to ZSH.
Thanks for the help in advance!
The only real difference is that in zsh, set is used to set shell options or positional parameters, not ordinary variables; plain assignment does that.
#!/usr/bin/env zsh
VAR1=$1
VAR2=$2
VAR3=$3
for ((i = 0; i < 41; i++)); do
    ./program.csh $VAR1 $VAR2 $VAR3
done
echo program runs!
You are really going to want to become more familiar with zsh if you plan on writing scripts in it; this is an extremely basic feature of all POSIX-style shells.
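To see the difference, a small zsh sketch:
#!/usr/bin/env zsh
VAR=hello         # ordinary variable: plain assignment, no set
echo $VAR         # prints: hello
set a b c         # set replaces the positional parameters
echo $1 $2 $3     # prints: a b c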
I'm trying to use Perl 6 to run a series of shell commands, each assigned to the variable $cmd, which look like
databricks jobs run-now --job-id 35 --notebook-params '{"directory": "s3://bucket", "output": "s3://bucket/extension", "sampleID_to_canonical_id_map": "s3://somefile.csv"}'
Splitting the command into the part before --notebook-params and the arguments after it:
my $cmd0 = 'databricks jobs run-now --job-id 35 --notebook-params ';
my $args = "'{\"directory\": \"$in-dir\", \"output\": \"$out-dir\", \"sampleID_to_canonical_id_map\": \"$map\"}'";
my $run = run $cmd0, $args, :err, :out;
Fails. No answer given either by Databricks or the shell. Stdout and stderr are empty.
Splitting the entire command by white space
my @cmd = $cmd.split(/\s+/);
my $run = run $cmd, :err, :out
Error: Got unexpected extra arguments ("s3://bucket", "output":
"s3://bucket/extension",
"sampleID_to_canonical_id_map":
"s3://somefile.csv"}'
Submitting the command as a string
my $cmd = "$cmd0\"$in-dir\", \"output\": \"$out-dir\", \"sampleID_to_canonical_id_map\": \"$map\"}'";
again, stdout and stderr are empty. Exit code 1.
this is something about how run can only accept arrays, and not strings (I'm curious why)
If I copy and paste the command that was given to Perl6's run, it works when given from the shell. It doesn't work when given through perl6. This isn't good, because I have to execute this command hundreds of times.
Perhaps Perl6's shell https://docs.perl6.org/routine/shell would be better? I didn't use that, because the manual suggests that run is safer. I want to capture both stdout and stderr inside a Proc class
EDIT: I've gotten this running with shell but have encountered other problems not related to what I originally posted. I'm not sure if this qualifies as being answered then. I just decided to use backticks with perl5. Yes, backticks are deprecated, but they get the job done.
I'm trying to run a series of shell commands
To run shell commands, call the shell routine. It passes the positional argument you provide it, coerced to a single string, to the shell of the system you're running the P6 program on.
For running commands without involving a shell, call the run routine. The first positional argument is coerced to a string and passed to the operating system as the filename of the program you want run. The remaining arguments are concatenated together with a space in between each argument to form a single string that is passed as a command line to the program being run.
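A rough sketch of the difference (both commands just echo text; nothing here is specific to the question's command):
# shell: one string, handed to the system shell for parsing
shell 'echo hello from the shell';
# run: no shell involved; the first argument names the program, the rest are its arguments
run 'echo', 'hello', 'from', 'run';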
my $cmd0 = 'databricks jobs run-now --job-id 35 --notebook-params ';
That's wrong for both shell and run:
shell only accepts one argument and $cmd0 is incomplete.
The first argument for run is a string interpreted by the OS as the filename of a program to be run and $cmd0 isn't a filename.
So in both cases you'll get either no result or nonsense results.
Your other two experiments are also invalid in their own ways as you discovered.
this is something about how run can only accept arrays, and not strings (I'm curious why)
run can accept a single argument. It would be passed to the OS as the name of the program to be run.
It can accept two arguments. The first would be the program name, the second the command line passed to the program.
It can accept three or more arguments. The first would be the program name, the rest would be concatenated to form the command line passed to the program. (There are cases where this is more convenient coding wise than the two argument form.)
run can also accept a single array. The first element would be the program name and the rest the command line passed to it. (There are cases where this is more convenient.)
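Sketching those forms with a harmless command:
run 'ls';                       # one argument: the program name
run 'ls', '-l';                 # two arguments: program name plus command line
run 'ls', '-l', '/tmp';         # three or more: the rest form the command line
my @cmd = 'ls', '-l', '/tmp';
run @cmd;                       # a single array: first element is the program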
I just decided to use backticks with perl5. Yes, backticks are deprecated, but they get the job done.
Backticks are subject to code injection and shell interpolation attacks and errors. But yes, if they work, they work.
P6 has direct equivalents of most P5 features. This includes backticks. P6 has two variants:
The safer P6 alternative to backticks is qx. The qx quoting construct calls the shell but does not interpolate P6 variables so it has the same sort of level of danger as using shell with a single quoted string.
The qqx variant is the direct equivalent of P5 backticks or using shell with a double quoted string so it suffers from the same security dangers.
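For instance (the $greeting variable is just for illustration):
my $greeting = 'hello';
say qx[echo $greeting];     # no interpolation: the shell sees the literal text $greeting
say qqx[echo $greeting];    # interpolates first: the shell runs "echo hello"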
Two mistakes:
the simplistic split cuts up the last, single parameter into multiple arguments
you are passing $cmd to run, not @cmd
use strict;

my @cmd = ('/tmp/dummy.sh', '--param1', 'param2 with spaces');
my $run = run @cmd, :err, :out;
print(@cmd ~ "\n");
print("EXIT_CODE:\t" ~ $run.exitcode ~ "\n");
print("STDOUT:\t" ~ $run.out.slurp ~ "\n");
print("STDERR:\t" ~ $run.err.slurp ~ "\n");
output:
$ cat /tmp/dummy.sh
#!/bin/bash
echo "prog: '$0'"
echo "arg1: '$1'"
echo "arg2: '$2'"
exit 0
$ perl6 dummy.pl
/tmp/dummy.sh --param1 param2 with spaces
EXIT_CODE: 0
STDOUT: prog: '/tmp/dummy.sh'
arg1: '--param1'
arg2: 'param2 with spaces'
STDERR:
If you can avoid generating $cmd as a single string, I would generate it into @cmd directly, as sketched below. Otherwise you'll have to implement a complex split operation that handles quoting.
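For the command in the question, that could look roughly like this (paths shortened; since the JSON payload is a single element, it survives intact):
my @cmd = 'databricks', 'jobs', 'run-now',
          '--job-id', '35',
          '--notebook-params',
          '{"directory": "s3://bucket", "output": "s3://bucket/extension"}';
my $run = run @cmd, :err, :out;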
Let's say I have this script:
# prog.p6
my $info = run "uname";
When I run prog.p6, I get:
$ perl6 prog.p6
Linux
Is there a way to store a stringified version of the returned value and prevent it from being output to the terminal?
There's already a similar question but it doesn't provide a specific answer.
You need to enable the stdout pipe, which otherwise defaults to $*OUT, by setting :out. So:
my $proc = run("uname", :out);
my $stdout = $proc.out;
say $stdout.slurp;
$stdout.close;
which can be shortened to:
my $proc = run("uname", :out);
say $proc.out.slurp(:close);
If you want to capture output on stderr separately from stdout you can do:
my $proc = run("uname", :out, :err);
say "[stdout] " ~ $proc.out.slurp(:close);
say "[stderr] " ~ $proc.err.slurp(:close);
or if you want to capture stdout and stderr to one pipe, then:
my $proc = run("uname", :merge);
say "[stdout and stderr] " ~ $proc.out.slurp(:close);
Finally, if you don't want to capture the output and don't want it output to the terminal:
my $proc = run("uname", :!out, :!err);
exit( $proc.exitcode );
The solution covered in this answer is concise.
This sometimes outweighs its disadvantages:
Doesn't store the result code. If you need that, use ugexe's solution instead.
Doesn't store output to stderr. If you need that, use ugexe's solution instead.
Potential vulnerability. This is explained below. Consider ugexe's solution instead.
Documentation of the features explained below starts with the quote adverb :exec.
Safest unsafe variant: qx
The safest variant uses a single q:
say qx[ echo 42 ] # 42
If there's an error then the construct returns an empty string and any error message will appear on stderr.
This safest variant is analogous to a single quoted string like 'foo' passed to the shell. Single quoted strings don't interpolate so there's no vulnerability to a code injection attack.
That said, you're passing a single string to the shell which may not be the shell you're expecting so it may not parse the string as you're expecting.
Least safe unsafe variant: qqx
The following line produces the same result as the qx line but uses the least safe variant:
say qqx[ echo 42 ]
This double q variant is analogous to a double quoted string ("foo"). This form of string quoting does interpolate which means it is subject to a code injection attack if you include a variable in the string passed to the shell.
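A sketch of why that matters (the value of $dir stands in for untrusted input):
my $dir = 'foo; rm -rf ~';    # imagine this arrived from user input
# qqx would interpolate $dir before the shell parses the string,
# so the shell would see two commands: "ls foo" and "rm -rf ~".
# say qqx[ ls $dir ];         # do NOT run this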
By default run just passes the STDOUT and STDERR to the parent process's STDOUT and STDERR.
You have to tell it to do something else.
The simplest is to just give it :out to tell it to keep STDOUT. (Short for :out(True))
my $proc = run 'uname', :out;
my $result = $proc.out.slurp(:close);
my $proc = run 'uname', :out;
for $proc.out.lines(:close) {
    .say;
}
You can also effectively tell it to just send STDOUT to /dev/null with :!out. (Short for :out(False))
There are more things you can do with :out:
{
    my $file will leave {.close} = open :w, 'test.out';
    run 'uname', :out($file); # write directly to a file
}
print slurp 'test.out'; # Linux

my $proc = run 'uname', :out;
react {
    whenever $proc.out.Supply {
        .print;
        LAST {
            $proc.out.close;
            done; # in case there are other whenevers
        }
    }
}
If you are going to do that last one, it is probably better to use Proc::Async.
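A minimal Proc::Async sketch of the same idea:
my $proc = Proc::Async.new('uname');
$proc.stdout.tap({ .print });    # handle each chunk of output as it arrives
await $proc.start;               # start returns a Promise; await it to completion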
In TCL, I'm writing a procedure that returns the depth of a clock.
But since I have several clocks I want to name the var: depth_$clk
proc find_depth {} {
    foreach clk $clocks {
        …
        set depth_$clk $max_depth
        echo $depth_$clk
    }
}
But I get:
Error: can't read "depth_": no such variable
Use error_info for more info. (CMD-013)
Your problem is this line:
echo $depth_$clk
The issue is that the syntax for $ only parses a limited set of characters afterwards as being part of the variable name, and $ itself is not in that set. Instead, you can use the set command with one argument; $ is effectively syntactic sugar for that form, but the command lets you use complex substitutions.
echo [set depth_$clk]
HOWEVER!
The real correct thing to do here is to switch to using an associative array. It's a bit larger change to your code, but lets you do more as you've got proper access to substitutions in array element names:
proc find_depth {} {
    foreach clk $clocks {
        …
        set depth($clk) $max_depth
        echo $depth($clk)
    }
}
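Once the depths live in a single array, you can also iterate over all clocks afterwards, for example (using echo as in the question's tool; plain Tcl would use puts):
foreach clk [array names depth] {
    echo "clock $clk has depth $depth($clk)"
}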
echo ${depth}_$cell
Note that this form does something different: it substitutes a variable named depth and then appends a literal _ followed by the value of $cell, so it does not read a variable named depth_<something>.
I have a bunch of aliases that I would like to share with co-workers, and I would like to put them in our project modulefile. Is there a script that would do the conversion for me? Or at least give me a good start, and then I could fix the ones that didn't translate well?
I don't know of any tool that does the translation, but you can use something like this if the aliases are all one-liners:
Firstly, make a Tcl script like this, e.g., called convertalias.tcl:
while {[gets stdin line] >= 0} {
    if {[regexp {^alias (\w+)='(.*)'$} $line -> name def]} {
        puts [list set-alias $name $def]
    } else {
        puts stderr "Rejected line: $line"
    }
}
Then use it in a bash command line like this (where bash$ is the prompt):
bash$ alias | tclsh convertalias.tcl >aliases.def
You'll then have to hack the aliases.def file, but it should give you a start. It will also print out any lines it couldn't grok (after all, it's just a stupid script...)
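As a rough illustration of what to expect (the ll alias is hypothetical):
bash$ alias ll='ls -l'
bash$ alias | tclsh convertalias.tcl
set-alias ll {ls -l}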