E.g., one I could use by adding a shebang to my Pascal files:
#!/usr/bin/env fpi
There's instantfpc in FPC's SVN trunk:
http://svn.freepascal.org/svn/fpc/trunk/utils/instantfpc/
Here's the README:
instantfpc
==========
This tool allows you to execute Pascal programs as Unix scripts.
A Unix script starts with a shebang (#!) followed by the program to execute it. For example:
#!/usr/bin/env instantfpc
begin
  writeln('It works');
end.
If you save the above file as test.pas and set the execute permission
(chmod a+x) you can execute the script simply with
./test.pas
Installation
============
1. Compile instantfpc.lpi using Lazarus or lazbuild, or compile via "fpc instantfpc.lpr"
2. Put the executable "instantfpc" in PATH, for example into
/usr/bin/instantfpc or ~/bin/instantfpc.
That's all.
Now you can execute Pascal programs as scripts.
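Putting the README together, a minimal end-to-end session might look like this (just a sketch; it assumes fpc is already installed and ~/bin is on your PATH):
# build instantfpc from the checked-out sources and put it on PATH
fpc instantfpc.lpr
cp instantfpc ~/bin/

# create a small test script and run it
cat > test.pas <<'EOF'
#!/usr/bin/env instantfpc
begin
  writeln('It works');
end.
EOF
chmod a+x test.pas
./test.pas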
I'd like to use Repast Simphony's batch runner in headless mode with an unrolled parameter file (upf), to avoid an n×n experiment setup. I have managed to create a model jar using the console and the parameter file, but I cannot find a way to actually launch the jar that was created, since the -run option does not appear to be valid when using a custom unrolled parameter file.
Does anyone have some advice on how to proceed there?
Currently, there's no way to use a custom upf and have Simphony chunk it and then distribute and run those chunks on different hosts. You can use the -u / --upf arguments to include your custom upf in the payload and then run that on an HPC system via the Slurm or PBS scheduler. There are instructions for that in the batch-runs getting-started documentation.
Those HPC runs use a script that runs a chunk of the upf file individually. That might be a useful workaround for you here.
sed -n "$begin","$end"p "$paramFile" > localParamFile.txt
mkdir $instanceDir
cd $instanceDir
java -Xmx512m -cp "../lib/*" repast.simphony.batch.InstanceRunner \
    -pxml ../scenario.rs/batch_params.xml \
    -scenario ../scenario.rs \
    -id $instance \
    -pinput localParamFile.txt
The idea here is that sed is used to chunk the upf file, beginning at line $begin and ending at line $end, and write that chunk into localParamFile.txt. Then the InstanceRunner is started; it iterates over each line in that file and performs a model run using each line as input.
You could adapt this: manually chunk your custom file and then run the InstanceRunner yourself, as sketched below.
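For example, a rough sketch of doing that by hand (the chunk size, the upf filename my_custom_params.txt, and the directory layout are assumptions; the InstanceRunner arguments are the same as above):
#!/bin/bash
paramFile=my_custom_params.txt   # your custom unrolled parameter file
chunkSize=100                    # lines (i.e. runs) per instance
total=$(wc -l < "$paramFile")
instance=1

for begin in $(seq 1 $chunkSize "$total"); do
    end=$((begin + chunkSize - 1))
    instanceDir=instance_$instance
    mkdir -p "$instanceDir"
    # cut out lines $begin..$end of the upf for this instance
    sed -n "${begin},${end}p" "$paramFile" > "$instanceDir/localParamFile.txt"
    (
        cd "$instanceDir"
        java -Xmx512m -cp "../lib/*" repast.simphony.batch.InstanceRunner \
            -pxml ../scenario.rs/batch_params.xml \
            -scenario ../scenario.rs \
            -id $instance \
            -pinput localParamFile.txt
    )
    instance=$((instance + 1))
done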
I'm using Singularity to run Python in an environment that has no Python installed. I'm also running a MySQL instance as explained by Iowa State University (start an instance of MySQL, and close it when done).
For clarity: I'm using a bash script that starts MySQL, does what I have to do (runs a Python script), and then closes MySQL, and that works fine. But Python's only way to stop when an error occurs is sys.exit([value]), and that not only stops the Python script but also the bash script that ran it. This makes it impossible for me to handle the error and close the MySQL instance if the Python script exits.
My question is: is there a way for me to execute 'singularity instance stop mysql' from inside the Python sandbox? Something to tell Singularity "hey, this command must be run on the host!"?
I keep searching but can't find anything.
I tried to execute it with subprocess like any other command, but it returned an error because that instance doesn't exist inside the Python sandbox; I don't even have Singularity in the sandbox.
If anything needs clarifying, just ask; I'm trying to be clear, but I'm pretty sure it isn't.
Thanks a lot!
Generally speaking, it would be a big security issue if a process could be initiated from inside a container (Docker or Singularity) but run in the host OS's namespace.
If the bash script is exiting on the Python failure, it sounds like you're using set -e or #!/bin/bash -e. That causes the script to abort if any command returns non-zero. It's commonly recommended for safer processing, but it can cause problems like this at times. To bypass it for just the Python step, you can modify your script:
# start mysql, do some stuff
set +e  # disable abort on non-zero return
python my_script.py
set -e  # re-enable abort on non-zero
# shut down mysql, do other stuff
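If you also want to react to the failure instead of just ignoring it, here is a minimal sketch of such a wrapper, assuming the instance is named mysql and that the image names (mysql.simg, python.simg) and my_script.py are placeholders for your own:
#!/bin/bash
set -e

# start the MySQL instance on the host (image name is a placeholder)
singularity instance start mysql.simg mysql

set +e                                   # don't abort if the Python run fails
singularity exec python.simg python3 my_script.py
status=$?                                # remember Python's exit code
set -e

# always stop the instance on the host, whatever Python did
singularity instance stop mysql

exit $status                             # propagate the Python result to the caller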
I'm having a small problem where maybe you can help me out. On our new cluster we use Lmod as the environment module system.
When creating a Tcl modulefile for OpenFOAM, a system-dependent bashrc file needs to be sourced.
This is the Tcl script I am using on another module system, where it works fine. I am not able to execute the "source" command in Lmod; what am I missing here?
#%Module1.0#####################################################################
##
## modules software/openfoam_v1812
##
## /opt/software/openfoam/openfoamv1812/OpenFOAM-v1812
proc ModulesHelp { } {
    global version modroot
    puts stderr "software/OpenFOAM-v1812 - sets the Environment for OpenFOAM-v1812 (openfoam.com)"
}
module-whatis "Sets the environment for using OpenFOAM-v1812"
# for Tcl script use only
set VERSION v1812
set OpenFOAM_PATH /opt/software/openfoam/openfoam${VERSION}/OpenFOAM-${VERSION}
set FOAM_INST_DIR /opt/software/openfoam/openfoam${VERSION}
puts stdout "source /opt/software/openfoam/openfoam${VERSION}/OpenFOAM-${VERSION}/etc/bashrc;"
I am not an expert, but I recently came across a similar problem, in my case activating Anaconda Python in a module. The solution was to use the 'execute' command in Lmod:
https://lmod.readthedocs.io/en/latest/050_lua_modulefiles.html
which has the documentation:
execute {cmd="<any command>", modeA={"load"}}
Run any command with a certain mode. For example, execute {cmd="ulimit -s unlimited", modeA={"load"}} will run the command ulimit -s unlimited as the last thing that loading the module will do.
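For the OpenFOAM case above, a Lua modulefile using execute might look roughly like this (just a sketch: the filename software/openfoam_v1812.lua is an assumption, and the paths are taken from your Tcl script):
-- software/openfoam_v1812.lua (hypothetical Lua modulefile)
help("software/OpenFOAM-v1812 - sets the Environment for OpenFOAM-v1812 (openfoam.com)")
whatis("Sets the environment for using OpenFOAM-v1812")

local version = "v1812"
local foam_inst_dir = "/opt/software/openfoam/openfoam" .. version

-- run the vendor bashrc in the user's shell when the module is loaded
execute{cmd="source " .. foam_inst_dir .. "/OpenFOAM-" .. version .. "/etc/bashrc", modeA={"load"}}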
Hope this helps
I can't get the output of a script run through Singularity.
I have a Python script, at the end of which the output is saved with:
...
with open('saveOut.pkl', 'wb') as myFile:
    pickle.dump(myTable, myFile)
I want to run this script with Singularity on a distant machine. Since I am learning Singularity, I made a 'sandbox' Debian image (not compiled into a single 'img' file yet) in the directory /tmp/debian; into this image I copied the Python script test.py under /usr/src, and I run it with the command:
sudo singularity exec /tmp/debian python3.5 /usr/src/test.py
The problem:
It works well as long as I only display results. With the pickle example described above, I don't get a saveOut.pkl file anywhere: the file is just not written, but I don't see any error message either. I tried writing an explicit path in the Python script, for instance /usr/src/saveOut.pkl, but the result is the same.
How can I write a result?
What was your expected result, i.e. in which directory did you expect to find the output file?
I expect a file saveOut.pkl somewhere, in the container or not; I don't care about the location. Currently I don't get it at all: not in the container's current directory, not in the container's /usr/src/, not on the host, not anywhere.
Did you look for it on the host or in the container?
both, I don't see it anywhere
What's happening here is that your Python script is writing the pickle file to its current location (/usr/src/ inside the container). Then, since the output from your script is not persistent (because the sandbox is not writable during execution), it gets discarded at the end of the run.
I believe you could change your script:
with open('/opt/saveOut.pkl', 'wb') as myFile:
    pickle.dump(myTable, myFile)
and then bind the local directory and get the output you're looking for:
sudo singularity exec -B ./:/opt /tmp/debian python3.5 /usr/src/test.py
This worked for me, anyway.
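As a quick check (assuming the bind of ./ to /opt from the command above), the file should then appear in the current host directory, and you could read it back through the same container with something like:
ls -l saveOut.pkl
sudo singularity exec -B ./:/opt /tmp/debian python3.5 -c "import pickle; print(pickle.load(open('/opt/saveOut.pkl', 'rb')))"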
I am using a PC where running Perl scripts is not allowed. Is there any tool to convert a Perl script to a VBA macro?
Or is there any link where we can find the VBA equivalents of Perl statements?
Assuming you've got access to a machine that can run Perl, you could try using the PAR Packer utility (pp).
% pp -o hello hello.pl
# Pack 'hello.pl' into executable 'hello'
# (or 'hello.exe' on Win32)
If you have enough permissions to run VB scripts, you should also have permissions to install the Perl interpreter and just run your script natively.