How to get the path of the current temporary working directory in Nextflow

I'm trying to use Nextflow's native scripting language (Groovy) to create a file in the process's temporary work directory, so that Nextflow picks it up as an output.
Here is a minimal example:
#!/usr/bin/env nextflow
nextflow.enable.dsl = 2
process test {
    echo true

    output:
    file('createMe')

    exec:
    path = 'createMe'
    println path
    output = file(path)
    output.append('exemplary file content')
}

workflow {
    test()
}
Simply creating the file in the current directory would work with Python as the scripting language, but here it fails with this message:
Error executing process > 'test'
Caused by:
Missing output file(s) `createMe` expected by process `test`
Source block:
path = 'createMe'
println path
output = file(path)
output.append('exemplary file content')
Work dir:
/home/some_user/some_path/work/89/915376cbedb92fac3e0a9b18536809
Tip: view the complete command output by changing to the process work dir and entering the command `cat .command.out`
I also tried to set the path to workDir + '/createMe', but the actual working directory seems to be a subdirectory of that path.

There was actually an issue (#2628) opened a few days ago regarding this exact behavior. The solution is to use task.workDir to specify the task work directory:
This is caused by the fact that the relative path is always resolved by the
JVM against the main current launching directory.
Therefore the task work directory should be taken using the attribute
task.workDir, e.g.
task.workDir.resolve('test.txt').text = "hello $world"
https://github.com/nextflow-io/nextflow/issues/2628#issuecomment-1034189393
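Applied to the minimal example above, the process could look something like this (a sketch based on that issue comment: the output file is resolved against task.workDir rather than a bare relative path):
process test {
    output:
    file('createMe')

    exec:
    def outFile = task.workDir.resolve('createMe')
    println outFile
    // write the content into the task work directory, where Nextflow expects declared outputs
    outFile.text = 'exemplary file content'
}

workflow {
    test()
}
With the file written into the task work directory, the "Missing output file(s)" error should no longer occur, since that is the directory Nextflow scans for the declared outputs.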

Related

RSelenium makeFirefoxProfile with Windows Task Scheduler

I am navigating a web page in Firefox using the RSelenium package. When I started building my script, I used the makeFirefoxProfile function to create a temporary profile that sets the download directory and the related file type, so the needed file is downloaded into a specific directory.
While doing that I got an error about zip files. After some research I installed Rtools and successfully resolved that error. My script worked as I expected.
Now I want to run that operation periodically on a Windows machine. When I try to use the taskscheduleR package to create a task for the Windows Task Scheduler, I get the same zip error, because Windows doesn't have a built-in command-line zip tool.
You can check the error output below, produced after I tried to run the task:
Error in file(tmpfile, "rb") : cannot open the connection
Calls: makeFirefoxProfile -> file
In addition: Warning messages:
1: In system2(zip, args, input = input, invisible = TRUE) :
'"zip"' not found
2: In file(tmpfile, "rb") :
cannot open file 'C:\Users\user\AppData\Local\Temp\RtmpKCFo30\file1ee834ae3394.zip': No such file or directory
Execution halted
When I run my script within RStudio there is no problem. Thank you for your help.

Perl - open with relative path Apache

We recently inherited a somewhat legacy Perl application and we're working to migrate it to a new server as well as set up a sandbox environment so we can start sorting the application out. The issue we're having is that the code currently uses some relative paths with the open function, and it currently works in their shared hosting environment (which we have little access to):
open(HANDLE,"<../relative/path/to/file.txt")
We pulled all of the code, paths, etc. over and for the most part have the application up and running, until we hit one of the scripts that does the above and opens a file with a relative path. Then it fails.
If we run the code via the command line, the relative paths work. If we modify the path to be the full path, it works both via the command line and through Apache (navigating to the page in the browser).
This makes me think there is some module or configuration option we need to set in Apache to allow the Perl scripts to use open with relative paths?
Relative paths are relative to the process's current work directory, which is not necessarily the same as the directory containing the program. In fact, it's often / for daemons (and thus their children).
For paths relative to the program's location, use
use FindBin qw( $RealBin );
my $qfn = "$RealBin/relative/path/to/file.txt";
open(my $HANDLE, "<", $qfn)
    or die("Can't open \"$qfn\": $!\n");
As mentioned in the comment above, our final solution was to use:
use File::Basename qw( dirname );
chdir(dirname($0));
This allowed us to get the code working while keeping additional code modifications to a minimum.
I was running a script out of cgi-bin and wanted to open a template in htdocs/templates so I did this in my script...
# open the html template
my $filename = sprintf("%s/templates/test.tmplt", $ENV{DOCUMENT_ROOT});
my $template = HTML::Template->new(filename => $filename);
It is bad practice to specify a file's path in open as a fixed string unless the path is predefined and never changes, as for example with /etc/fstab on Linux.
You should change the code to use variable(s) instead.
Define the variables at the top of the script; in future, if you need to change the base or the path, you will know to look for it in the first few lines of the code.
In such a situation, temporarily add something like this to the code:
use strict;
use warnings;
open( my $fh, '>', 'my_uniq_file.txt')
or die 'Couldn\'t open my_uniq_file.txt';
print $fh 'This directory is default base for path location';
close $fh;
Once you run your script from the web server, look for the file my_uniq_file.txt; its location is the web server's default base directory for file locations.
Now adjust the variables with the file paths accordingly.
use strict;
use warnings;
my $dir_project = './project_1/';
my $audio_data = 'audio_data.dat';
my $video_data = 'video_data.dat';
my $descr_data = 'descr_data.dat';
my $qfn = $dir_project . $audio_data;
open(my $fh, '<', $qfn)
or die "Couldn't open $qfn";
while( <$fh> ) {
    chomp;
    # do something with the data
}
close $fh;

Write outputs from a script run in Singularity

I can't get the output of a script run through singularity.
I have a python script, at the end of which the output is saved with:
...
with open('saveOut.pkl', 'wb') as myFile:
    pickle.dump(myTable, myFile)
I want to run this script with Singularity on a distant machine. Since I am learning Singularity, I made a 'sandbox' Debian image (not compiled into a single 'img' file yet) in the directory /tmp/debian; into this image I copied the Python script test.py under /usr/src, and I run it with the command:
sudo singularity exec /tmp/debian python3.5 /usr/src/test.py
The problem:
It works well as long as I only display results. With the pickle example described above, I don't get a saveOut.pkl file anywhere: the file is just not written, and I don't see any error message. I tried writing an explicit path in the Python script, for instance /usr/src/saveOut.pkl, but the result is the same.
How can I write a result?
What was your expected result i.e. in which directory did you expect
to find the output file?
I expect a file saveOut.pkl anywhere, in the container or not; I don't care about the location. Currently I don't get it at all: not in the container's current directory, not in the container's /usr/src/, not on the host, not anywhere.
Did you look for it on the host or in the container?
both, I don't see it anywhere
What's happening here is that your python script is writing the pickle file to its current location (/usr/src/ in the container). Then, since the output from your script is not persistent (due to the sandbox not being writable on execution), it gets deleted at the end of the run.
I believe you could change your script:
with open('/opt/saveOut.pkl', 'wb') as myFile:
    pickle.dump(myTable, myFile)
and then bind the local directory and get the output you're looking for:
sudo singularity exec -B ./:/opt /tmp/debian python3.5 /usr/src/test.py
This worked for me, anyway.

Elm Make starts Windows Script Host and gives error?

I'm playing with Elm, and whenever I use elm make I get an error from Windows Script Host.
The error states that there is an error in the built elm.js file, and when I look at the given line, it's the following:
return {
keys: keys, // A hash of key name to index
free: free, // An array of unkeyed item indices
} // line 10547
So I'm guessing it's complaining about the unneeded trailing comma on line 10546: free: free,.
But now my question is: why is Windows Script Host doing anything with my built elm.js file in the first place, and how can I fix it?
I already tried disabling Windows Script Host, but then I just get an error stating that it doesn't allow scripts to be run.
Since you have a file in that directory called elm.js, the Windows Command Line tries executing that when you type elm make. It thinks you are intending this: elm.js make.
You can get around this in a few ways:
Change the name of the output file from elm.js to something else
Move the generated elm.js file to a subdirectory
Run elm-make from the command line instead of elm make
Use another shell like Powershell, Git Bash, or Cygwin
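For the first option, for example, you can tell the compiler to write the build output under a different name (a sketch assuming a recent Elm CLI and an entry module at src/Main.elm; elm-make accepts the same --output flag):
elm make src/Main.elm --output=app.js
Since the generated file is no longer called elm.js, typing elm make no longer resolves to the JavaScript file sitting in the current directory.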

How to set PATH variable in crontab via whenever gem

Is it possible to set the PATH or SHELL variable in a crontab via the whenever schedule.rb file?
# here I want to set the PATH and SHELL variable somehow
every 3.hours do
  # some cronjob
end
I want this output in my crontab after my capistrano deploy:
SHELL=/bin/bash
PATH=/usr/local/bin:/usr/local/sbin:/sbin:/usr/sbin:/bin:/usr/bin:/usr/bin/X11
# some cronjobs
OK, it seems I have found the solution. I found it here: https://gist.github.com/jjb/950975
I will update this answer once I have tested it.
I have to put this into my schedule.rb:
# If your ruby binary isn't in a standard place (for example if it's in /usr/local/bin,
# because you installed it yourself from source, or from a third-party package like REE),
# this tells whenever (or really, the rails runner) where to find it.
env :PATH, '/bin:/sbin:/usr/bin:/usr/sbin:/usr/local/bin:/usr/local/sbin'
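To also get the SHELL= line from the question into the generated crontab, the same env helper should work for other variables (a sketch of the full schedule.rb under that assumption; whenever writes each env entry as a KEY=value line at the top of the crontab):
# config/schedule.rb
env :SHELL, '/bin/bash'
env :PATH,  '/usr/local/bin:/usr/local/sbin:/sbin:/usr/sbin:/bin:/usr/bin:/usr/bin/X11'

every 3.hours do
  # some cronjob
end
After the next capistrano deploy (or a manual whenever --update-crontab), the crontab should then start with the SHELL and PATH lines shown in the question.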
You are already doing it when running zenity, where you set DISPLAY, LANG, etc.
If you want to set the shell, set it in the first line of /home/username/script/script1.sh using #!/bin/bash.
If you want to set the path, one way to do it is to set it before running the command:
5 9-20 * * * PATH=/usr/local/bin:/usr/local/sbin:/sbin:/usr/sbin:/bin:/usr/bin:/usr/bin/X11 /home/username/script/script1.sh > /dev/null
An alternative (and better) way is to create a simple wrapper script like so:
#!/bin/bash
export PATH=/usr/local/bin:/usr/local/sbin:/sbin:/usr/sbin:/bin:/usr/bin:/usr/bin/X11
# Absolute path to this script
SCRIPT=$(readlink -f "$0")
# Absolute directory this script is in
SCRIPTPATH=$(dirname "$SCRIPT")
# Make sure we are in the same directory as script1.sh - this is useful in case the script assumes it is running from the directory it's in and makes relative directory/file references
cd "$SCRIPTPATH"
# Run the final script, and pass through all parameters that were passed to the wrapper script.
/home/username/script/script1.sh "$@"