I have a text file "modules.txt" containing (individual module names):
dashboard
editor
images
inspector
loader
navigation
sharing
tags
toolbar
I want to create a folder structure for each module like:
dashboard/templates
editor/templates
flash/templates
images/templates
etc ...
I'm fiddling around in the area of:
cat modules.txt | xargs mkdir -p $1/templates
But this only creates the first level of folders, ignores the /templates part, and gives an error:
mkdir: /templates: File exists
Which it does not.
I've tried all sort of combinations of:
cat modules.txt | xargs mkdir -p $1/{templates}
cat modules.txt | xargs mkdir -p $1{templates}
cat modules.txt | xargs mkdir -p $1/{templates}
cat modules.txt | xargs mkdir -p $1\/{templates}
(yes, pretty much guessing here)
I've also tried to add the /templates to each line in the text file, but that makes the whole thing crash.
Any ideas how to go about doing this?
Turns out, this did work when adding /templates to each line of the text file. Must have been too tired (or stupid) when fiddling with this.
cat file_containing_folder_structure.txt | xargs mkdir -p
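For reference, the same result can also be had without editing modules.txt; a small sketch using sed, or an xargs replacement string (assuming the module names contain no spaces):
# append /templates to each line on the fly, then create both levels at once
sed 's|$|/templates|' modules.txt | xargs mkdir -p
# or run one mkdir per module via a replacement string
xargs -I {} mkdir -p {}/templates < modules.txt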
I have about a thousand files on a remote server (all in different directories) that I would like to scp to my local machine. I do not want to run the scp command a thousand times in a row, so I have created a text file with a list of the file locations on the remote server. It is a simple text file with one path per line, like below:
...
/iscsi/archive/aat/2005/20050801/A/RUN0010.FTS
/iscsi/archive/aat/2006/20060201/A/RUN0062.FTS
/iscsi/archive/aat/2013/20130923/B/RUN0010.FTS
/iscsi/archive/aat/2009/20090709/A/RUN1500.FTS
...
I have searched and found someone trying to do something similar, but not quite the same, here. The command I would like to adapt is below:
cat /location/file.txt | xargs -i scp {} user@server:/location
In my case I need something like:
cat fileList.txt | xargs -i scp user@server:{} .
to download the listed files from the remote server; fileList.txt is located in the same directory I run this command from.
When I run this I get an error: xargs: illegal option -- i
How can I get this command to work?
Thanks,
Aina.
You get the error xargs: illegal option -- i because -i is deprecated in GNU xargs and not supported at all by some other implementations (e.g. BSD xargs). Use -I {} instead (you could also use a different replace string, but {} is fine).
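Applied to the local fileList.txt from the question, that becomes:
cat fileList.txt | xargs -I {} scp user@server:{} .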
If the list itself is on the remote server, like the files, you can retrieve it and feed it straight to xargs -I {}:
ssh user@server cat fileList.txt | xargs -I {} scp user@server:{} .
But this creates N+1 connections, and more importantly this copies all remote files (scattered in different directories you said) to the same local directory. Probably not what you want.
So, in order to recreate a similar hierarchy locally, let's say everything under /iscsi/archive/aat, you can:
use cut -d/ to extract the part you want to be identical on both sides
use a subshell to create the command that creates the target directory and copies the file there
Thus:
ssh user@server cat fileList.txt \
| cut -d/ -f4- \
| xargs -I {} sh -c 'mkdir -p $(dirname {}); scp user@server:/iscsi/archive/{} ./{}'
This should work, but it's starting to look messy, and you still have N+1 connections, so rsync looks like a better option. If you have a passwordless ssh connection, this should work:
rsync -a --files-from=<(ssh user@server cat fileList.txt) user@server:/ .
The leading / is stripped by rsync and in the end you'll get everything under ./iscsi/archive/....
You can also copy the files locally first, and then:
rsync -a --files-from=localCopyOfFileList.txt user@server:/ .
You can also manipulate that file to remove for example 2 levels:
rsync -a --files-from=localCopyOfFileList2.txt user@server:/iscsi/archive .
etc.
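For example, the two-level strip mentioned above could be produced with the same cut trick used earlier (a sketch; adjust the field number to however many leading components you want to drop):
# drop the leading /iscsi/archive so paths become aat/2005/20050801/... etc.
cut -d/ -f4- localCopyOfFileList.txt > localCopyOfFileList2.txt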
Using zsh 5.2 on Fedora 24 workstation.
I want to be able to do the following programmatically:
move an image file (can have jpg/ jpeg/ png/ JPG/ PNG extensions)
from /tmp/folder1 to ~/Pictures
This file will always have the same few initial characters, e.g. prefix111.jpg or prefix222.png.
rename the file such that samefilename.JPG becomes 20161013.jpg
20161013 is today's date in yyyymmdd format
Note that the extension becomes small letters
And JPEG or jpeg becomes jpg
change the permissions of the moved file to 644
All at one go.
If there are multiple prefix* files, the command should just fail silently.
I would initially like to do this at the command prompt, with the option to add a cron job later. That is, will the same zsh command/script work in cron?
I am sure this is doable. However, with my limited shell knowledge, I could only come up with:
mv /tmp/folder1/prefix-*.JPG ~/Pictures/$(date +'%Y%m%d').jpg
Problems with my approach are many. It does not handle capitalization, does not take care of different extensions and does not address the permission issue.
How about this:
#!/bin/sh
# Candidate source files, covering the allowed extensions
FILES="/tmp/folder1/prefix*.jpg /tmp/folder1/prefix*.jpeg /tmp/folder1/prefix*.png /tmp/folder1/prefix*.JPG /tmp/folder1/prefix*.PNG"
# Expand the patterns; unmatched patterns are discarded by ls
SRC=$(ls $FILES 2>/dev/null)
# Fail silently unless exactly one file matches
if [ $(echo "$SRC" | wc -w) -ne 1 ]; then
    exit 1
fi
# Pick the lowercase target extension: png/PNG stays png, everything else becomes jpg
if echo "$SRC" | grep -qi '\.png$'; then
    SUFF=png
else
    SUFF=jpg
fi
DEST=$HOME/Pictures/$(date +'%Y%m%d').$SUFF
mv "$SRC" "$DEST"
chmod 644 "$DEST"
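As for cron: the script above works from cron as long as it is called by its full path. A sketch of a crontab entry, assuming it is saved as ~/bin/movepic.sh and marked executable (the name and schedule are placeholders):
# run every night at 23:30; cron hands the line to /bin/sh, which expands $HOME
30 23 * * * $HOME/bin/movepic.sh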
My problem is that I have a cluster server with Torque PBS and want to use it to run a sequence comparison with the program RapSearch.
The normal RapSearch command is:
./rapsearch -q protein.fasta -d database -o output -e 0.001 -v 10 -x t -z 32
Now I want to run it with 2 nodes on the cluster-server.
I've tried with: echo "./rapsearch -q protein.fasta -d database -o output -e 0.001 -v 10 -x t -z 32" | qsub -l nodes=2 but nothing happened.
Do you have any suggestions? Where am I wrong? Help please.
Standard output (and error output) files are placed in your home directory by default; take a look there. You are looking for a file named STDIN.e[numbers]; it will contain the error message.
However, I see that you're using ./rapsearch but are not really being explicit about what directory you're in. Your problem is therefore probably a matter of changing directory into the directory that you submitted from. When your terminal is in the directory of the rapsearch executable, try echo "cd \$PBS_O_WORKDIR && ./rapsearch [arguments]" | qsub [arguments] to submit your job to the cluster.
Other tips:
You could add rapsearch to your path if you use it often. Then you can use it like a regular command anywhere. It's a matter of adding the line export PATH=/full/path/to/rapsearch/bin:$PATH to your .bashrc file.
Create a submission script for use with qsub. Here is a good example.
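For illustration, a minimal sketch of such a submission script (the job name, resource requests and walltime are assumptions; adapt them to your cluster, save as e.g. rapsearch.pbs and submit with qsub rapsearch.pbs):
#!/bin/bash
#PBS -N rapsearch            # job name
#PBS -l nodes=2:ppn=32       # 2 nodes, 32 cores each (assumption; match your hardware and -z)
#PBS -l walltime=24:00:00    # wall-clock limit (assumption)
#PBS -j oe                   # merge stdout and stderr into a single output file
cd "$PBS_O_WORKDIR"          # run from the directory the job was submitted from
./rapsearch -q protein.fasta -d database -o output -e 0.001 -v 10 -x t -z 32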
I found the number of files in /dev/shm/split/1/ to be 42806 using:
/bin/ls -lU /dev/shm/split/1/ | wc -l
What I can't seem to find anywhere online is how to select a certain range, say from 21404-42806, and use scp to securely copy those files. Then, for management purposes, I would like to move the files I copied to another folder, say /dev/shm/split/2/.
How do I do that using CentOS?
I tried:
sudo chmod 400 ~/emails/name.pem ; ls -1 /dev/shm/split/1/ | sed -n '21443,42806p' | xargs -i scp -i ~/emails/name.pem {} root@ipaddress:/dev/shm/split/2/
This produced:
no such file or directory
errors on all of the files...
ls lists filenames relative to the directory you give it. This means your ls prints bare filenames, but later on scp doesn't have the path to them. You can fix this in two ways:
Give the path to scp:
ls -1 /dev/shm/split/1/ | sed -n '21443,42806p' | xargs -i \
scp -i ~/emails/name.pem /dev/shm/split/1/{} root@ipaddress:/dev/shm/split/2/
Change to that directory and it will work:
cd /dev/shm/split/1/; ls -1 | sed -n '21443,42806p' | xargs -i \
scp -i ~/emails/name.pem {} root@ipaddress:/dev/shm/split/2/
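For the second part of the question, moving the copied range into another local folder afterwards, the same selection can be reused with mv (a sketch; the range only refers to the same files as long as the directory contents have not changed in between):
mkdir -p /dev/shm/split/2/
cd /dev/shm/split/1/ && ls -1 | sed -n '21443,42806p' | xargs -I {} mv {} /dev/shm/split/2/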
I'm trying to run rsync over ssh in parallel to transfer files between two machines, for evaluation purposes. I want to see how much faster I can get compared to a single rsync process.
I tried these two solutions:
https://wiki.ncsa.illinois.edu/display/~wglick/Parallel+Rsync but with no great success.
https://gist.github.com/rcoup/5358786 (I couldn't make it work)
Based on the first link I run a command like this:
ssh HOST "mkdir -p ~/destdir/basefolder"
cd ./basefolder; ls | xargs -n1 -P 4 -I% rsync -arvuz -e ssh % HOST:~/destdir/basefolder/.
and I get the files transferred, but it doesn't seem to work well. It runs one process for every file and folder in basefolder, but when it finds a folder, it transfers everything inside that folder using only one process.
I tried to use find -type f, but I ran into problems because I lose the file hierarchy.
Does anyone know of a method to do what I want (use rsync in parallel over ssh while keeping the file and folder hierarchy)?
Since you tagged your question 'gnu-parallel', the obvious answer is to refer you to http://www.gnu.org/software/parallel/man.html#EXAMPLE:-Parallelizing-rsync
cd src-dir; find . -type f -size +100000 | parallel -v ssh fooserver mkdir -p /dest-dir/{//}\;rsync -Havessh {} fooserver:/dest-dir/{}
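If GNU parallel is not available, a rough xargs-based sketch in the spirit of the original command (HOST and the directory names are taken from the question); rsync's -R (--relative) recreates each file's path on the remote side, which solves the lost-hierarchy problem with find -type f:
cd ./basefolder
# send files in batches of 50 across 4 parallel rsync processes
find . -type f -print0 | xargs -0 -n 50 -P 4 sh -c 'rsync -aRvuz -e ssh "$@" HOST:~/destdir/basefolder/' sh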