Count the number of files in a directory, then scp a certain range, such as 21404-42806

I found the number of files in /dev/shm/split/1/ to be 42806 using:
/bin/ls -lU /dev/shm/split/1/ | wc -l
What I can't seem to find anywhere online is how to select a certain range, say from 21404-42806, and use scp to securely copy those files. Then, for management purposes, I would like to move the files I copied to another folder, say /dev/shm/split/2/.
How do I do that using CentOS?
I tried:
sudo chmod 400 ~/emails/name.pem ; ls -1 /dev/shm/split/1/ | sed -n '21443,42806p' | xargs -i scp -i ~/emails/name.pem {} root@ipaddress:/dev/shm/split/2/
This produced:
no such file or directory
errors on all of the files...

ls prints bare filenames relative to the directory you give it. This means your ls produces the names of the files in that directory, but scp, run later, has no path to them. You can fix this in two ways:
Give the path to scp:
ls -1 /dev/shm/split/1/ | sed -n '21443,42806p' | xargs -i \
scp -i ~/emails/name.pem /dev/shm/split/1/{} root@ipaddress:/dev/shm/split/2/
Or change to that directory first, so the bare names resolve:
cd /dev/shm/split/1/; ls -1 | sed -n '21443,42806p' | xargs -i \
scp -i ~/emails/name.pem {} root@ipaddress:/dev/shm/split/2/
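For the second part of the question (moving the copied range into another local folder), the same range selection can feed mv instead of scp. A minimal sketch in the same -i style as above, assuming the listing of /dev/shm/split/1/ hasn't changed since the copy and the filenames contain no spaces:
cd /dev/shm/split/1/; ls -1 | sed -n '21443,42806p' | xargs -i \
mv {} /dev/shm/split/2/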

Related

SSH command for find and replace

I want to change the username and password in all my config files.
What would the command, run over SSH, be to find
$txpcfg['user'] = 'EXAMPLE-1';
And change it across my server to:
$txpcfg['user'] = 'EXAMPLE-2';
sed can be used with the -i flag to do in-place replacement.
Something like:
find /path/to/workingDir -name '*.config' -type f -print0 | xargs -0 sed -i "s/\$txpcfg\['user'\] = '$oldUser';/\$txpcfg['user'] = '$newUser';/g"
That regex / search pattern can probably be cleaned up.
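Before editing in place, it may be worth previewing which files actually match. A sketch assuming GNU grep (for --include) and the same path, extension, and $oldUser variable as above:
grep -rlF --include='*.config' "\$txpcfg['user'] = '$oldUser';" /path/to/workingDir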

Secure copying files from a remote server to local machine from a list in a text file

I have about a thousand files on a remote server (all in different directories). I would like to scp them to my local machine. I do not want to run the scp command a thousand times in a row, so I have created a text file with a list of the file locations on the remote server. It is a simple text file with a path on each line, like below:
...
/iscsi/archive/aat/2005/20050801/A/RUN0010.FTS
/iscsi/archive/aat/2006/20060201/A/RUN0062.FTS
/iscsi/archive/aat/2013/20130923/B/RUN0010.FTS
/iscsi/archive/aat/2009/20090709/A/RUN1500.FTS
...
I have searched and found someone trying to do something similar, but not quite the same. The command I would like to edit is below:
cat /location/file.txt | xargs -i scp {} user@server:/location
In my case I need something like:
cat fileList.txt | xargs -i scp user@server:{} .
to download the files from the remote server, using the list in fileList.txt located in the directory I run the command from.
When I run this I get an error: xargs: illegal option -- i
How can I get this command to work?
You get the error xargs: illegal option -- i because -i is deprecated in GNU xargs and absent from BSD/macOS xargs (which is what prints this message). Use -I {} instead (you could use a different replace string, but {} is fine).
Since both the list and the files are remote, you can fetch the list over ssh and pipe it to xargs -I {}:
ssh user@server cat fileList.txt | xargs -I {} scp user@server:{} .
But this creates N+1 connections (one for the list, one per file), and more importantly it copies all the remote files (scattered in different directories, as you said) into the same local directory. Probably not what you want.
So, in order to recreate a similar hierarchy locally, let's say everything under /iscsi/archive/aat, you can:
use cut -d/ to extract the part you want to be identical on both sides
use a subshell to create the command that creates the target directory and copies the file there
Thus:
ssh user@server cat fileList.txt \
| cut -d/ -f4- \
| xargs -I {} sh -c 'mkdir -p $(dirname {}); scp user@server:/iscsi/archive/{} ./{}'
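(Note that {} is substituted textually into the sh -c string, so this sketch assumes the listed paths contain no spaces or shell metacharacters.)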
Should work, but that's starting to look messy, and you still have N+1 connections, so now rsync looks like a better option. If you have a passwordless ssh connection, this should work:
rsync -a --files-from=<(ssh user@server cat fileList.txt) user@server:/ .
The leading / is stripped by rsync and in the end you'll get everything under ./iscsi/archive/....
You can also copy the file list locally first, and then:
rsync -a --files-from=localCopyOfFileList.txt user@server:/ .
You can also manipulate that file to remove, for example, the first 2 levels:
rsync -a --files-from=localCopyOfFileList2.txt user@server:/iscsi/archive .
etc.
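For that last variant, the trimmed list can be produced with the same cut trick used earlier. A sketch, assuming the file names above:
cut -d/ -f4- localCopyOfFileList.txt > localCopyOfFileList2.txt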

terminal mkdir with variable and subfolders

I have a text file "modules.txt" containing (individual module names):
dashboard
editor
images
inspector
loader
navigation
sharing
tags
toolbar
I want to create a folder structure for each module like:
dashboard/templates
editor/templates
flash/templates
images/templates
etc ...
I'm fiddling around in the area of:
cat modules.txt | xargs mkdir -p $1/templates
But this creates only the first level of folders, ignoring the /templates part and giving an error:
mkdir: /templates: File exists
Which it does not.
I've tried all sorts of combinations of:
cat modules.txt | xargs mkdir -p $1/{templates}
cat modules.txt | xargs mkdir -p $1{templates}
cat modules.txt | xargs mkdir -p $1/{templates}
cat modules.txt | xargs mkdir -p $1\/{templates}
(yes, pretty much guessing here)
I've also tried to add the /templates to each line in the text file, but that makes the whole thing crash.
Any ideas how to go about doing this?
Turns out, this did work when adding /templates to each line of the text file. Must have been too tired (or stupid) when fiddling with this.
cat file_containing_folder_structure.txt | xargs mkdir -p
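For the record, the $1 attempts failed because $1 is a positional parameter of the shell, which expands to nothing here, while xargs appends the module names after /templates, so only the top-level folders got created. An alternative sketch that leaves modules.txt untouched, using the replacement-string form of xargs:
xargs -I {} mkdir -p {}/templates < modules.txt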

How to locate code in PHP inside a directory and edit it

I've been having problems with multiple hidden infected PHP files on my server, which are encrypted so ClamAV can't see them.
I would like to know how to run an SSH command that finds all the infected files and edits them.
Up until now I have located them by the file contents like this:
find /home/***/public_html/ -exec grep -l '$tnawdjmoxr' {} \;
Note: $tnawdjmoxr is a piece of the code
How do you locate and remove this code inside all PHP files in the directory /public_html/?
You can add xargs and sed:
find /home/***/public_html/ -exec grep -l '$tnawdjmoxr' {} \; | xargs -d '\n' -n 100 sed -i 's|\$tnawdjmoxr||g' --
You could also run sed directly instead of grep first, but sed -i rewrites every file it touches, so it can alter modification times and make unexpected changes (line endings, for example).
-d '\n' makes sure that every argument is read line by line. It's helpful if filenames have spaces in them.
-n 100 limits the number of files that sed would process in one instance.
-- makes sed recognize filenames starting with a dash. It's advisable to give it to grep as well: grep -l -e '$tnawdjmoxr' -- {} \;
File searching may be faster with grep -F.
sed -i enables in-place editing.
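Putting those flags together, a sketch that restricts the search to PHP files as the question asks (keeping the question's redacted path):
find /home/***/public_html/ -type f -name '*.php' -exec grep -lF -e '$tnawdjmoxr' -- {} + | xargs -d '\n' -n 100 sed -i 's|\$tnawdjmoxr||g' --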
Besides using xargs it would also be possible to use Bash:
find /home/***/public_html/ -exec grep -l '$tnawdjmoxr' {} \; | while IFS= read -r FILE; do sed -i 's|\$tnawdjmoxr||g' -- "$FILE"; done
while IFS= read -r FILE; do sed -i 's|\$tnawdjmoxr||g' -- "$FILE"; done < <(exec find /home/***/public_html/ -exec grep -l '$tnawdjmoxr' {} \;)
readarray -t FILES < <(exec find /home/***/public_html/ -exec grep -l '$tnawdjmoxr' {} \;)
sed -i 's|\$tnawdjmoxr||g' -- "${FILES[@]}"

Remote rsync in parallel

I'm trying to run rsync over ssh in parallel to transfer files between two machines, for evaluation purposes. I want to see how much faster I can get compared to a single rsync process.
I tried these two solutions:
https://wiki.ncsa.illinois.edu/display/~wglick/Parallel+Rsync but with no great success.
https://gist.github.com/rcoup/5358786 (I couldn't make it work)
Based on the first link I run a command like this:
ssh HOST "mkdir -p ~/destdir/basefolder"
cd ./basefolder; ls | xargs -n1 -P 4 -I% rsync -arvuz -e ssh % HOST:~/destdir/basefolder/.
and I get the files transferred, but it doesn't seem to work well. It runs one process for every file and folder at the top level of basefolder, but when it encounters a folder, everything inside that folder is transferred by a single process.
I tried to use find -type f, but I ran into problems because I lose the file hierarchy.
Does anyone know a method to do what I want (use rsync in parallel over ssh while keeping the file and folder hierarchy)?
Since you tagged your question 'gnu-parallel', the obvious answer is to refer you to http://www.gnu.org/software/parallel/man.html#EXAMPLE:-Parallelizing-rsync
cd src-dir; find . -type f -size +100000 | parallel -v ssh fooserver mkdir -p /dest-dir/{//}\;rsync -Havessh {} fooserver:/dest-dir/{}
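For reference: {} is each input line (a file path) and {//} is parallel's replacement string for its directory part, so the remote directory is created before rsync copies each file into it; -Havessh is the bundled short options -H -a -v -e ssh. Since this opens one ssh connection per file, it pays off mainly for large files, which is what -size +100000 selects (units of 512-byte blocks, so roughly files over 50 MB).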