How to combine scp and find in a single command? - ssh

I have a Linux server from which I need to download generated zip files to my local machine, using scp and the find command. My filenames follow this syntax:
WSB_20230105_20230106_052320.zip.bz2
I need to download only the files whose names contain "20230105".
I have tried this:
find . /app/weblogic/etsitf1/prestige/outbound/mis -type f -name "_20230105_20230106" -exec scp {} /tmp/tmp_dat_files \
which is not working for me. Please help.
Thank you
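A hedged sketch of one way this could look, assuming the command is run from the local machine and that user@remote_host stands in for the server login (neither is named in the question); /tmp/tmp_dat_files is taken from the attempt above but treated here as a local directory. Note that -name needs wildcards around the date, and find's -exec form would need a terminating \; in any case.
# list matching files on the server, then copy each match to a local directory
ssh user@remote_host 'find /app/weblogic/etsitf1/prestige/outbound/mis -type f -name "*_20230105_*"' \
  | xargs -I {} scp user@remote_host:{} /tmp/tmp_dat_files/
This opens one scp connection per file; the rsync-based answer further down avoids that.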

Related

Fail to download file using SCP

I am trying to download a large number of files from a remote Ubuntu server to my machine, which is also running Ubuntu. I am using the SCP protocol as below:
for i in *; do $i sshpass -p 'Remote_Server_Passcode' scp root@<Remote_Server_IP>:'/opt/Data/' .; done
This is failing with an error message saying command not found.
Any help pointing me in the right direction will be highly appreciated.
Thanks
If I understand correctly you just want to copy the whole /opt/Data directory, which can be achieved like this:
scp -r root@<Remote_Server_IP>:/opt/Data/ .
-r means recursive
As to what was going wrong: for i in *; do $i loops through all the files in the current local directory and then tries to execute each of them as a command, which is probably not what you wanted.
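If only the files directly under /opt/Data are wanted rather than the whole tree, a hedged variant of the same idea is a single scp call with a remote glob (the password and IP placeholders are the question's own):
# quote the remote path so the glob is expanded on the remote side, not by the local shell
sshpass -p 'Remote_Server_Passcode' scp "root@<Remote_Server_IP>:/opt/Data/*" .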

Secure copying files from a remote server to local machine from a list in a text file

I have about a thousand files on a remote server (all in different directories). I would like to scp them to my local machine. I would not want to run the scp command a thousand times in a row, so I have created a text file with a list of the file locations on the remote server. It is a simple text file with a path on each line, like below:
...
/iscsi/archive/aat/2005/20050801/A/RUN0010.FTS
/iscsi/archive/aat/2006/20060201/A/RUN0062.FTS
/iscsi/archive/aat/2013/20130923/B/RUN0010.FTS
/iscsi/archive/aat/2009/20090709/A/RUN1500.FTS
...
I have searched and found someone trying to do something similar, but not quite the same, here. The command I would like to adapt is below:
cat /location/file.txt | xargs -i scp {} user@server:/location
In my case I need something like:
cat fileList.txt | xargs -i scp user@server:{} .
To download files from a remote server using the list in fileList.txt located in the same directory I run this command from.
When I run this I get an error: xargs: illegal option -- i
How can I get this command to work?
Thanks,
Aina.
You get this error xargs: illegal option -- i because -i was deprecated. Use -I {} instead (you could also use a different replace string but {} is fine).
If the list is on the remote server along with the files, you can fetch it on the fly and pipe it to xargs -I {}:
ssh user@server cat fileList.txt | xargs -I {} scp user@server:{} .
But this creates N+1 connections, and more importantly this copies all remote files (scattered in different directories you said) to the same local directory. Probably not what you want.
So, in order to recreate a similar hierarchy locally, let's say everything under /iscsi/archive/aat, you can:
use cut -d/ to extract the part you want to be identical on both sides
use a subshell to create the command that creates the target directory and copies the file there
Thus:
ssh user@server cat fileList.txt \
| cut -d/ -f4- \
| xargs -I {} sh -c 'mkdir -p $(dirname {}); scp user@server:/iscsi/archive/{} ./{}'
This should work, but it's starting to look messy, and you still have N+1 connections, so now rsync looks like a better option. If you have a passwordless ssh connection, this should work:
rsync -a --files-from=<(ssh user@server cat fileList.txt) user@server:/ .
The leading / is stripped by rsync and in the end you'll get everything under ./iscsi/archive/....
You can also copy the files locally first, and then:
rsync -a --files-from=localCopyOfFileList.txt user@server:/ .
You can also manipulate that file to remove, for example, 2 levels:
rsync -a --files-from=localCopyOfFileList2.txt user@server:/iscsi/archive .
etc.
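A hedged sketch of producing that trimmed list, reusing the cut -d/ -f4- idea from above to drop the leading /iscsi/archive/ (two path levels) from each line:
# build the local list once, then let rsync resolve the paths relative to /iscsi/archive
ssh user@server cat fileList.txt | cut -d/ -f4- > localCopyOfFileList2.txt
rsync -a --files-from=localCopyOfFileList2.txt user@server:/iscsi/archive .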

Remote rsync in parallel

I'm trying to run rsync over ssh in parallel to transfer files between two machines, for evaluation purposes. I want to see how much faster I can get compared to a single rsync process.
I tried these two solutions:
https://wiki.ncsa.illinois.edu/display/~wglick/Parallel+Rsync but with no great success.
https://gist.github.com/rcoup/5358786 (I couldn't make it work)
Based on the first link I run a command like this:
ssh HOST "mkdir -p ~/destdir/basefolder"
cd ./basefolder; ls | xargs -n1 -P 4 -I% rsync -arvuz -e ssh % HOST:~/destdir/basefolder/.
and I get the files transferred, but it doesn't seem to work well... In this case, it will run a process for every file and folder in the basefolder, but when it finds a folder, it will transfer everything inside that folder using only one process.
I tried to use find -type f, but I ran into problems because I lose the file hierarchy.
Does anyone know of a method to do what I want? (Use rsync in parallel over ssh while keeping the file and folder hierarchy.)
Since you tagged your question 'gnu-parallel', the obvious answer is to refer you to http://www.gnu.org/software/parallel/man.html#EXAMPLE:-Parallelizing-rsync
cd src-dir; find . -type f -size +100000 | parallel -v ssh fooserver mkdir -p /dest-dir/{//}\;rsync -Havessh {} fooserver:/dest-dir/{}
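If GNU parallel is not available, a hedged alternative in the same spirit, reusing the HOST and basefolder names from the question: xargs -P runs several rsync processes, and rsync's -R (--relative) rebuilds each file's path on the remote side, so the hierarchy survives even though find only emits plain files.
cd ./basefolder
# one rsync job per file, four at a time; -R recreates ./sub/dir/file under the destination
find . -type f | xargs -n1 -P4 -I% rsync -azR -e ssh % HOST:~/destdir/basefolder/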

copy multiple sql files to desktop at terminal linux

scp root@foo.net:/var/www/html/sites/foo.sql /Users/foo/Desktop/folder1
How can I copy multiple sql files over in one command?
If I have foo.sql; foo_1.sql; foo_2.sql
Try doing it this way. The lines below run scp once per file:
scp root@foo.net:/var/www/html/sites/foo.sql /Users/foo/Desktop/folder1
scp root@foo.net:/var/www/html/sites/foo_1.sql /Users/foo/Desktop/folder1
scp root@foo.net:/var/www/html/sites/foo_2.sql /Users/foo/Desktop/folder1
Or execute the following single line (file1, file2, file3 in the command below should be the absolute paths to the files):
for REMOTE in "/Users/foo/Desktop/folder1" ; do scp file1 file2 file3 $REMOTE; done
I hope you find this useful.
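A hedged alternative that fetches all three files in a single scp invocation, using the question's host and paths: the local shell's brace expansion turns the one argument into the three remote sources, so nothing has to be retyped.
# bash expands the braces into three source arguments before scp runs
scp root@foo.net:/var/www/html/sites/{foo.sql,foo_1.sql,foo_2.sql} /Users/foo/Desktop/folder1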

How do I handle spaces in a script that uses the results of find in a for loop?

I am trying to write a simple command that searches through a music directory for all files of a certain type and copies them to a new directory. I would be fine if the files didn't have spaces.
I'm using a script from the following question, but it fails for spaces:
bash script problem, find , mv tilde files created by gedit
Any suggestions on how I can handle spaces? Once I'm satisfied that all the files are being listed, I'll change echo to cp.
for D in `find . -name "*.wma*"`; do echo "${D}"; done
You probably want to do this instead:
find . -name "*.wma" -exec echo "{}" \;
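If a shell loop is still preferred (for example to add more logic per file), a hedged sketch using null-delimited output, which survives spaces and even newlines in file names; the destination directory in the comment is a placeholder:
# read find's output null-delimited so whitespace in names is preserved
find . -name "*.wma" -print0 | while IFS= read -r -d '' f; do
    echo "$f"    # once the listing looks right, replace echo with: cp "$f" /path/to/new/dir
done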