scp root@foo.net:/var/www/html/sites/foo.sql /Users/foo/Desktop/folder1
How can I copy multiple SQL files over in one command?
For example, if I have foo.sql, foo_1.sql, and foo_2.sql.
Try doing it this way. Instead of running scp once per file:
scp root@foo.net:/var/www/html/sites/foo.sql /Users/foo/Desktop/folder1
scp root@foo.net:/var/www/html/sites/foo_1.sql /Users/foo/Desktop/folder1
scp root@foo.net:/var/www/html/sites/foo_2.sql /Users/foo/Desktop/folder1
you can execute the following single line (the destination should be the absolute path to your local folder):
for FILE in foo.sql foo_1.sql foo_2.sql; do scp root@foo.net:/var/www/html/sites/$FILE /Users/foo/Desktop/folder1; done
I hope you find this useful.
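If your scp supports it, you can also fetch all three files over a single connection by quoting a brace pattern so it is expanded on the remote side. A sketch, assuming classic OpenSSH scp, which passes the quoted pattern to the remote shell:

```shell
# One connection: the quotes stop the local shell from expanding the braces,
# so the remote shell expands foo{,_1,_2}.sql into the three file names.
scp "root@foo.net:/var/www/html/sites/foo{,_1,_2}.sql" /Users/foo/Desktop/folder1
```

This avoids setting up a new SSH connection (and password prompt) per file.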
Assuming there are folders and files like below
/master1/dir1/dir1_1/file1
/master1/dir1/dir1_1/file2
/master1/dir1/dir1_2/file1
/master1/dir1/dir1_2/file2
/master1/dir2/dir2_1/file1
/master1/dir2/dir2_1/file2
/master1/dir2/dir2_2/file1
/master1/dir2/dir2_2/file2
I have a terminal server from which passwordless authentication is enabled to all the servers in my landscape, and I would like to transfer files from one server to another.
When I run scp with the -3 option on the terminal server, I am able to copy the files from one server to another. Now I need to exclude certain files: anything named file1, or anything under the directory dir1_1.
I am using the command below inside a Perl script to copy all the files:
scp -o StrictHostKeyChecking=no -3 -v -r $srcHostname:$srcPath/* $tgtHostname:$tgtPath
I was unable to find an rsync option similar to scp's -3 (routing the transfer through a third host), so I need to use scp.
How do I exclude a file pattern with scp?
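scp itself has no exclude option, so one possible workaround (a sketch, not tested against your setup; it reuses the $srcHostname, $srcPath, $tgtHostname, $tgtPath values from the Perl script) is to build the file list on the source host, filter it, and run scp -3 once per remaining file:

```shell
# List remote files, drop anything named file1 or anything under dir1_1,
# then copy each surviving file through the terminal server with scp -3.
ssh "$srcHostname" "find $srcPath -type f" \
  | grep -v -e '/file1$' -e '/dir1_1/' \
  | while read -r f; do
      scp -o StrictHostKeyChecking=no -3 "$srcHostname:$f" "$tgtHostname:$tgtPath/"
    done
```

Note this opens a connection pair per file, so it is slower than a single recursive scp.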
I use this single-line command to copy a folder named 'myFolder1' from one remote server to another. It works fine.
I run this command in the terminal of 'myserver2'. This is the destination server, i.e. the server the folder will be copied to.
scp -r myserver@190.93.133.6:/home/myserver/www/wp-content/plugins/myFolder1 .
If I need to copy two folders (instead of one), I need to run my command twice (once for each folder), like this:
scp -r myserver@190.93.133.6:/home/myserver/www/wp-content/plugins/myFolder1 .
scp -r myserver@190.93.133.6:/home/myserver/www/wp-content/plugins/myFolder2 .
My question: is there a way to join these two commands into a single command line?
Yes, there is. Just use the wildcard character * and quotes (").
Here is an example:
scp -r "myserver@190.93.133.6:/home/myserver/www/wp-content/plugins/myFolder*" .
But you can also be more precise using other wildcard characters:
scp -r "myserver@190.93.133.6:/home/myserver/www/wp-content/plugins/myFolder{1,2}" .
💡 Note the quotes used to wrap the path and the wildcard.
The simplest solution is:
scp -r myserver@190.93.133.6:/home/myserver/www/wp-content/plugins/myFolder{1,2} .
An asterisk definitely works here, but it matches more than just 1 and 2 and may therefore copy unwanted folders. Note that in this unquoted form the braces {a,b,c,d} are expanded by your local shell into separate remote paths, which only helps for remote sources. If you want to copy from the local machine to a remote server, just list the folders:
scp -r myFolder1 myFolder2 user@host:/path/
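To see the difference between the quoted and unquoted forms, you can echo what the local shell would actually hand to scp (host and paths here are placeholders):

```shell
# Quoted: the braces survive locally and are expanded by the remote shell.
bash -c 'echo scp -r "host:plugins/myFolder{1,2}" .'
# prints: scp -r host:plugins/myFolder{1,2} .

# Unquoted: the local shell expands the braces into two remote file specs.
bash -c 'echo scp -r host:plugins/myFolder{1,2} .'
# prints: scp -r host:plugins/myFolder1 host:plugins/myFolder2 .
```

Both forms end up copying the same folders; they just differ in which shell does the expansion.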
I have about a thousand files on a remote server (all in different directories). I would like to scp them to my local machine. I do not want to run the scp command a thousand times in a row, so I have created a text file listing the file locations on the remote server. It is a simple text file with one path per line, like below:
...
/iscsi/archive/aat/2005/20050801/A/RUN0010.FTS
/iscsi/archive/aat/2006/20060201/A/RUN0062.FTS
/iscsi/archive/aat/2013/20130923/B/RUN0010.FTS
/iscsi/archive/aat/2009/20090709/A/RUN1500.FTS
...
I have searched and found someone trying to do something similar, but not quite the same, here. The command I would like to adapt is below:
cat /location/file.txt | xargs -i scp {} user@server:/location
In my case I need something like:
cat fileList.txt | xargs -i scp user@server:{} .
This should download the files from the remote server using the list in fileList.txt, located in the same directory I run the command from.
When I run this I get an error: xargs: illegal option -- i
How can I get this command to work?
You get the error xargs: illegal option -- i because -i is deprecated (and removed from some xargs implementations). Use -I {} instead (you could also use a different replace string, but {} is fine).
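As a minimal local illustration of what -I {} does (no scp involved): each input line is substituted for {} and the command runs once per line.

```shell
# Each line from stdin replaces {} in the echo command.
printf 'alpha\nbeta\n' | xargs -I {} echo "copying {} ..."
# prints:
# copying alpha ...
# copying beta ...
```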
If the list is remote and the files are remote, you can do this to retrieve the list locally and feed it to xargs -I {}:
ssh user@server cat fileList.txt | xargs -I {} scp user@server:{} .
But this creates N+1 connections, and more importantly this copies all remote files (scattered in different directories you said) to the same local directory. Probably not what you want.
So, in order to recreate a similar hierarchy locally, let's say everything under /iscsi/archive/aat, you can:
use cut -d/ to extract the part you want to be identical on both sides
use a subshell to create the command that creates the target directory and copies the file there
Thus:
ssh user@server cat fileList.txt \
| cut -d/ -f4- \
| xargs -I {} sh -c 'mkdir -p "$(dirname {})"; scp user@server:/iscsi/archive/{} ./{}'
Should work, but that's starting to look messy, and you still have N+1 connections, so now rsync looks like a better option. If you have passwordless ssh connection, this should work:
rsync -a --files-from=<(ssh user@server cat fileList.txt) user@server:/ .
The leading / is stripped by rsync and in the end you'll get everything under ./iscsi/archive/....
You can also copy the file list locally first, and then:
rsync -a --files-from=localCopyOfFileList.txt user@server:/ .
You can also manipulate that file, for example to remove two levels:
rsync -a --files-from=localCopyOfFileList2.txt user@server:/iscsi/archive .
etc.
I have a specific list of files that I need to copy from a remote server. Is this possible with SCP?
I know I can copy individual files using scp {user_name}@{host}:{filepath} . , but is there any way to take a .csv or .txt and run a foreach loop?
while read file; do scp "user@host:$file" .; done < files
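The same while-read pattern, demonstrated locally with cp standing in for scp so you can see how the list file drives the loop (all names here are made up):

```shell
# Set up a couple of sample files and a list of their paths.
mkdir -p src dst
echo one > src/a.txt
echo two > src/b.txt
printf 'src/a.txt\nsrc/b.txt\n' > files

# The loop: read a path per line, run one copy command per path.
while read -r file; do cp "$file" dst/; done < files
# dst now contains a.txt and b.txt
```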
I found it easier to use tar with a list than to send the files individually via scp:
tar -czvf archive.tar.gz -T file-list.txt && scp archive.tar.gz user@host:/path/
I use this on systems that don't have rsync available; this way you also avoid repeated password prompts and TCP/SSH connection limits.
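A sketch of the whole round trip with placeholder names: the tar part is runnable locally, and the commented scp/ssh lines are what you would run against your own host.

```shell
# Create sample files and a list of them, then bundle with -T
# (-T reads the file names to archive from the list).
mkdir -p notes
echo hello > notes/a.txt
echo world > notes/b.txt
printf 'notes/a.txt\nnotes/b.txt\n' > file-list.txt
tar -czf archive.tar.gz -T file-list.txt

# Then a single transfer and a remote unpack (user@host and /path are placeholders):
# scp archive.tar.gz user@host:/path/
# ssh user@host 'tar -xzf /path/archive.tar.gz -C /path/'
```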
Whenever I try to SCP files (in bash), they end up in a seemingly random(?) order.
I've found a simple but not-very-elegant way of keeping a desired order, described below. Is there a clever way of doing it?
Edit: deleted my early solution from here, cleaned, adapted using other suggestions, and added as an answer below.
To send files from a local machine (e.g. your laptop) to a remote machine (e.g. your calculation server), you can use Merlin2011's clever solution:
Go into the folder on your local machine that you want to copy files from.
Execute the scp command, assuming you have key-based access to the remote server:
scp -r $(ls -rt) user@foo.bar:/where/you/want/them/.
Note: if you don't have key-based access, it may be better to do something similar using tar: create the archive with tar -zcvf files.tar.gz $(ls -rt), and then send that single tar file on its own using scp.
To go the other way around, however, you might not be able to run scp directly from the remote server to push files to, say, your laptop. Instead, you may need to pull the files onto your laptop. My brute-force solution is:
In the remote server, cd into the folder you want to copy files from.
Create a list of the files in the order you want. For example, for reverse order of creation (most recent copied last):
ls -rt > ../filenames.txt
Now you need to prefix each file name with its path. Before you go up to the directory where the list is, print the path using pwd. Then do go up: cd ..
There are many ways to prepend this path to each file name in the list; here's one using awk:
cat filenames.txt | awk '{print "path/to/files/" $0}' > delete_me.txt
You need the filenames to be in the same line, separated by a space, so change newlines to spaces:
tr '\n' ' ' < delete_me.txt > filenames.txt
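The two text-manipulation steps above, condensed into one runnable pipeline with sample file names (the path prefix is a placeholder, as in the awk command):

```shell
# Prefix each name with the path, then join the lines with spaces.
printf 'a.dat\nb.dat\n' \
  | awk '{print "path/to/files/" $0}' \
  | tr '\n' ' '
# output: path/to/files/a.dat path/to/files/b.dat  (note the trailing space)
```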
Get filenames.txt to the local server, and put it in the folder where you want to copy the files into.
The scp run would be:
scp -r user@foo.bar:"$(cat filenames.txt)" .
Similarly, this assumes you have key-based access; otherwise it's much simpler to tar the files on the remote server and bring the archive over.
One can achieve file transfer in alphabetical order using rsync:
rsync -P -e ssh -r user@remote_host:/some_path local_path
-P keeps partially-transferred files and shows progress, -e sets the remote shell to ssh, and -r downloads recursively.
You can do it in one line without an intermediate file by using xargs:
ls -r <directory> | xargs -I {} scp <directory>/{} user@foo.bar:folder/
Of course, this would require you to type your password multiple times if you do not have public key authentication.
You can also use cd and still skip the intermediate file.
cd <directory>
scp $(ls -r) user@foo.bar:folder/
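You can check the order that $(ls -r) will feed to scp before running the copy. A local sketch in a scratch directory:

```shell
# ls -r lists names in reverse alphabetical order, which is the order
# scp will receive them as command-line arguments.
cd "$(mktemp -d)"
touch a.txt b.txt c.txt
echo $(ls -r)
# prints: c.txt b.txt a.txt
```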