Multi-stream SCP to transfer a large number of small files from EC2 [closed] - ssh

I am using scp to download millions of small files (100-1000 KB each) from my EC2 instances. scp seems to transfer one file at a time and does not fully utilize my 1 Gbps connection.
Is there a more efficient way to download the files? For various technical reasons, archiving and then downloading is not an option.

Take a look at rsync. It can also work through ssh.
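For example, a minimal invocation over ssh might look like the following (the host name ec2instance and both paths are placeholders):
# copy the remote tree over ssh, preserving attributes and compressing in transit
rsync -avz --progress ec2instance:/path/to/source/ /path/to/destination/
Like scp, rsync still transfers files one at a time, but it skips files that are already present at the destination, which helps a lot if the transfer has to be restarted.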

If you are still able to use tar, but not able to create a tarball on the remote host, you can try something like:
ssh ec2instance "tar c /path/to/source" | tar x -C /path/to/destination
You can use the v option to tar, or the pipe viewer (pv), to get feedback on the transfer.
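For instance, assuming pv is installed on the local side, the same pipe with progress feedback could look like this (written with an explicit f - for clarity):
# stream a tar archive over ssh, show throughput with pv, list files as they are extracted
ssh ec2instance "tar cf - /path/to/source" | pv | tar xvf - -C /path/to/destination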
If the above is not an option either, try running several scp processes (say, a dozen) in parallel to reduce the effect of the per-file overhead; a rough sketch follows below.
(Also make sure that the filesystem is not the bottleneck.)
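As a rough sketch of the parallel-scp idea (the host name, the paths, and the choice of a dozen processes are assumptions, and GNU xargs is required for -a):
# build a list of the remote files, then run up to 12 scp processes at once
ssh ec2instance "find /path/to/source -type f" > filelist.txt
xargs -a filelist.txt -P 12 -I{} scp ec2instance:{} /path/to/destination/
Note that this flattens the directory structure into a single destination directory and opens a new ssh connection per file, so treat it only as a starting point.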

Related

Rclone Compression | zip | rar and data transfer [closed]

I have been using rclone to back up Google Drive data to AWS S3 cloud storage. I have multiple Google Drive accounts that are backed up to AWS S3, and each drive holds a different number of documents.
I want to compress those documents into a single zip file and then copy that archive to S3.
Is there any way to achieve the same?
I referred to the link below, but it doesn't have complete steps to accomplish the task.
https://rclone.org/compress/
Any suggestion would be appreciated.
Rclone can't create zip or rar archives itself, but you can use a simple script to zip or rar the files and then use rclone to back the archive up to AWS.
If this is OK, I can explain the details here.
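As a hedged sketch of that approach (the remote names gdrive: and s3:, the bucket, and the paths are made-up examples, not taken from the question): pull the documents to a local staging directory, zip them, and copy the archive with rclone:
# stage the drive contents locally
rclone copy gdrive:Documents /tmp/gdrive-backup
# compress the staging directory into a single archive
zip -r /tmp/gdrive-backup.zip /tmp/gdrive-backup
# upload the archive to the S3 bucket
rclone copy /tmp/gdrive-backup.zip s3:my-backup-bucket/archives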

Cloud bucket virus security [closed]

I have implemented an antivirus system using ClamAV for one of my apps, which uses Google Cloud Storage for file uploads.
Currently I listen for bucket uploads, download each new object to one of my servers, scan it with ClamAV, and delete it if it is infected.
I am a newbie to this. Is it possible for the whole cloud bucket to get infected by a virus just from an upload?
That is, can a virus execute itself on the bucket (any cloud bucket)?
If yes, then please suggest some other solution, as my current approach would be ineffective in that case.
Object storage systems do not provide an execution environment, so an infected file cannot run and infect other files in the bucket.
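For reference, the scan-on-upload workflow described in the question can be sketched roughly as follows (the bucket name, the object path, and the notification wiring that triggers it are assumptions; gsutil and clamscan are the only real tools used):
# download the newly uploaded object to a scratch directory
mkdir -p /tmp/scan
gsutil cp gs://my-upload-bucket/incoming/file.pdf /tmp/scan/file.pdf
# clamscan exits with status 1 when an infection is found
clamscan --no-summary /tmp/scan/file.pdf
if [ "$?" -eq 1 ]; then
    # remove the infected object from the bucket
    gsutil rm gs://my-upload-bucket/incoming/file.pdf
fi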

How to recover files that were encrypted by ransomware [closed]

I am using Windows 7, and we configured a public IP to access our Apache server. After a remote desktop connection via AnyDesk ended, all the files on the system were encrypted with a .deep extension. Every folder contains a text file that has an ID, the address mrdeep#protonmail.com, and Bitcoin payment information. The system also had K7 AntiSecurity (licensed, but about to expire), yet after the infection K7 simply vanished. Can anyone help me recover those files and also guide me on how to avoid such viruses in the future?
It looks like this is a new version of the Scarab ransomware. Scarab has a decryption tool developed by Dr. Web; you should try that, as it may work for this .deep extension. Otherwise, use anti-malware tools and clean everything, and then Data Recovery Pro may help with recovering the data. Keep your anti-virus and anti-malware up to date so you can avoid this nasty virus; it most commonly arrives via infected spam email attachments. So clean that box more often and scan your system more frequently. Good luck!

Unspecified error during copy process [closed]

Has anyone experienced an unspecified error while copying big (2-3 GB) files to a local Hyper-V machine? My machine definitely has enough memory and disk space. The error appears quite suddenly during the copy process. Where can I get specific error details? There is nothing in my Windows Event log :-/
Best regards
Copying files larger than 2 GB over a Remote Desktop Services or a Terminal Services session through Clipboard Redirection (copy and paste) is not supported. Please check
https://support.microsoft.com/en-us/kb/2258090
RDP transfers files through your clipboard. Copying something else while the transfer is taking place (which is particularly easy during a large file transfer, because you probably want to do something else while it happens, and copying is a very common operation) fails the entire transfer with an unspecified error.
If it's possible for you, consider using drive redirection, as it is much faster and resilient to this.
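As a rough sketch of the drive-redirection route (the share path, target folder, and file name are examples, not from the question): enable drive redirection in the RDP client (Local Resources > Drives, or drivestoredirect:s:* in the .rdp file), then copy from the redirected drive, which appears inside the session as \\tsclient\<drive letter>. robocopy also retries transient failures:
rem copy the large file from the redirected local C: drive into the session
robocopy \\tsclient\C\ISOs C:\HyperV bigimage.vhdx /R:3 /W:5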
It happens for me when I try to copy some other text or file while a large file is being copied over Remote Desktop Services, whereas the local machine allows parallel copies.
This helped for me:
Log into the Remote Desktop.
Browse to your computer over the network.
Copy the file to the Remote Desktop.

Rsync slow over internet [closed]

Rsync took over 10 hours today to transfer a 1 GB file over the internet (from one Raspberry Pi to another). Are there any ways to speed this up?
rsync uses ssh to transfer files, so what you want to do is speed up ssh. You can do that by changing the cipher to a faster one such as arcfour or blowfish, using the -e flag. For example:
rsync -avt -e "ssh -c blowfish" user@dest:/remote/path /local/path
Personally I use blowfish, but here is a benchmark I found real quick. Keep in mind this isn't going to make rsync super fast all of a sudden, but it could help if the bottleneck is the CPU on either side, which is likely with embedded machines like the Raspberry Pi. Also keep in mind that your build of ssh might not have all the ciphers you see used elsewhere.
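To check which ciphers your ssh build actually supports before picking one (the host and paths below just mirror the placeholders in the example above; note that arcfour and blowfish have been removed from recent OpenSSH releases, where a fast choice is typically aes128-ctr or chacha20-poly1305@openssh.com):
# list the ciphers this OpenSSH client was built with
ssh -Q cipher
# then pick one of the listed ciphers for the transfer
rsync -avt -e "ssh -c aes128-ctr" user@dest:/remote/path /local/path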