server to server transfer - ssh

I need to transfer one file from an old host with no SSH access to a new host to which I do have SSH access. Having a hard time figuring this out. Looking for a simple answer if there is one. I'm also trying to avoid the slow upload times from my local machine, hence the need for a server-to-server transfer.

Are you able to use FTP? You could use that to transfer the files.

If you have the URL of the file you want to move from your old host, you can use the wget command in your SSH terminal. This works for any file type, and for folders too if you want.
For example, if you want to move http://www.yourhost.com/file.zip to your new host, you would SSH into the new host, change into the folder you want to download it to, and type:
wget http://www.yourhost.com/file.zip
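If the old host exposes a whole directory over HTTP, a recursive fetch is also possible (a rough sketch; the URL is a placeholder, -r fetches recursively and -np stops wget from climbing to parent directories):
wget -r -np http://www.yourhost.com/somefolder/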

Related

How to programmatically download a file from a remote desktop if I have the data required to configure a Jump Desktop (remote desktop) connection?

I want to programmatically download a file from a remote machine.
So, I know the host's IP and port, as well as the login data.
I also know that it creates an SSH tunnel.
Any suggestions? Is it even possible knowing just that data?
My knowledge on that topic is very scarce.
My answer focuses on SSH usage. In order to download a file via SSH, you need to run the scp command, like
scp yourusername@server.url:/the/path/to/the/file.extension ./
That's enough to download the file. However, it is possible that this will not work by itself. First, the other machine needs to know about your SSH key, so on the server you will need to
vim ~/.ssh/authorized_keys
hit insert, and paste your public SSH key at its end. Don't remove anything. If it is still not working, then SSH might be disabled on the server and you will need to enable it. Example for Ubuntu: https://linuxize.com/post/how-to-enable-ssh-on-ubuntu-18-04/
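If OpenSSH's ssh-copy-id is available, it does the same thing in one step, assuming your public key sits at the default ~/.ssh/id_rsa.pub path (a sketch using the same placeholder host):
ssh-copy-id yourusername@server.url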
Your user needs access to the file you want to download, otherwise this won't work.
Alternatively you could set up an SFTP connection as well and use that.
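For instance, OpenSSH's sftp can fetch a single file in one shot (a sketch using the same placeholder host and path; the file lands in your current directory):
sftp yourusername@server.url:/the/path/to/the/file.extension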

Is it possible to edit code on my own machine and save it to account I've ssh'd into?

Scenario:
I'm using ssh to connect to a remote machine. I use the command line and run ssh <hostname>, which connects me to that machine. I want to edit and run code on that remote machine. So far the only way I know is to create, edit, and run the files in vi inside that command window, because my only connection to that machine is that command window.
My Question is:
I'd love to be able to edit my code in VSCode on my own machine, and then use the command line to save that file to the remote machine. Does anyone know if this is possible? I'm using OS X and ssh'ing into a Linux Fedora machine.
Thanks!
Sounds like you're looking for a command like scp. SCP stands for secure copy protocol, and it builds on top of SSH to copy files from one machine to another. So to upload your code to your server, all you'd have to do is run
scp path/to/source.file username@host:path/to/destination.file
EDIT: As @Pam Stums mentioned in a comment below the question, rsync is also a valid solution, and is definitely less tedious if you would like to automatically keep client and server directories in sync.
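A minimal rsync sketch of that kind of sync (paths and hostname are placeholders; -a preserves permissions and timestamps, -v is verbose, -z compresses in transit):
rsync -avz path/to/project/ username@host:path/to/project/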
You could export the directory on the remote machine using NFS or Samba, mount it as a share on your local machine, and then edit the files locally.
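A rough sketch of the NFS variant (export path, subnet and mount point are placeholders): on the Fedora machine, add a line like /home/username/project 192.168.1.0/24(rw,sync) to /etc/exports and run
sudo exportfs -ra
then mount it from the OS X side (the resvport option is often needed when a Linux NFS server insists on requests from privileged ports):
sudo mount -t nfs -o resvport host:/home/username/project ~/remote-project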
If you're happy using vim, check out netrw (it comes with most vim distributions; :help netrw for details) to let you use macvim locally to edit the remote files.
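netrw can also open a remote file straight from your local shell (user, host and path are placeholders; note the double slash for an absolute path):
vim scp://username@host//home/username/project/main.py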

how to move files from remote server to s3 at the command line

I have a lot of big files on a remote server and I want to move them into S3. I want to do it at the command line or with a bash script (e.g., I do NOT want to use a gui app like cyberduck) so that I can automate/replicate efforts.
I have tried mounting my remote server onto my local machine using OSXFUSE and sshfs, and then pushing to S3 using s3cmd. This does work, but I keep running into errors (the connection being lost for no apparent reason, mount errors, etc.).
Is this the best way to do it? Does anyone know a better way to do it?
Thanks,
A
You can use the MinIO client, aka mc, to do the same.
$ mc cp --recursive localDir/ s3/remoteBucket
In case of a network disconnection, mc will give you the option to resume the upload.
Is your remote server in EC2? Your current setup requires two copies (first pulling data to your local machine via sshfs, then pushing to S3 via s3cmd); if you run s3cmd on your remote server directly, you can reduce that to one.
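A rough sketch of that single-copy approach, run on the remote server itself (the bucket name is a placeholder):
s3cmd --configure
s3cmd put --recursive /path/to/bigfiles/ s3://my-bucket/bigfiles/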
If you want to mount S3 as a filesystem, you can also use tools like goofys or s3fs. Again, you should do that on your remote server to avoid extra copies.

How to read a large error log file without downloading?

I need to check my Apache error log, but the file contains 14 GB of data. Downloading will take ages. Is there any way I can see what's in the file without downloading it?
If you have SSH enabled on the server, just log in remotely and either tail or grep the log files.
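For example (hostname and log path are placeholders; adjust them to wherever Apache writes its error log):
ssh user@yourserver "tail -n 200 /var/log/apache2/error.log"
ssh user@yourserver "grep -i fatal /var/log/apache2/error.log | tail -n 50"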
Do you have SSH access to your server? In that case this is easy: use an SSH client like PuTTY and read the file directly.
If you only have FTP access, you can try to use byte ranges as defined here: http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.35
Of course, this only works if your server supports it.
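A rough sketch with curl, assuming the log is also reachable over HTTP and the server honours Range requests (the URL is a placeholder); this fetches only the last 1 MB of the file:
curl -r -1048576 http://www.yourserver.com/logs/error.log -o error-tail.log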

What is the fastest way to upload the big files to the server

I have got a dedicated server and a file of about 4 GB to upload to it. What is the fastest and safest way to upload that file to the server?
FTP may create issues if the connection is broken.
SFTP will have the same issue as well.
Do you have your own computer reachable over the internet via a public IP as well?
In that case you may try to set up a simple HTTP server (if you have Windows, just set up IIS) and then use some download manager on the dedicated server (depending on its OS) to download the file over HTTP (it can use multiple streams for that), or do this through a torrent.
There are trackers, like http://openbittorrent.com/, which will allow you to keep the file on your computer and then use some torrent client to upload the file to the dedicated server.
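A rough sketch of the HTTP approach, assuming Python is available on your own machine (IP address, port and filename are placeholders): on your computer, serve the directory containing the file with
python3 -m http.server 8000
and on the dedicated server pull it down with resume support:
wget -c http://203.0.113.10:8000/bigfile.bin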
I'm not sure what OS your remote server is running, but I would use wget; it has a --continue option. From the man page:
--continue
Continue getting a partially-downloaded file. This is useful when
you want to finish up a download started by a previous instance of
Wget, or by another program. For instance:
wget -c ftp://sunsite.doc.ic.ac.uk/ls-lR.Z
If there is a file named ls-lR.Z in the current directory, Wget
will assume that it is the first portion of the remote file, and
will ask the server to continue the retrieval from an offset equal
to the length of the local file.
wget binaries are available for GNU/Linux / Windows / MacOSX / dos:
http://wget.addictivecode.org/FrequentlyAskedQuestions?action=show&redirect=Faq#download