scp the latest file of a remote folder

I know this subject has been covered several times, but I still can't make it work.
I am a very inexperienced web developer.
I created several jobs which:
1) save the SQL dump into a folder on the server
2) download the content of that folder to my local machine (scp).
The problem is that, the way I created job 2), every file that exists in the folder from 1) is transferred to my local machine. I am looking for a way to scp only the most recently created file. I tried the approach that several people qualify as the best answer (ls -t nlb* | head -1), but I still get an error message.
Any chance someone could simply correct the command?
scp -i /users/myname/desktop/projects/online/mykey.pem ec2-user@XXXXXXXXXX.eu-west-1.compute.amazonaws.com:/var/www/dbbckp/`* | head -1` /users/myname/downloads/dbbckp
Many thanks in advance.

You can either use rsync, which transfers only the difference, or name your dump files in a predictable way so you know which one to copy. Also note why your command fails: the backticks are expanded by your local shell before scp ever runs, so the listing never sees the remote folder; you have to run it on the remote host via ssh first.
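For example, a minimal sketch using the key, placeholder host, and paths from your command: ask the remote host for the newest file in the folder, then scp just that one file.
latest=$(ssh -i /users/myname/desktop/projects/online/mykey.pem ec2-user@XXXXXXXXXX.eu-west-1.compute.amazonaws.com 'ls -t /var/www/dbbckp | head -1')
scp -i /users/myname/desktop/projects/online/mykey.pem "ec2-user@XXXXXXXXXX.eu-west-1.compute.amazonaws.com:/var/www/dbbckp/$latest" /users/myname/downloads/dbbckp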

Related

Is it possible to work with an editor remotely?

I want to edit files on my server, but I don't want to upload the files every time. Is there a way to edit the files remotely?
I tried to make a bash script that uploads the files, but it's not really good :/ and sometimes didn't work.
You can use Atom with the Remote FTP Edit package.
If you have SSH (SFTP-enabled) access to your server, an option would be to use SSHFS to mount a remote directory on your local working path.
In this way you can use any editor (or even something else) to change your files and they will always be synchronised.
Just keep in mind that, in this way, the files are actually located on your server; you won't have a real copy on your local machine.
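A minimal sketch of the SSHFS approach (user, host, and paths are placeholders):
mkdir -p ~/remote-www
sshfs user@yourserver:/var/www/project ~/remote-www
# edit files in ~/remote-www with any local editor; changes land directly on the server
fusermount -u ~/remote-www
The last command unmounts the share when you are done (on macOS, use umount ~/remote-www instead).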

Getting a PDF out of an SSH session onto my own system

Given:
Connection to the Uni's secure shell like this:
me@my_computer~$ ssh <my_name>@unixyz.cs.xy.com
Password:***********
Welcome to Unixyz. You now can access a terminal on system unixyz:
my_name#unixyz~$ ls
Desktop Documents Pictures Music desired_document.pdf
my_name@unixyz~$
Task/Question:
Getting the desired_document.pdf to my own system. I have thought of some options so far:
1) Since I can access an editor like nano, I could write a C/Java program, compile it in the home directory, and make that program send the PDF. Problem with that: I'd have to code a client on the Uni machine and a server on my own system. On top of that, I only know how to transfer text given to stdin, not PDFs. And it's obviously too much work for the given task.
2) I found some vague information about the commands scp and sftp. Unfortunately, I can not figure out how exactly they are used.
The latter is basically my question: are the commands scp and sftp valid options for doing this, and how are they used?
EDIT:
I received a first answer, but the problem persists. As stated, I use:
scp me@server.cs.xyz.com:/path/topdf /some/local/dir
which gives me:
/some/local/dir: no such file or directory
I'm not sure which environment you are in.
Do you use Linux or Windows as your everyday operating system?
If you are using Windows, there are some UI-based scp/ssh implementations that let you transfer these files using an explorer-style UI.
For example, there is https://winscp.net/
You can indeed use scp to do exactly that, and it's easier than it might look:
scp your_username@unixyz.cs.xy.com:path/to/desired_document.pdf /some/local/dir
The key is the colon after the server name, where you add your path.
Optionally you can pass in the password as well, but that's bad practice, for obvious reasons.
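Since the question also asks about sftp: it works too, as an interactive session in which you fetch files with get. A minimal sketch with the same placeholder host and paths:
sftp your_username@unixyz.cs.xy.com
sftp> get path/to/desired_document.pdf /some/local/dir
sftp> quit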
I actually found the answer, and my error, myself. Both the guy with the answer and the commenter were right. BUT:
scp must be launched while you are in YOUR local terminal; I always tried to run it while I was connected to the remote server.
2 hours wasted because of that.

Why is my rsync command not working between Openshift gears?

I'm trying - like many other people - to rsync data between gears of a scalable app.
I have created a cron job that runs every minute; it fetches and parses a list of gears, determines which ones are NOT the local gear, and then tries to rsync each of those gears' app-root/data folder to $OPENSHIFT_DATA_DIR (which should be on the local gear)...
The first part works, but the rsync command fails with:
rsync: connection unexpectedly closed (0 bytes received so far) [receiver]
I have tried about a dozen variants of
rsync -avz -e "ssh -i ${OPENSHIFT_DATA_DIR}.ssh/id_rsa" $targetgear@app-user.rhcloud.com:app-root/data/test/ ${OPENSHIFT_DATA_DIR}test
$targetgear is a variable from parsing the list of gears - it's the long string of "stuff" in the gear name... the script loops through the csv list provided by haproxy and correctly writes what it finds to a log file, so I know it's getting at least that far...
The id_rsa file exists and has been added to the account in the web control panel.
Any help or pointers would be greatly appreciated. I have combed pages and pages of forum posts and documentation on rsync and ssh, but can't seem to get around it...
Thanks in advance.
THANK YOU RAUL!!!
OK... The problem was the URL for the target gear. I needed the long string of gunk TWICE:
$targetgear@$targetgear-user.rhcloud.com ...
and not
$targetgear@app-user.rhcloud.com
The app name prefix in the url is replaced by the gear ID.
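Putting it together, the working variant of the command above (a sketch based on this fix, with the same paths as in the question) is:
rsync -avz -e "ssh -i ${OPENSHIFT_DATA_DIR}.ssh/id_rsa" $targetgear@$targetgear-user.rhcloud.com:app-root/data/test/ ${OPENSHIFT_DATA_DIR}test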

link folders via ssh

I would like to do the following simple thing:
When a folder is referenced in a web site (e.g. href='folder1/pic.jpg'), I want the server to actually look in another folder (e.g. 'folder2'), where the actual 'pic.jpg' will be.
I believe this can be done by connecting to the server via SSH and then setting something up there, but I don't know what.
Could anyone give me an example?
Thanks!
You can try going into your document root (cd into it) and executing
ln -s folder2 folder1
This creates a symbolic link named folder1 that points at folder2 (note the order: ln -s target link_name). I think this is the thing over SSH you're talking about. Alternatively, you can edit the server config there, but that requires more input.
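A minimal sketch, assuming /var/www/html is your document root (adjust to your setup):
cd /var/www/html
ln -s folder2 folder1
ls -l folder1
The ls -l output should show folder1 -> folder2, and a request for folder1/pic.jpg will then be served from folder2/pic.jpg. On Apache, the directory also needs Options FollowSymLinks (often enabled by default) for the link to be followed.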

Using Bazaar to handle Website Versioning

I imagine this is a pretty basic question but I haven't been able to find an answer anywhere.
I develop websites. In the past I've handled all the live files manually and it stinks, of course. I've been hoping Bazaar could add some power and organization to the way we work.
Right now, I work with a local server on my laptop and want to gracefully push data onto the live server. Currently, I'm doing the following:
Local machine:
bzr push sftp://user@server/path/to/project/BZR/live
On server:
rm -r /path/to/project/live
bzr branch /path/to/project/BZR/live
Is there any way to get the local files live directly from the push?
Otherwise, is a branch to the live path correct?
Is there any way to get Bazaar to just update changed files in the live path so that I don't have to delete /live each time?
Right now I have to manually edit .htaccess with each upload. If I didn't have to delete /live, I imagine I could tell bzr to ignore it and all would take care of itself.
Thanks for your help!
-Nicky
Check out the bzr-upload plugin, and also the push-and-update plugin.
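For example, with the bzr-upload plugin installed, a minimal sketch reusing the URL from your push would be:
bzr upload sftp://user@server/path/to/project/live
bzr upload transfers the working tree rather than the branch data, and on later runs it only uploads files that changed since the last upload, so you no longer have to delete /live or touch your hand-edited .htaccess.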