I am trying to copy an existing folder from a remote server onto my local Bamboo agent.
I tried to create an SCP Task in Bamboo, but it does not work, since the SCP Task in Bamboo does the opposite (it copies from local to remote).
I am now trying to create a Script Task in Bamboo to pull the folder instead.
I would appreciate your help.
Thank you in advance.
Using a Script Task is the right way (or at least a working one) to implement a "reverse" SCP copy from a remote host to the Bamboo CI server.
In the Script configuration, define your scp command as usual; since the script runs on the Bamboo agent itself, the destination is just a local path:
/usr/bin/scp -P remote.host.port user@remote.host:/path/to/source_file /path/to/target_file
For this to work, don't forget to set up SSH public key authentication for password-less login, as sketched below.
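A minimal one-time setup sketch for the key-based login, assuming the commands are run on the Bamboo agent as the user that executes the Script Task, and that the remote user/host names are placeholders:
# generate a key pair without a passphrase so the build can run unattended
ssh-keygen -t rsa -b 4096 -N "" -f ~/.ssh/id_rsa
# install the public key on the remote host (asks for the password once; add -p if the remote uses a non-standard port)
ssh-copy-id user@remote.host
# verify that password-less login now works
ssh user@remote.host true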
Another option is to use the custom bamboo-ssh-plugin, which provides an additional Reverse SCP Task that can be configured and used in exactly the same way as the default SCP Task.
Scenario:
I'm using ssh to connect to a remote machine: on the command line I run ssh <hostname>, which connects me to that machine. I want to edit and run code on that remote machine. So far the only way I know is to create, edit, and run the files in vi inside that terminal, because my only connection to the machine is that command window.
My Question is:
I'd love to be able to edit my code in VSCode on my own machine, and then use the command line to save that file to the remote machine. Does anyone know if this is possible? I'm using OS X and ssh'ing into a Linux Fedora machine.
Thanks!
Sounds like you're looking for a command like scp. SCP stands for secure copy protocol, and it builds on top of SSH to copy files from one machine to another. So to upload your code to your server, all you'd have to do is:
scp path/to/source.file username@host:path/to/destination.file
EDIT: As @Pam Stums mentioned in a comment below the question, rsync is also a valid solution, and is definitely less tedious if you would like to automatically sync the client and server directories.
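For example, a minimal rsync sketch (the local directory and remote path are placeholders; the trailing slash on the source copies the directory's contents rather than the directory itself):
# push local edits to the server; -a preserves permissions/timestamps, -v is verbose, -z compresses in transit
rsync -avz ./project/ username@host:path/to/project/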
You could export the directory on the remote machine using nfs or samba and mount it as a share on your local machine and then edit the files locally.
If you're happy using vim, check out netrw (it comes with most vim distributions; :help netrw for details) to let you use macvim locally to edit the remote files.
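As an illustration of the netrw route (user, host, and path are placeholders), you can open a remote file over scp straight from vim; the double slash denotes an absolute path on the remote machine:
:edit scp://username@host//absolute/path/to/file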
I am using the SSH SFTP Sampler for SFTP testing in JMeter. I am able to GET/REMOVE/LS files and folders in SFTP locations and PUT files from local locations to SFTP, but the issue is that I'm not able to move files between SFTP locations. Can someone advise on this, please?
I tried this, but it only works for FTP, not SFTP:
import org.apache.commons.net.ftp.FTPClient;
// plain FTP client: connect, log in, rename (move) the file on the server, then disconnect
FTPClient client = new FTPClient();
client.connect("SFTP server");
client.login("username", "password");
client.rename("location2/file.txt", "location1/file.txt");
client.logout();
client.disconnect();
FTP and SFTP are totally different beasts and they use different protocols under the hood.
Given that there is no "move" or "rename" action in the SSH SFTP Sampler, you can still achieve the same result with the SSH Command Sampler, which logs into the SFTP host over SSH and runs a shell command there; a sketch of the relevant command is shown below.
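A minimal sketch of what the SSH Command Sampler could execute, assuming both locations live on the same server and the paths shown are placeholders; a server-side mv is what effectively "moves" the file between the two SFTP locations:
mv /path/to/location2/file.txt /path/to/location1/file.txt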
More information: How to Run External Commands and Programs Locally and Remotely from JMeter
You can install the SSH Command Sampler as part of the SSH Protocol Support bundle using the JMeter Plugins Manager.
I am provisioning a NixOS instance on AWS. The instance has to download a private GitHub repository. Currently I just run a shell script on the remote box, using SSH forwarding so that I don't have to copy the private key that gives me access to the repo onto the remote box.
I would like to make this procedure more Nix-like: I want to write a Nix expression which downloads the repo, and put it in /etc/nixos/configuration.nix. At the same time I don't want to copy my private key to the remote machine. Is this possible? Can nixos-rebuild use SSH forwarding?
You can explore the --build-host and --target-host options of the nixos-rebuild command; that is, make your local machine the build host and the remote one the target. You do need passwordless root SSH access to the remote host, though. A sketch of the invocation follows.
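A minimal sketch of such an invocation, with the instance address as a placeholder; the system is evaluated and built locally, where your GitHub key already lives, and only the finished closure is pushed to the instance:
nixos-rebuild switch --build-host localhost --target-host root@aws-instance.example.com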
I set up an OpenShift application and configured my local PuTTY to connect to the server via SSH. Everything works fine, but I don't know how to run a few commands (mainly alias definitions) automatically after I connect to the server (I don't want to copy and paste the same commands every time I connect).
On my local Linux shell I can use .bashrc, but this doesn't seem to work on OpenShift: I can't write a file in my home directory (/var/lib/openshift/[some letters and numbers]/) and I don't know the right place to put it. Does anybody know where I have to put a file which will be run every time I log in?
I'd prefer a solution which doesn't involve my local SSH software as I'm connecting to this OpenShift application from different machines.
You can use the .bash_profile located in your $OPENSHIFT_DATA_DIR, for example as sketched below.
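A minimal sketch, run once from your SSH session, assuming (as this answer suggests) that the file in $OPENSHIFT_DATA_DIR is sourced on login; the alias itself is just an example:
# append an alias definition so it is available in every future session
echo 'alias ll="ls -la"' >> $OPENSHIFT_DATA_DIR/.bash_profile
# pick it up in the current session as well
source $OPENSHIFT_DATA_DIR/.bash_profile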
Normally this would be done in the .bashrc, .profile, or .bash_profile files. Since you say those don't work, you can put the commands in a script file, scp that file to the remote server, and then run it when you ssh, all in a single command.
I have not used OpenShift, but I have used AWS EC2 instances a lot with Ruby scripts:
ssh ubuntu@ec2-address ruby basic-auto.rb
The above command executes the Ruby file right after the SSH connection is made. Your script can be in any language, or simply a bash file (.sh), and it will be executed after ssh. A combined copy-and-run sketch follows.
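Putting the two steps together, a hedged sketch (the script name, user, and address are placeholders) that copies a setup script to the server and runs it in one line:
scp setup-aliases.sh ubuntu@ec2-address:~/ && ssh ubuntu@ec2-address 'bash ~/setup-aliases.sh'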
Jenkins keeps using the default "jenkins" user when executing builds. My build requires a number of SSH calls. However, these SSH calls fail with host verification exceptions because I haven't been able to place the public key for this user on the target server.
I don't know where the default "jenkins" user is configured and therefore can't generate the required public key to place on the target server.
Any suggestions for any of the following:
A way to force Jenkins to use a user i define
A way to enable SSH for the default Jenkins user
Fetch the password for the default 'jenkins' user
Ideally I would like to be able to do all of the above; any help greatly appreciated.
Solution: I was able to access the default Jenkins user with an SSH request from the target server. Once I was logged in as the jenkins user, I was able to generate the public/private RSA key pair, which then allowed for password-free access between the servers.
Because when you have numerous slave machines it can be hard to anticipate on which of them a build will be executed, rather than explicitly calling ssh I highly suggest using the existing Jenkins plug-ins for executing remote commands over SSH:
Publish Over SSH - execute SSH commands or transfer files over SCP/SFTP.
SSH - execute SSH commands.
The default 'jenkins' user is the system user running your Jenkins instance (master or slave). Depending on your installation, this user may have been created either by the install scripts (deb/rpm/pkg etc.) or manually by your administrator. It may or may not be called 'jenkins'.
To find out under which user your Jenkins instance is running, open http://$JENKINS_SERVER/systemInfo, available from the Manage Jenkins menu.
There you will find your user.home and user.name. E.g. in my case on a Mac OS X master:
user.home /Users/Shared/Jenkins/Home/
user.name jenkins
Once you have that information, you will need to log onto that Jenkins server as the user running Jenkins and ssh into those remote servers to accept their SSH fingerprints.
An alternative (that I've never tried) would be to use a custom Jenkins job to accept those fingerprints, for example by running the following command in an SSH build task:
ssh -o "StrictHostKeyChecking no" your_remote_server
This last tip is of course completely unacceptable from a pure security point of view :)
So one might make a "job" which writes the host keys as a constant, like:
echo "....." > ~/.ssh/known_hosts
Just fill in the dots from the output of ssh-keyscan -t rsa {ip}, after you have verified it. A one-liner version is sketched below.
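Written out with ssh-keyscan directly (the IP is a placeholder; verify the fingerprint out of band before trusting it), appending rather than overwriting so existing entries survive:
ssh-keyscan -t rsa 203.0.113.10 >> ~/.ssh/known_hosts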
That's correct: pipeline jobs will normally run as the user jenkins, which means that SSH access needs to be set up for this account for it to work in the pipeline jobs. People have all sorts of complex build environments, so it seems like a fair requirement.
As stated in one of the answers, each individual configuration could be different, so check under "System Information" or similar, in "Manage Jenkins" on the web UI. There should be a user.home and a user.name for the home directory and the username respectively. On my CentOS installation these are "/var/lib/jenkins/" and "jenkins".
The first thing to do is to get shell access as the user jenkins (in our case). Because this is an auto-generated service account, a login shell is not enabled by default. Assuming you can log in as root, or preferably as some other user (in which case you'll need to prepend sudo), switch to jenkins as follows:
su -s /bin/bash jenkins
Now you can verify that it's really jenkins and that you entered the right home directory:
whoami
echo $HOME
If these don't match what you see in the configuration, do not proceed.
All is good so far, let's check what keys we already have:
ls -lah ~/.ssh
There may already be keys there, created with the hostname. See if you can use them:
ssh-copy-id user@host_ip_address
If there's an error, you may need to generate new keys:
ssh-keygen
Accept the default values and use no passphrase when it prompts you; this adds the new keys under the home directory without overwriting existing ones. Now you can run ssh-copy-id again.
It's a good idea to test it with something like
ssh user@host_ip_address ls
If it works, so should ssh, scp, rsync etc. in the Jenkins jobs. Otherwise, check the console output to see the error messages and try those exact commands on the shell as done above.