Ansible playbooks in CodeBuild - aws-codebuild

How can I run playbooks in CodeBuild? The main problem is the SSH connection to the nodes. I am trying to use .pem files for the SSH connection through ssh-agent. I have stored the pem file in AWS Secrets Manager, but when I access it in CodeBuild as an environment variable, it asks for a passphrase. How can I solve this problem? Please help.
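For context, a minimal sketch of the pre-build commands this usually boils down to, assuming the key is exposed to the build as an environment variable named SSH_KEY (the variable, file, and playbook names here are illustrative) and that the key itself was generated without a passphrase:

echo "$SSH_KEY" > /tmp/deploy_key.pem   # key material pulled from Secrets Manager into an env var
chmod 600 /tmp/deploy_key.pem
eval "$(ssh-agent -s)"                  # start an agent inside the build container
ssh-add /tmp/deploy_key.pem             # only prompts for a passphrase if the key itself has one
ansible-playbook -i inventory.ini playbook.yml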

Related

Test SSH connection between remote servers with Ansible

I have a playbook that creates an SSH key on a remote serverA and then copies it over to another remote serverB.
I'm looking for a way to test the SSH connection from serverA to serverB, and then maybe run some command on serverB (for example uname -a) and output it as a debug message that confirms the connection is working.
I've been looking around on the Internet and here as well, but I haven't found anything yet...
Any clue?
A quick approach would be to:
On Ansible's control node, use openssh_keypair to create an SSH keypair. Pay attention to the path, to make sure an existing keypair is not overwritten.
Copy the keypair from Ansible's control node to serverA (make sure you set the right permissions on the files), using the copy module.
Copy the public key of the newly generated keypair from Ansible's control node to serverB (make sure you set the right permissions on the file), and delete the source keypair.
Now the SSH keypair setup is ready between serverA & serverB; a sketch of these three steps follows below.
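For illustration, a sketch of those three steps, assuming a play that targets both serverA and serverB, a hypothetical key path /tmp/serverA_key on the control node, and a hypothetical remote user named user; the last task uses the authorized_key module rather than a plain copy, since that is the usual way to land the public key on serverB:

- name: Generate a dedicated keypair on the control node
  openssh_keypair:
    path: /tmp/serverA_key        # illustrative path; pick one that won't overwrite an existing key
  delegate_to: localhost
  run_once: true

- name: Copy the private key to serverA with strict permissions
  copy:
    src: /tmp/serverA_key
    dest: /home/user/.ssh/id_rsa  # illustrative destination and owner
    owner: user
    mode: "0600"
  when: inventory_hostname == "serverA"

- name: Authorize the new public key on serverB
  authorized_key:
    user: user
    key: "{{ lookup('file', '/tmp/serverA_key.pub') }}"
  when: inventory_hostname == "serverB"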
Run the command module on serverA and register its result, e.g.:
- name: Create variable from command
  command: ssh -o StrictHostKeyChecking=no user@serverB 'some_command'
  register: command_output
Print out the output of the registered result:
- debug: msg="{{ command_output.stdout }}"

VirtualBox connection to GitLab using SSH keys

I'm stuck with a little SSH problem. I'm working on a Windows 10 machine which has its pair of SSH keys generated via PuTTYgen (RSA) using my domain email. I use this pair to connect via SSH to my GitLab repository and everything works fine.
I decided to create an Ubuntu VM via VirtualBox on the same machine, then I generated a new SSH key pair inside the VM using
ssh-keygen -t rsa -C "my.email@example.com" -b 4096
with the same email as the Windows 10 one. After that I added the new public key to my GitLab account. However, when I test this new pair of keys via
ssh -Tv git@gitlab.com
where "gitlab.com" is my gitlab repository, I receive, along with some debug messages (which don't contain any useful information)
Permission denied (publickey)
Now, my question is as follows:
is there something that I should do differently from usual to set up a new pair of SSH keys inside a virtual machine which uses the same network as the host machine? Or, theoretically, should it work fine just as I did?
Thank you
EDIT: I've also tried to copy the same VM SSH keys onto my Windows machine, replacing the old ones, and they work. So it's not a key-generation problem; I think it's really a problem with VirtualBox, or virtualization in general. Any help?
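One thing that might help narrow it down from inside the VM is to point ssh at the key explicitly and read the verbose output (the path below assumes the default location ssh-keygen wrote to):

ssh -i ~/.ssh/id_rsa -Tv git@gitlab.com   # look for an "Offering public key" line and what comes after it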

How to copy files using SSH under a key-based (pem) configuration

I have a server which accepts remote connections only with SSH key auth.
I have a key which is stored in my home directory with a .pem extension,
but when I try to copy a file using the scp command
scp /home/myfilewhichiwannatocopy core@54.32.14.156:/home/core the server asks for a password, which I don't have (by the way, a normal connection using ssh -i /.ssh/mg.service.pem core@54.32.14.156 works fine). How do I make the scp command use key auth?
scp -i /path/to/key.pem somefile.txt user@<machine>:/path
Might I also add, you can consult the man pages https://linux.die.net/man/1/scp
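Applied to the command from the question, with the same key path that already works for plain ssh, that would be something like:

scp -i /.ssh/mg.service.pem /home/myfilewhichiwannatocopy core@54.32.14.156:/home/core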

Jenkins won't use SSH key

I'm sorry to have to ask this question, but I feel like I've tried every answer so far on SO with no luck.
I have my local machine and my remote server. Jenkins is up and running on my server.
If I open up a terminal and do something like scp /path/to/file user@server:/path/to/wherever then my SSH works fine without requiring a password.
If I run this command inside of my Jenkins job, I get 'Host Key Verification Failed'.
So I know my SSH is working correctly the way I want, but why can't I get Jenkins to use this SSH key?
Interesting thing is, it did work fine when I first set up Jenkins and the key, then I think I restarted my local machine, or restarted Jenkins, then it stopped working. It's hard to say exactly what caused it.
I've also tried several options regarding ssh-agent and ssh-add but those don't seem to work.
I verified the local machine's .pub is on the server in the /user/.ssh folder and is also in the authorized_keys file. The folder is owned by user.
Any thoughts would be much appreciated and I can provide more info about my problem. Thanks!
Update:
Per Kenster's suggestion I did su - jenkins, then ssh server, and it asked me to add the host to known hosts. So I thought this was a step in the right direction, but the same problem persisted afterward.
Something I did not notice before: I can ssh server without a password when using my myUsername account, but if I switch to the jenkins user, it asks me for my password when I do ssh server.
I also tried ssh-keygen -R server as suggested to no avail.
Try
su jenkins
ssh-keyscan YOUR-HOSTNAME >> ~/.ssh/known_hosts
The SSH Slaves Plugin doesn't support ECDSA. The command above should add the RSA host key for the ssh slave.
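If you want to be explicit that only the RSA host key should be fetched (since the plugin reportedly only handles RSA), ssh-keyscan takes a -t option:

ssh-keyscan -t rsa YOUR-HOSTNAME >> ~/.ssh/known_hosts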
Host Key Verification Failed
ssh is complaining about the remote host key, not the local key that you're trying to use for authentication.
Every SSH server has a host key which is used to identify the server to the client. This helps prevent clients from connecting to servers which are impersonating the intended server. The first time you use ssh to connect to a particular host, ssh will normally prompt you to accept the remote host's host key, then store the key locally so that ssh will recognize the key in the future. The widely used OpenSSH ssh program stores known host keys in a file .ssh/known_hosts within each user's home directory.
In this case, one of two things is happening:
The user ID that Jenkins is using to run these jobs has never connected to this particular remote host before, and doesn't have the remote host's host key in its known_hosts file.
The remote host key has changed for some reason, and it no longer matches the key which is stored in the Jenkins user's known_hosts file.
You need to update the known_hosts file for the user which Jenkins is using to run these ssh operations: remove any old host key for this host from the file, then add the host's new host key. The simplest way is to use su or sudo to become the Jenkins user, then run ssh interactively to connect to the remote server:
$ ssh server
If ssh prompts you to accept a host key, say yes, and you're done. You don't even have to finish logging in. If it prints a big scary warning that the host key has changed, run this to remove the existing host from known_hosts:
$ ssh-keygen -R server
Then rerun the ssh command.
One thing to be aware of: you can't use a passphrase when you generate a key that you're going to use with Jenkins, because it gives you no opportunity to enter such a thing (seeing as it runs automated jobs with no human intervention).
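For example, a passphrase-less key for the Jenkins user could be generated like this (run while su'd to the jenkins user; the path assumes the default key location):

ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa    # -N "" sets an empty passphrase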

SSH 'command not found' when trying to connect to AWS

I'm new to Amazon web services and have managed to set up an instance.
I already have the ssh directory on my machine at: /usr/bin/ssh
I have also downloaded a PEM key file to my machine and have tried to copy my PEM key file into that directory, but I cannot navigate to it. When I try:
cd /usr/bin/ssh
I get:
-bash: cd: /usr/bin/ssh: Not a directory
When I just try to type the command:
ssh
I get the following:
BEGIN: command not found
: command not found2: MIIEpAIBAAKCAQEAu6JORnapcVdvAwPm+6LVBA3n8chlGU4nE0g9nyD8zSDWlATJpf1Td35tPrxj
: No such file or directory
can anyone help with this?
I'm on OSX Lion 10.8.4 if that helps!
Your problem appears to be related to configuring the ssh keys. First, some clarifications:
/usr/bin/ssh is not a directory, it is the actual secure shell program. Do not modify it. (If you have already destroyed your ssh installation, you would need to restore the installation: http://support.apple.com/kb/PH10763).
ssh will use a public and a private key (keypair) to authenticate. The private key should be stored locally on your computer, generally in the .ssh folder inside your home directory (~/.ssh)
You may have generated the keypair yourself, or have gotten one generated by AWS.
I will assume your .pem file is the private key portion of the keypair, and that you have downloaded that from AWS after following a procedure along the lines of: http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/how-to-have-aws-create-the-key-pair-for-you.html.
In that case, you need to simply:
copy the .pem file into your ~/.ssh directory.
rename it to id_rsa
ensure that you have correctly set permissions for the private key and .ssh directory (ssh is picky), typically 600 for the id_rsa file and 700 for the .ssh directory.
initiate the ssh connection via ssh username@host (the sketch below puts these steps together)
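Put together, and assuming the downloaded file is called mykey.pem and sits in ~/Downloads (both names are illustrative):

mkdir -p ~/.ssh
cp ~/Downloads/mykey.pem ~/.ssh/id_rsa   # copy and rename in one step
chmod 700 ~/.ssh                         # ssh insists on these permissions
chmod 600 ~/.ssh/id_rsa
ssh username@host                        # replace with your EC2 user and host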