"Host key verification failed" error when running GitHub Actions on self-hosted runner (Windows 10) - authentication

I'm trying to run a simple GitHub Action on my self-hosted runner (Windows 10), but I'm getting the error Host key verification failed. [error]fatal: Could not read from remote repository. Here's the code for the GitHub Action:
name: GitHub Actions Demo
on:
  push:
    branches: ["feature"]
jobs:
  build:
    runs-on: self-hosted
    steps:
      - name: Check out repository code
        uses: actions/checkout@v3
I've verified that the self-hosted runner is properly configured and connected to the repository, and I can manually clone and fetch the repository on the same machine without any issues. I've also tried running the ssh-keyscan command and adding the resulting host key to the known_hosts file, but that doesn't solve the problem.

Instead of running the checkout action directly, try first running a test step:
steps:
  - name: Test SSH access
    run: ssh -Tv git@github.com
The idea is to see which key is communicated, and whether the account used is the same as the one you use for your manual test (when the clone/fetch is working).
The OP ysief-001 then sees (in the comments)
After 1h30m I cancelled the workflow.
The last two lines are
Found key in C:\\Users\\ysief/.ssh/known_hosts:4
read_passphrase: can't open /dev/tty: No such file or directory
That simply means a passphrase-protected (i.e. encrypted) private key is not supported.
You need one without a passphrase. (Or you can remove the passphrase from your existing key.)
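To remove the passphrase in place, something along these lines should work (a sketch, assuming the key is ~/.ssh/id_rsa; adjust the path to your actual key):
ssh-keygen -p -f ~/.ssh/id_rsa -N ""
ssh-keygen will prompt for the current passphrase and then save the key unencrypted (-p changes the passphrase, -N "" sets an empty new one).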

Related

Gitlab CI/CD using ssh / knownhosts error

I'm trying to use GitLab CI/CD to auto-deploy my code after a push to a specific branch (in my case the 'staging' branch).
After pushing to the 'staging' branch I see the following error in the jobs section of the GitLab UI:
Running with gitlab-runner 15.0.0 (xxxxxx)
on deploy xxxxxx
Preparing the "ssh" executor
00:36
Using SSH executor...
ERROR: Preparation failed: ssh command Connect() error: ssh Dial() error: ssh: handshake failed: knownhosts: key is unknown
I can reach GitLab from my VM, and gitlab-runner registered successfully before.
I've also created an SSH key and added it during the gitlab-runner installation steps.
You need to check what SSH URL is used in your case.
Something like git@gitlab.com:me/myProject would look for the gitlab.com SSH host key fingerprints in an ~/.ssh/known_hosts file.
Make sure to first add the following to ~/.ssh/known_hosts on the gitlab-runner server:
gitlab.com ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAfuCHKVTjquxvt6CM6tdG4SLp1Btn/nOeHHE5UOzRdf
gitlab.com ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCsj2bNKTBSpIYDEGk9KxsGh3mySTRgMtXL583qmBpzeQ+jqCMRgBqB98u3z++J1sKlXHWfM9dyhSevkMwSbhoR8XIq/U0tCNyokEi/ueaBMCvbcTHhO7FcwzY92WK4Yt0aGROY5qX2UKSeOvuP4D6TPqKF1onrSzH9bx9XUf2lEdWT/ia1NEKjunUqu1xOB/StKDHMoX4/OKyIzuS0q/T1zOATthvasJFoPrAjkohTyaDUz2LN5JoH839hViyEG82yB+MjcFV5MU3N1l1QL3cVUCh93xSaua1N85qivl+siMkPGbO5xR/En4iEY6K2XPASUEMaieWVNTRCtJ4S8H+9
gitlab.com ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBFSMqzJeV9rUzU4kWitGjeR4PWSa29SPqJ1fVkhtj3Hw9xjLVXVYrU9QlYWrOLXBpQ6KWjbjTDTdDkoohFzgbEY=
That will skip manual fingerprint confirmation in SSH.
In other words, no more "knownhosts: key is unknown".
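You can also populate the file from the command line (a sketch; ssh-keyscan trusts whatever the network returns, so compare its output against GitLab's published fingerprints before relying on it):
ssh-keyscan -t rsa,ecdsa,ed25519 gitlab.com >> ~/.ssh/known_hosts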
Note that with GitLab 15.3 (August 2022), you will have an easier time finding those:
New links to SSH fingerprints
Your GitLab SSH fingerprints are now easier to find, thanks to new links on the SSH configuration page and in the documentation.
Thank you Andreas Deicha for your contribution!
See Documentation and Issue.
For people who still encounter this issue: in our case the cause was a difference between the host name in the known_hosts file and the one in the runner's config.toml file. They must both be fully qualified or both unqualified.
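To illustrate that last point, here is a rough sketch of the SSH-executor section of config.toml (host name and paths are placeholders): the value of host must match the known_hosts entry character for character.
[runners.ssh]
  host = "gitlab.example.com"    # must match the name used in ~/.ssh/known_hosts
  user = "deployer"
  identity_file = "/home/gitlab-runner/.ssh/id_ed25519"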

rundeck SSH Authentication failure

I run Rundeck v4.1.2, using docker-compose.
I have created a test key pair. I have entered the private key into key storage under the path keys/test using the GUI, and configured the target node to require it for SSH access. I have added the public key under /home/rundeck/.ssh/authorized_keys on the target node.
The resources.xml file looks like this:
server18:
  nodename: server18
  hostname: server18.rc-group.local
  osVersion: 18.04
  osFamily: unix
  osArch: amd64
  description: target-test
  osName: Ubuntu
  username: rundeck
  ssh-authentication: privateKey
  ssh-privateKey-storage-path: keys/test
When I try to connect using command line SSH and the same private key, it works fine. So the key is fine, and the target node config is fine.
When, in the GUI, I try to run the "hostname" command on the same target node, I get:
Failed: AuthenticationFailure: Authentication failure connecting to node: "server18". Make sure your resource definitions and credentials are up to date.
Can someone spot what I'm missing?
Use the ssh-key-storage-path attribute instead of ssh-privateKey-storage-path in your node definition; you can see the valid attributes here.
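Applied to the node definition above, the entry would look like this (a sketch keeping your other attributes unchanged; only the key-path attribute is renamed):
server18:
  nodename: server18
  hostname: server18.rc-group.local
  username: rundeck
  ssh-authentication: privateKey
  ssh-key-storage-path: keys/test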

connect bitbucket pipeline to cpanel with API keys

How do I use SSH Keys (created from cPanel) to connect to the server? And eventually pull a fresh copy and run composer updates and database migrations (a Symfony script)
I get permission denied errors, so my ssh example.net.au ls -l /staging.example.net.au is reaching the server; I'm just unsure how to use keys made from cPanel to authenticate.
bitbucket-pipelines.yml
# This is an example Starter pipeline configuration
# Use a skeleton to build, test and deploy using manual and parallel steps
# -----
# You can specify a custom docker image from Docker Hub as your build environment.
image: atlassian/default-image:2
pipelines:
  default:
    - parallel:
        - step:
            name: 'Build and Test'
            script:
              - echo "Your build and test goes here..."
        - step:
            name: 'Lint'
            script:
              - echo "Your linting goes here..."
        - step:
            name: 'Security scan'
            script:
              - echo "Your security scan goes here..."
    # The following deployment steps will be executed for each pipeline run. To configure your steps and conditionally deploy see https://support.atlassian.com/bitbucket-cloud/docs/configure-bitbucket-pipelinesyml/
    - step:
        name: 'Deployment to Staging'
        deployment: staging
        script:
          - echo "Your deployment to staging script goes here..."
          - echo $TESTVAR
          - ssh example.net.au ls -l /staging.example.net.au
    - step:
        name: 'Deployment to Production'
        deployment: production
        trigger: 'manual'
        script:
          - echo "Your deployment to production script goes here..."
I think your SSH set-up may be incorrect. Please try the following to ensure both servers trust each other:
==Part 1==
Step 1. SSH into cPanel server (use PuTTY or your preferred SSH client), and run the following commands to generate a new key:
ssh-keygen
eval $(ssh-agent)
ssh-add
cat ~/.ssh/id_rsa.pub
Step 2. Copy the resulting key from the 'cat' command above, into: Bitbucket -> your repo -> Settings -> Access keys
==Part 2==
Step 3. In Bitbucket, go to your repo -> settings -> SSH keys -> Generate key
Step 4. Back on your cPanel server's SSH connection, copy the key from Step 3 above into the authorized keys file. Save when you are done:
nano ~/.ssh/authorized_keys
Right click to paste (usually)
CTRL+O to save
CTRL+X to exit
Step 5. In the same Bitbucket screen from Step 3, fetch and add the host's fingerprint. You will need to enter the URL or IP address of your cPanel server here. Some cPanel servers use non-default ports. If port 22 is not the correct port, be sure to specify it like so:
example.com:2200
(Port 443 is usually reserved for HTTPS and it is unlikely the correct port for an SSH connection. If in doubt, try the default 22 and common alternative 2200 ports first.)
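Once the keys and fingerprint are in place, the 'Deployment to Staging' step can run the actual delivery. A rough sketch (that a clone already exists at /staging.example.net.au, plus the composer and Symfony console commands, are assumptions based on your description):
- step:
    name: 'Deployment to Staging'
    deployment: staging
    script:
      - ssh example.net.au "cd /staging.example.net.au && git pull && composer install --no-dev && php bin/console doctrine:migrations:migrate --no-interaction"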
Let me know if you have any questions and I am happy to assist you further.

Unable to connect from bitbucket pipelines to shared hosting via ssh

What I need to do is to SSH public server (which is shared hosting) and run a script that starts the deployment process.
I followed what's written here:
I've created a key pair in Settings > Pipelines > SSH Keys
Then I've added the IP address of the remote server
Then I've appended the public key to the remote server's ~/.ssh/authorized_keys file
When I try to run this pipeline:
image: img-name
pipelines:
  branches:
    staging:
      - step:
          deployment: Staging
          script:
            - ssh remote_username@remote_ip:port ls -l
I have the following error:
Could not resolve hostname remote_ip:port: Name or service not known
Please help!
The SSH command doesn't take the ip:port syntax. You'll need to use a different format:
ssh -p port user@remote_ip "command"
(This assumes that your remote_ip is publicly-accessible, of course.)
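Applied to your pipeline, the step would look something like this (a sketch; remote_username, remote_ip and port are the same placeholders from your own file):
    staging:
      - step:
          deployment: Staging
          script:
            - ssh -p port remote_username@remote_ip "ls -l"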

Basic Delivery Using SSH In Bitbucket Pipelines

Here's what I've got so far:
I've generated an SSH key pair inside my repo and also added the public key to my ~/.ssh/authorized_keys on the remote host.
My remote host has root user and password login disabled for security. I put the SSH username I use to log in manually inside an environment variable called SSH_USERNAME.
Here's where I'm just not sure what to do. How should I fill out my bitbucket-pipelines.yml?
Here is the raw contents of that file... What should I add?
# This is a sample build configuration for JavaScript.
# Check our guides at https://confluence.atlassian.com/x/14UWN for more examples.
# Only use spaces to indent your .yml configuration.
# -----
# You can specify a custom docker image from Docker Hub as your build environment.
image: samueldebruyn/debian-git
pipelines:
  branches:
    master:
      - step:
          script: # Modify the commands below to build your repository.
            - sftp $FTP_USERNAME@192.241.216.482
First of all: you should not add a key pair to your repo. Credentials should never be in a repo.
Defining the username as an environment variable is a good idea. You should do the same with the private key of your key pair. (But you have to Base64-encode it (see the Bitbucket Pipelines documentation) and mark it as secured, so it is not visible in the repository settings.)
Then, before you actually want to connect, you have to make sure the private key (of course, Base64-decoded) is known to your pipeline’s SSH setup.
Basically, what you need to do in your script (either directly or in a shell script) is:
- echo "$SSH_PRIVATE_KEY" | base64 --decode > ~/.ssh/id_rsa
- chmod go-r ~/.ssh/id_rsa
BTW, I'd suggest also putting the host's IP in an env variable.
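Putting it together, a minimal sketch of the step (SSH_PRIVATE_KEY, SSH_USERNAME and SSH_HOST are assumed to be secured repository variables, the build image is assumed to ship an OpenSSH client, and the final command is just a placeholder for your delivery script):
pipelines:
  branches:
    master:
      - step:
          script:
            # Recreate the private key from the Base64-encoded secured variable
            - mkdir -p ~/.ssh
            - echo "$SSH_PRIVATE_KEY" | base64 --decode > ~/.ssh/id_rsa
            - chmod go-r ~/.ssh/id_rsa
            # Add the host to known_hosts so SSH does not prompt interactively
            - ssh-keyscan -H "$SSH_HOST" >> ~/.ssh/known_hosts
            # Run the delivery command on the remote host
            - ssh "$SSH_USERNAME@$SSH_HOST" "echo connected"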