How to use ssh identity file with GitHub Actions

I'm in the throes of setting up a GitHub Action that should run an SSH command to connect to a private server. The connection settings I have for the private server specify an identityFile, which I do own. After this connection I will then run a ProxyCommand, so for context this is essentially a connection to a bastion.
What I cannot quite figure out at this point is which GitHub Action supports this configuration, and how. I see the inputs on this one (similar to others): https://github.com/appleboy/ssh-action/blob/master/action.yml and there is no mention of an identityFile property. Is there another way to do this, or an ssh command that can make it possible?
Would appreciate some pointers, thanks!

If you need some explanation of how to write your action, you can read this article: How to create Github Actions to run tests with docker services.
You just have to create your workflow file and use the appleboy action under the steps keyword:
- name: executing remote ssh commands using password
  uses: appleboy/ssh-action@master
  with:
    host: ${{ secrets.HOST }}
    username: ${{ secrets.USERNAME }}
    key: ${{ secrets.KEY }}
    key_path: ${{ secrets.KEY_PATH }}
    password: ${{ secrets.PASSWORD }}
    port: ${{ secrets.PORT }}
    script: whoami
With the script line, you can execute whatever you want on the server, connecting with the parameters set above. For multiple lines, do it like this:
script: |
  pwd
  ls -al
Hope it will help.
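For the identityFile and bastion part of the original question: the key input takes the contents of your identity file (or key_path a path to it), and the action also exposes proxy_* inputs for jumping through a bastion; check the action.yml linked above for the exact input names. A minimal sketch, assuming hypothetical secrets PROXY_HOST, PROXY_USERNAME and PROXY_KEY hold the bastion details:
- name: run command behind a bastion (sketch)
  uses: appleboy/ssh-action@master
  with:
    host: ${{ secrets.HOST }}                  # final target behind the bastion
    username: ${{ secrets.USERNAME }}
    key: ${{ secrets.KEY }}                    # contents of the identityFile
    proxy_host: ${{ secrets.PROXY_HOST }}      # the bastion itself
    proxy_username: ${{ secrets.PROXY_USERNAME }}
    proxy_key: ${{ secrets.PROXY_KEY }}
    script: whoami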

Related

Giving container the same permissions with the workflow in GitHub actions

I am using workload identity federation to provide some permissions to my workflow.
This seems to be working fine
- name: authenticate to gcp
  id: auth
  uses: 'google-github-actions/auth@v0'
  with:
    token_format: 'access_token'
    workload_identity_provider: ${{ env.WORKLOAD_IDENTITY_PROVIDER }}
    service_account: ${{ env.SERVICE_ACCOUNT_EMAIL }}
- run: gcloud projects list
i.e. the gcloud projects list command is successful.
However, in a next step I am running the same command in a container
- name: run container
  run: docker run my-image:latest
and the process fails (I don't have access to the logs for the moment, but it definitely fails).
Is there a way to make the created container have the same auth context as the workflow?
Do I need to bind mount some generated token, perhaps?
Export the credentials (an option provided by the auth action):
- name: authenticate to gcp
  id: auth
  uses: 'google-github-actions/auth@v0'
  with:
    token_format: 'access_token'
    workload_identity_provider: ${{ env.WORKLOAD_IDENTITY_PROVIDER }}
    service_account: ${{ env.SERVICE_ACCOUNT_EMAIL }}
    create_credentials_file: true
Make the credentials file readable:
# needed in the docker volume creation so that it is read
# by the user with which the image runs (not root)
- name: change permissions of credentials file
  shell: bash
  run: chmod 775 $GOOGLE_GHA_CREDS_PATH
Mount the credentials file and perform a gcloud auth login using this file in the container:
- name: docker run
  run: |
    docker run \
      -v $GOOGLE_GHA_CREDS_PATH:${{ env.CREDENTIALS_MOUNT_PATH }} \
      --entrypoint sh \
      ${{ env.CLUSTER_SCALING_IMAGE }} \
      -c "gcloud auth login --cred-file=${{ env.CREDENTIALS_MOUNT_PATH }} && do whatever"
The entrypoint can of course be modified accordingly to support the case above.
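As an alternative sketch (not part of the original answer): because token_format: 'access_token' makes the auth step expose an access_token output, you could try handing the token to the container directly as an environment variable; recent gcloud releases can read it from CLOUDSDK_AUTH_ACCESS_TOKEN, though you should verify this against the gcloud version baked into your image:
- name: run container with the workflow's access token (sketch)
  run: |
    docker run \
      -e CLOUDSDK_AUTH_ACCESS_TOKEN=${{ steps.auth.outputs.access_token }} \
      my-image:latest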

github actions: SSH into droplet and run code

I want to deploy a GitHub project automatically through GitHub Actions when I push my code to GitHub. My yaml file looks like this:
name: push-and-deploy-to-server
on:
  push:
    branches: [ main ]
jobs:
  build-and-deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - uses: appleboy/scp-action@master
        with:
          host: ${{ secrets.SSH_HOST }}
          port: 22
          username: ${{ secrets.SSH_USERNAME }}
          key: ${{ secrets.SSH_PRIVATE_KEY }}
          source: "."
          target: "."
      - uses: appleboy/ssh-action@master
        with:
          host: ${{ secrets.SSH_HOST }}
          port: 22
          username: ${{ secrets.SSH_USERNAME }}
          key: ${{ secrets.SSH_PRIVATE_KEY }}
          script: |
            npm install
            pm2 restart index.js
I have a server with an SSH keypair. The public key is added to the server's authorized_keys, and I can SSH into the server from my terminal.
When I push code to the github repo, the action runs. I get the following error:
drone-scp error: ssh: handshake failed: ssh: unable to authenticate, attempted methods [none publickey], no supported methods remain
The weird thing is: after this error, I'm not able to SSH into my server anymore; even from my own terminal I get "Permission denied (publickey)". So before running the GitHub Action everything works, and after that it fails.
The IP address of the server is set in SSH_HOST, the username I use to SSH into the server is set in SSH_USERNAME, and the private key (the same one I use on my local laptop to SSH into the server) is set in SSH_PRIVATE_KEY.
Has anyone encountered the same problem before? I really have no clue what's going on here.
Edit: extra information: it's a private repository.

Bitbucket ssh pipeline fails - Missing or empty command string

I am trying to SSH to my server, pull the code, and do some configuration work each time code is pushed to the master branch. I defined all of the repository variables used in this yaml file.
I also added an SSH key, added the host to the list of known hosts, and fetched the fingerprint.
This is my bitbucket-pipelines.yml file:
image: atlassian/default-image:2
pipelines:
  branches:
    master:
      - step:
          script:
            - name: "SSH Deploy to production web"
            - pipe: atlassian/ssh-run:0.2.6
              variables:
                SSH_USER: $SSH_USER
                SERVER: $SSH_SERVER
                COMMAND: $SSH_COMMAND
                PORT: $SSH_PORT
The error I get is: Missing or empty command string.
I checked my yml file using bitbucket validator and everything seems to be OK.
I would appreciate any help since I just started using bitbucket pipelines.
name isn't a property of script.
Refactor it to be:
master:
  - step:
      name: "SSH Deploy to production web"
      script:
        - pipe: atlassian/ssh-run:0.2.6
          variables:
            SSH_USER: $SSH_USER
            SERVER: $SSH_SERVER
            COMMAND: $SSH_COMMAND
            PORT: $SSH_PORT

How can I extract secrets using GitHub Actions?

I have a fairly basic scenario. I made a dedicated ssh key for this purpose and added it to my repository secrets.
Code gets pushed to master
The GitHub Action uploads it to the server using ssh by doing echo "${{ secrets.SSH_KEY }}" > key.
After that I can use this key to connect to my server, e.g. ssh -i key devops@myserver.com lsb_release -a
The problem is that for some reason GitHub Actions cannot write it to a file; it writes the characters *** into the file instead of the actual secret value. Therefore I obviously cannot connect to my server.
How can I connect with ssh using this secret? Is there a way to connect without using a file? Can someone who did this common scenario using GitHub actions shed some light?
GitHub Actions should be able to write a secret to a file this way. The reason you see the stars is that the log is filtered, so if a secret would be logged, it's replaced in the log with three asterisks instead. This is a security measure against an accidental disclosure of secrets, since logs are often publicly available.
However, it's a good idea to avoid writing the secret to the log anyway if possible. You can write your command like this so you don't write the secret to the log:
- run: 'echo "$SSH_KEY" > key'
  shell: bash
  env:
    SSH_KEY: ${{secrets.SSH_KEY}}
All you'll see in the log is echo "$SSH_KEY" > key, not the secret or any asterisks.
Note that you do want quotes here, since the > character is special to YAML.
If this doesn't work to log into your server, there's likely a different issue; this technique does work for writing secrets in the general case.
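For completeness, here is a sketch of the whole pattern (my addition, not part of the original answer): ssh refuses a key file that other users can read, so a chmod 600 is usually needed, and the accept-new option just avoids a manual known_hosts step; devops@myserver.com is the host from the question.
- name: ssh using a key written from a secret (sketch)
  shell: bash
  env:
    SSH_KEY: ${{ secrets.SSH_KEY }}
  run: |
    echo "$SSH_KEY" > key
    chmod 600 key    # ssh rejects private keys readable by other users
    ssh -i key -o StrictHostKeyChecking=accept-new devops@myserver.com 'lsb_release -a'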
I found a way to do this. We used GatsbyJS for a project and it relies on a .env.production file for the env variables. I tried to pass them as env: to the GitHub Action, but that didn't work and they were ignored.
Here is what I did. I base 64 encoded the .env.production file:
base64 -i .env.production
Added the output to a secret (ENV_PRODUCTION_FILE) in the GitHub repository. Then in my action I do:
echo ${{ secrets.ENV_PRODUCTION_FILE }} | base64 -d > .env.production
This way the contents of my .env.production file ended up being written to the machine that executes the GitHub Action.
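If it helps, one way to create that secret from the command line (a sketch assuming the GitHub CLI is installed; the secret name matches the snippet above):
base64 -i .env.production | gh secret set ENV_PRODUCTION_FILE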
I used sed in a GHA to replace a TOKEN in a file, like this:
run: |-
  sed -i "s/TOKEN/${{secrets.MY_SECRET}}/g" "thefile"
The file looks like this:
credentials "app.terraform.io" {
  token = "TOKEN"
}
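One caveat worth noting (my addition): if the secret may contain / characters, the s/TOKEN/.../g expression breaks; using another sed delimiter avoids that:
run: |-
  sed -i "s|TOKEN|${{secrets.MY_SECRET}}|g" "thefile"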
Encode it and decode it back:
- run: 'echo "$SSH_KEY" | base64'
  shell: bash
  env:
    SSH_KEY: ${{ secrets.PRIVATE_KEY }}
and decode it back with echo "<encoded string>" | base64 -d
Here is how to solve your actual problem of securely logging into an SSH server using a secret stored in GitHub Actions, named GITHUB_ACTIONS_DEPLOY.
Let's call this "beep", because it will cause an audible bell on the server you log in to. Maybe you use this to literally ping a server in your house when somebody pushes code to your repo.
- name: Beep
  # if: github.ref == 'refs/heads/XXXX' # Maybe limit only this step to some branches
  run: |
    eval $(ssh-agent)
    ssh-add - <<< "$SSH_KEY"
    echo "* ssh-rsa XXX" >> /tmp/known_hosts # Get from your local ~/.ssh/known_hosts or from ssh-keyscan
    ssh -o UserKnownHostsFile=/tmp/known_hosts user@example.com "echo '\a'"
  env:
    SSH_KEY: ${{ secrets.PMT_GITHUB_ACTIONS_DEPLOY }}
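If you would rather generate /tmp/known_hosts in the workflow instead of copying a line from your local ~/.ssh/known_hosts, something like this should work (example.com stands in for your real host):
ssh-keyscan -t rsa example.com >> /tmp/known_hosts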
If you are actually using SSH as part of an rsync push task, here is how to do that:
- name: Publish
  if: github.ref == 'refs/heads/XXX'
  run: |
    eval $(ssh-agent)
    ssh-add - <<< "$SSH_KEY"
    echo "* ssh-rsa XXX" >> /tmp/known_hosts
    rsync $FROM user@server:
  env:
    SSH_KEY: ${{ secrets.GITHUB_ACTIONS_DEPLOY }}
    RSYNC_RSH: "ssh -o UserKnownHostsFile=/tmp/known_hosts"
You can dump a secret character by character using the Python shell, like this:
- name: Set env as secret
  env:
    MY_VAL: ${{ secrets.SUPER_SECRET }}
  run: |
    import os
    data = open("file", "w")
    for q in os.getenv("MY_VAL"):
        print(q)
        data.write(q)
    data.close()
  shell: python
This will both print each character to stdout and store them in a file called file. stdout will have output like this, while file should have the secret string saved inside.
s
e
c
r
e
t
A good solution is to use gpg to encrypt the key, add it to the repo, and decrypt it on the server using a passphrase. The passphrase should of course be stored as a GitHub project secret.
More info on how I did it is here: https://help.github.com/en/actions/automating-your-workflow-with-github-actions/creating-and-using-encrypted-secrets
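Roughly what that looks like (a sketch adapted from the GitHub docs linked above; the file names and the KEY_PASSPHRASE variable are placeholders):
# encrypt locally and commit my_key.gpg to the repository
gpg --symmetric --cipher-algo AES256 my_key
# decrypt wherever the key is needed, with the passphrase stored as a secret
gpg --quiet --batch --yes --decrypt --passphrase="$KEY_PASSPHRASE" --output my_key my_key.gpg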
Another way to see a secret's value despite the log masking is to print it with a space inserted after every character:
echo ${{secrets.AWS_ACCESS_KEY_ID}} | sed 's/./& /g'

BitBucket deployment using SSH keys to remote server

I am trying to write a YAML pipeline script to deploy files that have been altered from my bitbucket repository to my remote server using ssh keys. The document that I have in place at the moment was copied from bitbucket itself and has errors:
pipelines:
  default:
    - step:
        name: Deploy to test
        deployment: test
        script:
          - pipe: atlassian/sftp-deploy:0.3.1
          - variables:
              USER: $USER
              SERVER: $SERVER
              REMOTE_PATH: $REMOTE_PATH
              LOCAL_PATH: $LOCAL_PATH
I am getting the following error
Configuration error
There is an error in your bitbucket-pipelines.yml at [pipelines > default > 0 > step > script > 1]. To be precise: Missing or empty command string. Each item in this list should either be a single command string or a map defining a pipe invocation.
My ssh public and private keys are set up in Bitbucket, along with the fingerprint and host. The variables have also been set up.
How do I go about setting up my YAML deploy script to connect to my remote server via ssh and transfer the files?
Try updating the variables section to become:
- variables:
  - USER: $USER
  - SERVER: $SERVER
  - REMOTE_PATH: $REMOTE_PATH
  - LOCAL_PATH: $LOCAL_PATH
Here is an example of how to set variables: https://confluence.atlassian.com/bitbucket/configure-bitbucket-pipelines-yml-792298910.html#Configurebitbucket-pipelines.yml-ci_variablesvariables
Your - step directive has to be indented.
I have a bitbucket-pipelines.yml like this (using rsync instead of ssh):
# This is a sample build configuration for PHP.
# Check our guides at https://confluence.atlassian.com/x/e8YWN for more examples.
# Only use spaces to indent your .yml configuration.
# -----
# You can specify a custom docker image from Docker Hub as your build environment.
image: php:7.2.1-fpm
pipelines:
  default:
    - step:
        script:
          - apt-get update
          - apt-get install zip -y
          - apt-get install unzip -y
          - apt-get install libgmp3-dev -y
          - curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
          - composer install
          - cp .env.example .env
          #- vendor/bin/phpunit
          - pipe: atlassian/rsync-deploy:0.2.0
            variables:
              USER: $DEPLOY_USER
              SERVER: $DEPLOY_SERVER
              REMOTE_PATH: $DEPLOY_PATH
              LOCAL_PATH: '.'
I suggest using their online editor in the repository for editing bitbucket-pipelines.yml; it checks the formal yml structure and you can't commit an invalid file.
Even if you check the file in some other yaml editor it may look fine, but not necessarily according to the Bitbucket specification. Their online editor does a fine job.
Also, I suggest visiting the Atlassian community, as it's very active and sometimes their staff members provide answers.
However, I struggle with the many dependencies needed to run tests properly (the actual bitbucket-pipelines.yml is becoming bigger and bigger).
Maybe there is some nicely prepared Docker image for this job.