Using GitLab deploy keys with write access - SSH

I am currently running CE version 8.17.4 and am attempting to set up a deploy key with write access (available as of 8.16) so that my runner instance may commit build artifacts back to the repository. I took the following steps to set this up:
On the runner instance, I generated the ssh keypair with the command: 
sudo ssh-keygen -t rsa -C "label" -b 4096
The generated keypair was saved to /home/gitlab-runner/.ssh/id_rsa and password protected.
Within GitLab, I created a public deploy key from the admin console, pasted the contents of id_rsa.pub into the appropriate field, and verified that the key fingerprints matched. I checked the "Write access allowed" box.
In the private project that I wanted the runner to have repository access to, I enabled the newly created public deploy key.
This is a LaTeX document repository, so in the .gitlab-ci.yml file, I issue the following script after building the PDF:
after_script:
  - "git commit -am 'autobuild PDF'"
  - "git push origin master"
When the changes were committed, the build ran successfully on the runner up until the git push origin master command, and this error was thrown:
fatal: Authentication failed for 'http://gitlab-ci-token:xxxxxxxxxxxxxxxx@host/project.git/'
OK. A couple of questions:
If the deploy key is just an SSH key, shouldn't it be connecting on the secure port or does this matter? I haven't found much documentation on using this new write-permission deploy key feature, so am I missing something in the steps I took above?
Do I need to include [ci skip] in the commit message to avoid looping CI builds? I saw this concern come up in the original issue tickets for this feature, but did not see whether this step was required or not. 
Thanks for any help!

Jawad's comment worked for me: you need to force SSH. For example:
git remote add ssh_remote git@host:user/project.git
git push ssh_remote HEAD:dev
Thanks, Jawad.
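Applied to the original CI setup, a minimal after_script along these lines should work, assuming the runner's SSH key is usable non-interactively; the remote name, host, and branch are placeholders, and adding [ci skip] to the commit message keeps the push from triggering another pipeline:
after_script:
  - git config user.email "runner@example.com"   # placeholder identity for the CI commit
  - git config user.name "GitLab Runner"
  - git remote add ssh_remote git@host:user/project.git || true   # add the SSH remote if it does not exist yet
  - git commit -am "autobuild PDF [ci skip]"   # [ci skip] prevents a CI loop
  - git push ssh_remote HEAD:master   # push over SSH instead of the HTTP CI-token URL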

Related

cargo generate using ssh results in: Git Error: Failed to authenticate SSH session: ; class=Ssh (23)

I'm currently struggling to get cargo-generate to work properly with SSH.
Generating from GitHub using HTTPS works fine.
I have no issues using git by itself either, but for some obscure reason cargo generate does not play nicely with it.
Here is the error I am presented with, after trying to clone a simple template of mine:
cargo generate git@github.com:VirtualNonsense/rust_bluepill_minimal_template.git
Using application config: C:\Users\VirtualNonsense\.cargo\cargo-generate.toml
Using ssh-identity from application config: $HOME/.ssh/id_rsa
Favorite git@github.com:VirtualNonsense/rust_bluepill_minimal_template.git not found in config, using it as a git repo url
Using private key: `%userprofile%\.ssh\id_rsa` for git-ssh checkout
Error: Git Error: Failed to authenticate SSH session: ; class=Ssh (23)
My cargo-generate.toml file consists only of the following lines:
[defaults]
ssh_identity = "$HOME/.ssh/id_rsa"
I've seen that there seems to be an issue with passphrase-protected keys, so I made sure mine does not have one. I also tried an ed25519 key, but it did not change the result either.
I feel like I'm missing something obvious and would appreciate some guidance😅
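One sanity check worth doing before blaming cargo-generate is to confirm that the key named in cargo-generate.toml authenticates against GitHub on its own. A rough sketch from a Windows command prompt, assuming the built-in OpenSSH client and the paths from the output above:
rem Confirm the key file exists where the config points
dir %userprofile%\.ssh\id_rsa
rem Test SSH authentication against GitHub with that exact key
ssh -i %userprofile%\.ssh\id_rsa -T git@github.com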

How to connect a project from IDEA to Gitlab

Is it possible to connect a project from IDEA to Gitlab?
There are no problems with GitHub: you specify a GitHub account, then VCS -> Import into, and it creates a project in your account. But with GitLab I don't see such a possibility. Is there some way to do this?
Or do I just have to push it there manually?
If all you want to do is create a project, that is as simple as pushing the repo. In GitLab, you don't have to create the project in the UI first. You can simply push directly to a project namespace that does not yet exist. The project will be created when you push to it.
Therefore, all you have to do is configure your git remote per usual, then push.
git init
git checkout -b main
echo "# My Project" > README.md
git add README.md
git commit -m "initial commit"
git push --set-upstream git@gitlab.com:namespace/myproject.git main
Then you'll see the message from the remote in the git console log
remote: The private project namespace/myproject was created.
In the JetBrains IDEs, you can simply configure the remote and push.
You only have to define the remote ahead of time in the terminal. For example:
git remote add origin git@gitlab.com:namespace/project
Then in the IDE, when you go to push, you'll see the ability to push to the new remote/branch.
You'll also see, in the git console tab, the message from the remote confirming that the project has been created.
Full integration for GitLab hasn't been implemented yet, but there is a feature request for that:
https://youtrack.jetbrains.com/issue/IDEA-109294
Meanwhile, you can create a repository in GitLab, then press Cmd/Ctrl+K and click "Commit and Push". In the Push dialog there will be a "Define remote" button; click it and paste the URL of the GitLab repository.
If it's HTTPS, you'll be prompted for a username and password; enter them. Alternatively, you can enter your username and a personal access token (in the password field).
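The same HTTPS-plus-token approach also works from the terminal. A minimal sketch, with the namespace and project name as placeholders:
git remote add origin https://gitlab.com/namespace/myproject.git
git push --set-upstream origin main
# When prompted, enter your GitLab username and paste the personal access token in place of the password.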

Pull a private repo as a dependency during a dokku deploy?

I am using Dokku on DigitalOcean to deploy my_app.
My app has a dependency which points to a private repo git@github.com:my_org/my_app.git.
Step 10 : RUN <some_command_to_install_deps>
---> Running in ceada9d96c61
* Getting my_repo (git@github.com:my_org/my_app.git)
remote: verification failed.
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
I've tried adding deploy keys using https://github.com/cedricziel/dokku-deployment-keys to no avail. Any help is greatly appreciated.
Along with the deployment keys plugin, you will also need the hostkeys plugin to add github.com to the container's known_hosts file. For example, to approve github.com across all applications on the machine, simply do...
Install the plugin...
dokku plugin:install https://github.com/cedricziel/dokku-hostkeys-plugin.git hostkeys-keys
Then run...
sudo dokku hostkeys:shared:autoadd github.com
If you still face issues after doing this, there is something wrong with your deployment keys setup. In that case, leave a comment and I'll help troubleshoot it.
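If you want to verify both pieces from inside the app container, a quick check might look like the following; dokku enter availability depends on your Dokku version, and my_app/web are placeholders:
# Open a shell in the running app container
dokku enter my_app web
# Inside the container: confirm github.com is a known host and the deploy key authenticates
grep github.com ~/.ssh/known_hosts
ssh -T git@github.com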

Jenkins: Publish over SSH after failed build

I am trying to use the Publish Over SSH plugin to publish many kinds of build artifact to an external server. Examples of build artifacts are compiled builds, XML output from testing, and JSON output from linting.
If testing or linting results in errors, the build will fail or be marked unstable. In the case of a failed build, the Publish Over SSH plugin will not copy the build artifacts, writing to the console:
SSH: Current build result is [FAILURE], not going to run.
I see no reason why I wouldn't want to publish this information if it exists, and I would like to continue to report errors as build failures. So, is there any way to force Jenkins to publish build artifacts even if the job is marked as a failure?
I thought I could use the Flexible Publish plugin to force this, by wrapping Publish Over SSH in an "always" condition, but this gave the same output as before on a build failure.
I can think of a couple of work-arounds:
a) store the build status in an environment variable; force the status to SUCCESS; perform the publish step; recover the build status from the environment variable using java -jar jenkins-cli.jar set-build-status $STORED_STATUS
OR
b) Write a bash script to perform the publishing step manually using SSH, cutting out the Publish Over SSH plugin altogether
Before I push forward with either of these solutions (neither of which I like), is there any piece of configuration that I'm missing?
The solution I ended up using was to use rsync/ssh to copy the files manually using a post build script. I configured this in my Jenkins Job Builder YAML like so:
- publisher:
    name: publish-to-archive
    publishers:
      - post-tasks:
          - matches:
              - log-text: ".*"
            script: |
              ssh -i ${{HOME}}/.ssh/id_rsa jenkins@archiver "mkdir -p {archive_path}"
              rsync -Pravdtze "ssh -i ${{HOME}}/.ssh/id_rsa" {source_path} jenkins@archiver:{archive_path}
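Expanded out of the Job Builder template, the equivalent standalone post-build shell step (workaround b from the question) would look roughly like this; the host, key path, and directories are placeholders:
#!/bin/bash
# Copy build artifacts to the archive server regardless of the build result.
ARCHIVE_HOST="jenkins@archiver"                          # placeholder host
ARCHIVE_PATH="/var/archive/${JOB_NAME}/${BUILD_NUMBER}"  # JOB_NAME and BUILD_NUMBER are standard Jenkins variables
SOURCE_PATH="build/"                                     # placeholder artifact directory
ssh -i "${HOME}/.ssh/id_rsa" "${ARCHIVE_HOST}" "mkdir -p ${ARCHIVE_PATH}"
rsync -Pravdtze "ssh -i ${HOME}/.ssh/id_rsa" "${SOURCE_PATH}" "${ARCHIVE_HOST}:${ARCHIVE_PATH}"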
Quoting old hooky on jenkinsci-users:
How can I force Publish Over SSH to work even if the build has been marked
a failure?
Use "Send files or execute commands over SSH after the build runs" in
configuration section "Build environment"
Job configuration / Build Environment / Send files or execute commands over SSH after the build runs
instead of using a post-build or build-step.

New repository, production problem

I have a problem with deploying the project to the production server. We use Capistrano and Passenger. The problem is that we moved the project's repository on GitHub to another account. I changed the repository address in deploy.rb; however, during "cap production deploy", after authenticating to the production server, Capistrano still looks for the old repository, which fails. I suspect I need to change the repository that git points to on production, but I do not know how to do it.
servers: ["85.xxx.xxx.xxx"]
Password:
[85.xxx.xxx.xx] executing command
** [85.xxx.xxx.xx :: err] ERROR: repo/repo.git does not exist. Did you enter it correctly?
** [85.xxx.xxx.xx :: err] fatal: The remote end hung up unexpectedly
command finished in 4220ms
*** [deploy: update_code] rolling back
Try editing shared/cached-copy/.git/config and modify the git repo listed there. If you're using the remote_cache method, it keeps a local git repo and updates that on the remote machine. Repoint that to your new git repo and you should be good to go.
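A rough sketch of what that looks like on the server; the deploy path and the new repository URL are placeholders, so adjust them to match your deploy.rb:
# On the production server, inside the Capistrano shared directory
cd /var/www/my_app/shared/cached-copy
# Point the cached copy at the new repository
git remote set-url origin git@github.com:new_account/repo.git
# Verify the change
git remote -v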