SSH authentication in Artifactory

I tried reading the Artifactory user guide, but the instructions on SSH authentication were not clear. Can someone explain how to do SSH authentication in Artifactory?

Actually, enabling SSH on Artifactory is fairly straightforward; the client is what may require some additional debugging if it is unable to connect for any reason. The steps for enabling SSH on Artifactory are available in the online documentation for SSH Integration. You simply need to create a key pair on any machine with ssh-keygen installed (most Linux distros have it by default), then click Admin, select Security -> SSH Server, click Enable SSH, and add the private and public key you just created. Select a port, set the custom base URL if necessary, and Save.
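For example, the server key pair can be generated like this (the file name is just an example):
ssh-keygen -t rsa -b 4096 -f artifactory-server-key
This produces artifactory-server-key (the private key) and artifactory-server-key.pub (the public key), which are the two values to paste into the SSH Server page.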
Next, any user in Artifactory who wishes to authenticate with SSH needs to add their public key to their profile. This can be done by logging in and clicking your username in the top right corner of Artifactory. Under this section, you will need to enter your password again, and then you can simply paste the public key into the SSH section; you can read about this process in Updating Your Profile.
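Assuming you already have a personal key pair (generate one with ssh-keygen if you don't), the value to paste is the contents of your public key file, for example:
cat ~/.ssh/id_rsa.pub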
That's it: Artifactory is now ready for SSH for that particular user, and any other user can add their public key to their profile to use SSH authentication as well.
Configuring the client depends on which client you are attempting to set up. The most common use case is Git LFS, so I will share some documentation for setting up Git LFS with SSH to Artifactory.
Most of what you need to set up Git LFS can be found in JFrog's Git LFS Repository Authenticating with SSH documentation, or in JFrog's public solution on Git LFS Authentication. The latter contains an example of what the git config file should look like, and also contains relevant information on setting up SSH authentication with an nginx reverse proxy (if you have one configured and running).
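As a rough sketch of the client side, the repository's .lfsconfig just needs to point at the SSH endpoint; here assuming Artifactory's SSH port is 1339 and the LFS repository key is lfs-local (both hypothetical values, check the docs linked above for the exact URL format):
git config -f .lfsconfig lfs.url "ssh://artifactory.example.com:1339/artifactory/api/lfs/lfs-local"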
If this doesn't answer your question, could you please provide some more details on which client you are using to authenticate and specifically what is not working (any relevant error messages or log output), from both Artifactory and the client side?

Related

Automatic push on commit (to a Bitbucket repository)

I have a website that commits all changes it makes to a local git repository.
I need to automatically push them directly from the web server to a remote Bitbucket repository.
The authentication method I use is a passwordless SSH key stored on the web server, but I think this might not be very secure.
So the question is... do you know a more secure (or simply better) method to automatically push changes from a web server to a private Git repository on Bitbucket?
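For context, the setup described above usually amounts to a git hook along these lines (remote and branch names are assumptions):
#!/bin/sh
# .git/hooks/post-commit on the web server: push every new commit
git push origin master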

Automated authentication to Private Docker Registry

I have a private docker registry hosted on GitLab.
I need to support accessing this registry automatically (via GitLab CI), but I don't want to use developer credentials (that's insecure, and they would need to be changed every time a dev leaves the company).
How are others authenticating?
Do you create an "API Account" to authenticate with? Docker doesn't seem to support service account keys or other methods of authentication.
Thank you
Edit:
GitLab CI ssh registry login
The accepted answer here answered the question for GitLab.
However, I would like to know if there are any alternatives, since this still only allows ephemeral keys while doing the deployment via GitLab CI.
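For reference, the GitLab CI login referenced above boils down to something like this, using GitLab's predefined CI variables and the ephemeral job token (a sketch, not a drop-in):
docker login -u "$CI_REGISTRY_USER" -p "$CI_JOB_TOKEN" "$CI_REGISTRY"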

FTP through FileZilla to a Google Cloud machine, can't achieve it

Before asking this question I looked through Google and tried different alternatives, none of which were successful for me, sadly. I'm a little above the noob level. What I want is basically to host a WordPress site on a Google Cloud Debian machine.
I was doing well installing services through their SSH access until I got to the point where I installed an FTP service and wanted to access it from a remote computer (my own). I only got as far as:
Status: Waiting to retry...
Status: Connecting to 104.197.183.19...
Response: fzSftp started
Command: open "root@104.197.183.19" 22
Error: Connection timed out
Error: Could not connect to server
I kept on looking and trying new ways until I found the gcloud documentation for FTP, but it is not aimed at newcomers, so my questions are:
Where do I input the gcloud commands, on my computer or in the SSH console (the Google Cloud machine)?
Do I need to use gcloud for remote FTP access, or can I do it entirely through my computer and their SSH machine?
Do I really need to add an SSH authorization file to FileZilla, or is there a way I can disable that check on my VPS so it lets me sign in with just a username and a password?
What I already tried that didn't work for me:
The gcloud documentation for SSH and FTP
The Google Cloud documentation for setting up a WordPress site
Many others
Basically, what I need in short is to manage to access the VPS through FTP so I can continue with my learning. I've been stuck there for two days.
To get access to a user's public area, i.e. public_html:
Go to the account's cPanel area, and under Security > SSH Access you can import a key file.
You can use PuTTYgen to make one; you will need both a private and a public key.
Paste the keys into the boxes.
You may get a warning message about the private key; this is OK.
Go to Manage under the public key and authorize it.
Or
Make one using the interface in cPanel and download both keys.
Then in FileZilla:
Host: IP of the server
Protocol: SFTP
Logon Type: Key File
Key File: the PPK you made
(If you asked cPanel to make the file, select the one that does not end in .pub; FileZilla will convert it to a .ppk file for you.)
After clicking Connect you should be in.
If you still get an error, make sure the SSH port (22) is open in your firewalls, both in Google Cloud (cloud.google.com > Networks) and in the WHM > LFD/CSF plugin.
Use the SSH File Transfer Protocol (SFTP).
There is no need to install an FTP service.
Use WinSCP to connect with SFTP.
The recommended way of transferring files to a Unix-based Google Compute Engine VM is via the gcloud compute copy-files command. For this, please install the Google Cloud SDK. Then, run a command such as the following:
gcloud compute copy-files --zone=<Compute Engine zone> /path/to/local/file.txt <Compute Engine instance name>:/path/to/destination/file.txt
If you'd like to use FileZilla, you'll have to configure it for access. The SSH daemon on Compute Engine VMs is set up for key-based authentication. This forum post indicates how this is possible in FileZilla. The catch is that you need to put your public key on the VM, which can be a little tricky. gcloud compute copy-files and gcloud compute ssh take care of this for you, which is why they are the recommended method.
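Similarly, for an interactive shell, gcloud can set up the keys for you:
gcloud compute ssh --zone=<Compute Engine zone> <Compute Engine instance name>
The key pair it generates (by default ~/.ssh/google_compute_engine) can then be reused for other SFTP clients such as FileZilla.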

passwordless ssh authentication using active directory

Our current infrastructure uses ssh keys for passwordless login to our Linux servers.
As our infrastructure grows, managing these authorised keys is getting harder.
As we also have an Active Directory (AD) server, I would like to authenticate the users over ssh using this mechanism, but maintain the passwordless nature of ssh keys.
Is it possible to authenticate the users over ssh without password, using some AD mechanism?
This is usually done via SSH key certificates in order to keep the passwordless nature and at the same time have a central authority that can be trusted to generate new certificates for each account.
Using LDAP/Active Directory at login is not advised: apart from requiring passwords, it also becomes a single point of failure for access to any system it manages.
See the Red Hat documentation on how to do this, and also Facebook's good write-up on their use of certificate authentication with SSH.
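As a minimal sketch of that certificate flow, with all file and user names hypothetical:
# on the trusted CA host: sign alice's public key, valid for 52 weeks
ssh-keygen -s ca_key -I alice -n alice -V +52w alice.pub
# this writes alice-cert.pub next to alice.pub
# on each server, trust the CA by adding to /etc/ssh/sshd_config:
#   TrustedUserCAKeys /etc/ssh/ca_key.pub
Servers then accept any user certificate signed by the CA, so no per-user authorized_keys management is needed.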
Option 1
This is a good article explaining how to do this:
Storing SSH keys in Active Directory for easy deployment
Basically, it allows people to publish their public keys to your Active Directory, and you can then set up a cron script on your servers to fetch a copy of the public keys every 5 minutes or so.
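A sketch of what such a fetch script might look like, assuming the keys live in an sshPublicKey attribute and using placeholder LDAP details (host, base DN, account name):
#!/bin/sh
# rebuild a user's authorized_keys from the keys published in AD
ldapsearch -x -H ldap://ad.example.com -b "dc=example,dc=com" \
  "(sAMAccountName=alice)" sshPublicKey \
  | sed -n 's/^sshPublicKey: //p' > /home/alice/.ssh/authorized_keys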
Option 2
You could also use a file server that holds all your keys and have each server fetch them from there with a cron script. Obviously, you need a way to verify each key's authenticity, especially if you are using FTP or some other insecure protocol. This could be achieved using GPG: you could have a company master GPG key that signs all the employee keys.
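The signing and verification step could look roughly like this (file names are examples):
# sign an employee's public key with the company master GPG key
gpg --detach-sign alice.pub            # produces alice.pub.sig
# on each server, verify the signature before trusting the key
gpg --verify alice.pub.sig alice.pub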
Personally, I like option 2 the best because I think it is more secure, but either method should work. Hope this helps!
My approach would be to reduce the problem to an already solved one:
1. Use Active Directory to authenticate without a password and establish an HTTPS connection using Kerberos. The DZone tutorial Configuring Tomcat 7 Single Sign-on with SPNEGO might be a good starting point for that approach.
2. Wrap SSH in the HTTPS protocol; see the section "Wrapping SSH in HTTP(S)" at https://unix.stackexchange.com/questions/190490/how-to-use-ssh-over-http-or-https
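For step 2, the wrapping is commonly done with a ProxyCommand helper such as proxytunnel; a sketch with hypothetical host names:
ssh -o ProxyCommand="proxytunnel -E -p proxy.example.com:443 -d %h:22" user@ssh.example.com
Here %h expands to the target host, -p names the HTTPS proxy, and -E encrypts the connection to it.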

How to access a remote server

I want to create a repository on the remote server.
The access details that I have:
(a) IP address (of server)
(b) username/pw
I am following this tutorial and am stuck at the first step: "Initial access to mercurial-server".
I am not able to understand the "ssh connection" syntax (especially the my-key part).
How could I connect to the remote server (using ssh-agent) in order to create a new repo?
This is the same problem we see again and again: mercurial-server isn't part of Mercurial. It's a separate, third-party, not generally necessary piece of software that tries to make Mercurial administration easier without really succeeding.
Start here: https://www.mercurial-scm.org/wiki/PublishingRepositories/
and pick the type of access you want, http or ssh, and then use either hgweb.cgi + Apache (for http) or nothing at all if you just want to use ssh.
Specifically, for any server that has the Mercurial client on it (apt-get install mercurial on Debian or Ubuntu, yum install mercurial on Red Hat, Fedora, or CentOS), you don't need any extra software at all for hosting Mercurial repositories over ssh. You can just do:
hg clone myLocalrepo ssh://you@thatserver/myRemoteRepo
and poof, you're hosting there.
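From any other machine, the new repository can then be cloned back the same way (same hypothetical names):
hg clone ssh://you@thatserver/myRemoteRepo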