We have a 'master' Mercurial server on our network that we use for a local staging box. Our team does all of our pushes and pulls to/from this one box. I'm having trouble with the implementation I'm using, but I'm also second guessing whether what I want to do is even a good idea...
We also want to start using BitBucket, but only as a secondary server. I'd like to use a hook to automatically push to Bitbucket, but I can't get it working right...
Here's the HGRC from the 'master' repo:
[hooks]
changegroup =
changegroup.update = hg update
changegroup.bitbucket = hg push ssh://hg@bitbucket.org/account/repo
If I manually fire off the above push, everything works perfectly. However, as a hook it fails:
warning: changegroup.bitbucket hook exited with status 255
I followed this guide to get SSH working: Set up SSH for Git and Mercurial on Mac OSX/Linux
I get my keys generated, I run ssh-agent, and I ssh-add the key. But ssh-agent doesn't seem to be doing anything, and as soon as I exit the SSH session it seems to leave memory. Additionally, when I test it out with ssh -Tv hg@bitbucket.org it prompts me for my password. I thought the whole point of this was for it not to do that?
But taking a step back, maybe this is a terrible idea to begin with. If I give my public key to Bitbucket, wouldn't that theoretically mean that if someone got hold of it, they could SSH into my box without a password?
And if so, what alternative do I have for forwarding commits to Bitbucket? I'd rather not use HTTPS, because that would mean putting our Bitbucket password in plain text in the .hg/hgrc file...
Maybe there's some more obvious way to do this that I'm missing? For the developers, I'd rather keep things the way they are now (everyone pushes to master) instead of reconfiguring every developer box to have a private key and push to Bitbucket instead...
As always, thanks for any help you guys can provide.
Woah, there are a lot of questions there. I'll hit a few of 'em:
But ssh-agent doesn't seem to be doing anything, and as soon as I exit the SSH session it seems to leave memory.
You're correct: ssh-agent is for interactive sessions, not for automation. In most setups it's killed when you log out, but even if it weren't, it wouldn't work the way you imagine, because when someone does that hg push they're starting a new, non-interactive session that wouldn't have access to your ssh-agent anyway.
Additionally, when I test it out with ssh -Tv hg@bitbucket.org it prompts me for my password.
Testing it out like that isn't valid. That command says "I want to log into an interactive shell session at Bitbucket as the user hg", but that's not what they authorize you to do. If you send them your public key, they let you log in as the user hg only for the purpose of running non-interactive hg commands.
If I give my public key to Bitbucket, wouldn't that theoretically mean that if someone got hold of it, they could SSH into my box without a password?
No, public keys are meant to be public -- you can list anyone's on GitHub, for example. A public key just says "anyone who holds the private key that matches this is authorized to...", so any site that asks for your private key is run by crooks, but a site that asks for your public key is just offering you something better than a password.
One thing you may be missing about hooks is "who" the hook runs as. When people push to your "centralish" repo over ssh, the hook runs as their unix user; if they push over http, it runs as the web server's user.
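If you're not sure which user that is on your box, a quick throwaway diagnostic (my suggestion, not part of the eventual fix) is to log it from a hook:

[hooks]
changegroup.whoami = whoami >> /tmp/hook-user.log

Push once, then look at /tmp/hook-user.log to see which account the changegroup hooks actually run under.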
If you had:
a private ssh key with no password on it
the public key matching that private key setup on bitbucket
the unix user running the hook using that private key for access to bitbucket.org
then what you're trying to do would work.
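Putting those three things together, a minimal sketch of the setup on the master box might look like this (the key name bitbucket_deploy, the home directory, and the repo URL are placeholders for your own values):

$ # as the unix user that owns the master repo and runs the hooks
$ ssh-keygen -t rsa -N "" -f ~/.ssh/bitbucket_deploy   # a passwordless private key
$ cat ~/.ssh/bitbucket_deploy.pub                      # add this public key on Bitbucket

Then point Mercurial at that key in the master repo's .hg/hgrc, so the hook never needs an agent:

[ui]
ssh = ssh -i /home/hguser/.ssh/bitbucket_deploy -o BatchMode=yes

[hooks]
changegroup.bitbucket = hg push ssh://hg@bitbucket.org/account/repo

BatchMode=yes just makes ssh fail fast instead of hanging on a prompt if the key setup is ever wrong.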
Related
As in the title. I WANT to type in my password every time I do a git push or pull. Currently, it prompts me to enter the passphrase once per restart and then caches it, but I don't want that.
I'm not sure what I did to cause this; it never happened before on any of my other machines over the years.
With SSH, only a passphrase (if you have created a private SSH key protected with a passphrase) would be asked, then cached by the ssh-agent.
Typing a password would mean using an HTTPS URL (one requiring your remote GitHub or GitLab user account name, and password or token).
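So the first thing to check is which kind of remote the clone uses, and then make sure no agent is silently caching the passphrase; a sketch, assuming an OpenSSH agent:

$ git remote -v   # git@... means SSH; https://... means password/token auth
$ ssh-add -D      # flush any identities the running ssh-agent has cached

If this is macOS, also check ~/.ssh/config for a UseKeychain yes line, which stores the passphrase in the Keychain so you're never re-prompted.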
I am trying to avoid typing the password every time I want to pull from or push to a GitLab repository, so I followed https://docs.gitlab.com/ee/ssh/. But every time I want to pull something, it still asks me for the password for my remote GitLab repository.
Any clue on how to fix this issue?
Are you sure you are using the SSH link for your repository? It looks like: git@gitlab.com:YOUR-USER/YOUR-PROJECT.git
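You can check, and if needed switch, an existing clone (the user and project names here are placeholders):

$ git remote -v
$ git remote set-url origin git@gitlab.com:YOUR-USER/YOUR-PROJECT.git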
I successfully followed these instructions from GitHub on how to generate SSH keys, and my connection with GitHub is successful.
But when I later check my SSH key with ssh-add -l, following these instructions, I don't get the fingerprint I see on my GitHub SSH keys settings page.
Instead of the SSH key fingerprint I get the message The agent has no identities. Why? And what does it mean?
This means you haven't successfully added your key to your agent. Use ssh-add to do so, as given in step 3, part 2 of your first link.
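A minimal sequence, assuming your key lives at the default ~/.ssh/id_rsa path:

$ eval "$(ssh-agent -s)"   # make sure an agent is running in this shell
$ ssh-add ~/.ssh/id_rsa    # add the private key (you'll be asked for its passphrase)
$ ssh-add -l               # should now list the key's fingerprint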
Note that this needs to be done for each ssh-agent instance; thus, if you log out and back in, you need to ssh-add your key again. Similarly, if you start ssh-agent twice, in two different terminal windows, they won't have shared private keys between them, so you would need to ssh-add once in each window (or, better, configure your system in such a way as to have an agent shared between all running applications in your desktop session).
Modern desktop environments will generally provide an SSH keyring for you, so if your agent is configured that way you shouldn't need to start ssh-agent yourself, and the agent instance so provided is shared across your entire session. gnome-keyring behaves this way, as do Apple's Keychain and KDE's Wallet (with ksshaskpass enabled).
I had a working instance of GitLab until a few weeks ago, when we had to move all the user directories to another disk because of resource constraints. I've gone through and fixed all the paths that I could find, and my GitLab instance is now up and running again. Git appears to be working, and I pass the GitLab self-diagnostic test.
However, from a remote client that previously worked, I get prompted for the git user's password, which suggests an ssh problem.
Looking in my .gitolite stuff (conf/gitolite.conf & the keydir), things look in order. My public key is in the keydir, and the rights are assigned in the gitolite.conf correctly.
EDIT: gitolite public keys were in the .ssh/authorized_keys file and the protections were as created by gitolite setup.
What am I missing?
My public key is in the keydir, and the rights are assigned in the gitolite.conf correctly.
This isn't enough.
For ssh to not ask you for a password, you need to check that your ~gitlab/.ssh/authorized_keys is complete (with the gitolite public keys in it, and with the right protections).
Check out the gitolite setup command (for gitolite V3).
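For reference, gitolite v3 writes one forced-command line per key into that file, so a healthy entry looks roughly like this (the path and username are illustrative and will differ on your install):

command="/home/git/gitolite/src/gitolite-shell alice",no-port-forwarding,no-X11-forwarding,no-agent-forwarding,no-pty ssh-rsa AAAA... alice

The file should belong to the hosting user with tight permissions, and gitolite setup will regenerate it from keydir if anything is off:

$ ls -ld ~/.ssh ~/.ssh/authorized_keys   # expect 700 and 600, as the hosting user
$ gitolite setup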
I have my Hudson CI server set up. I have a CVS repo that I can only check out via ssh, but I see no way to convince Hudson to check out via ssh. I tried all sorts of options when supplying my connection string.
Has anyone done this? I gotta think it has been done.
If I remember CVS correctly, you have to set the CVS_RSH environment variable to ssh. I suspect you need to set it so that your Tomcat process inherits the value.
You can check Hudson system information to see exactly what environment variables the JVM is seeing (and passes along to the build.)
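For reference, a sketch of what the build needs (the CVSROOT here is made up; the export has to happen wherever your Tomcat/Hudson process gets its environment, e.g. its init script):

$ export CVS_RSH=ssh
$ cvs -d :ext:builduser@cvs.example.com:/var/cvsroot checkout myproject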
I wrote up an article that tackles this; you can find it here:
http://www.openscope.net/2011/01/03/configure-ssh-authorized-keys-for-cvs-access/
Essentially you want to set up passphraseless ssh keys for your build user. That lets authentication happen without having to work out some way to key in your password.
<edit> i.e. Essentially the standard .ssh key client & server install/exchange.
http://en.wikipedia.org/wiki/Secure_Shell#Key_management
for the jenkins user account:
install user key (public & private part) in ~/.ssh (generate it fresh or use existing user key)
on cvs server:
install user key (public part) in ~/.ssh
add to authorized_keys
back on jenkins user account:
access cvs from command-line as jenkins user and accept remote host key (to known_hosts)
* note any time remote server changes key/ip you will need to manually access cvs and accept key again *
</edit>
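Those steps boil down to something like the following (host and account names are placeholders):

$ # as the jenkins/hudson user on the build machine
$ ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa     # passphraseless key
$ ssh-copy-id builduser@cvs.example.com        # install the public key on the cvs server
$ ssh builduser@cvs.example.com true           # accept the host key into known_hosts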
There's another way to do it, but you have to manually log in from the build machine to your cvs server and keep the ssh session open so hudson/jenkins can piggyback on the connection. That seemed kinda pointless to me, though, since you want your CI server to be as hands-off as possible.