How to set up ssh for no passwords from different accounts - ssh

I would like to ssh from user1@machine1.com to user2@machine2.com without passwords.
I can do this using passwords.
Using ssh-keygen to set up ~user1/.ssh/id_rsa on machine1.com and
~user1/.ssh/authorized_keys on machine2.com, following the instructions in the ssh manual page,
I can, from user1 on machine1.com, "ssh user1@machine2.com" without any passwords.
But I cannot log in without passwords using the same setup procedure with user2 substituted as the destination user, i.e., from user1 on machine1.com, "ssh user2@machine2.com" does not work without passwords.
The instructions I have found for doing this seem to suggest that I need a user2 account
on machine1.com to set up the keys and then copy them to the user1 account on machine1.com and to user2 on machine2.com appropriately.
Is this so? Can the necessary keys be generated on machine1.com using account user1 for logging into user2 on machine2.com?
Update: I have tried this using a third computer, machine3.com, instead of machine2.com, and it works as desired. I found several posts on the web with the same generic problem but without any solutions. Any guesses at what might be the problem?
Thanks in advance

You can copy the same data into ~user2/.ssh/authorized_keys on machine2 that you copied into ~user1/.ssh/authorized_keys, so you would basically reuse the same key for both accounts. It's up to you to decide if this is a good idea.
You can, however, also generate a second key, but then you'd have to explicitly specify on the command line which key file is to be used for authorization from machine1.
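A minimal sketch of that second-key route (the file name id_rsa_user2 is just illustrative):

# on machine1, as user1: generate a dedicated key for the user2 login
ssh-keygen -t rsa -f ~/.ssh/id_rsa_user2
# append ~/.ssh/id_rsa_user2.pub to ~user2/.ssh/authorized_keys on machine2, then:
ssh -i ~/.ssh/id_rsa_user2 user2@machine2.com

An IdentityFile entry for machine2.com in ~/.ssh/config saves passing -i every time.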

Related

How is GitLab/GitHub authentication separated from an ordinary SSH-session?

I read the question "How does the GitHub authentication work?" and https://unix.stackexchange.com/questions/315615/is-ssh-public-key-associated-with-a-user, which is exactly what I am wondering about, but I am still missing a better answer.
When I test my SSH key pair I connect as user git@gitlab.com. My stored public key has a base64 fingerprint. When the SSH client (me) wants to connect to the server (my GitLab/GitHub account server), it sends the key's ID (fingerprint); the server checks its .ssh/authorized_keys and loops through the fingerprints to find the correct public key with which to encrypt the challenge.
On GitHub/GitLab there are several thousand users, and they all use the same username ("git") to initiate a session with the hosted (SaaS) service. So how is this separated on the server? I don't get root access on GitLab/GitHub, of course; I only get access to my account through the generic user session git@gitlab.com. But how is this implemented?
When I use SSH in other situations I have a specific username, e.g. [my-username]@router.com.
For example, if I were to set up my own GitLab on a local NAS/server, how could I create one account (User@local-gitlab.com) whose access rights are limited by the fingerprints of the different users' SSH key pairs?
User: ID:001
User: ID:002
User: ID:003
Somehow I need to limit the access for ID:001 when he/she initiates an ssh session with my server on the account "User".
I can't speak for GitLab, but for GitHub, there is a dedicated service that terminates these connections, contacts the authentication service with the key in question, and then receives the response about whether the user is allowed to access that repo, and if so, contacts the servers storing the data.
GitHub has more than 65 million users, many users have multiple SSH keys, and there are also deploy keys for servers, so using the command directive with an OpenSSH authorized_keys file would be extremely slow, since it would involve parsing and reading probably gigabytes of data each time a connection was made.
If you need this yourself for a small set of users, the command directive in authorized_keys is a viable approach. If you need something more scalable, you can create a custom server with something like libssh and perform authentication yourself, either in that process, or in a separate process.
I found this question and answer: https://security.stackexchange.com/questions/34216/how-to-secure-ssh-such-that-multiple-users-can-log-in-to-one-account, which highlights that you can put restrictions on keys in authorized_keys. I don't know whether that is a precise answer to my question, but it looks like it.
command="/usr/local/bin/restricted-app",from="192.0.2.0/24",no-agent-forwarding,no-port-forwarding,no-x11-forwarding ssh-rsa AAAA… git#gitlab.com
I guess there are several thousand of those lines in .ssh/authorized_keys on GitLab's/GitHub's servers, where every single line grants access to only that one GitLab/GitHub account, something like the sketch below.
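For the self-hosted case, a forced command in each line can map the key to an identity. A rough sketch, where the handle-git wrapper and the ID arguments are invented for illustration (git-shell is the restricted shell that ships with git):

command="/usr/local/bin/handle-git ID:001",no-port-forwarding,no-x11-forwarding ssh-rsa AAAA… user001
command="/usr/local/bin/handle-git ID:002",no-port-forwarding,no-x11-forwarding ssh-rsa AAAA… user002

#!/bin/sh
# /usr/local/bin/handle-git (hypothetical): $1 is the key ID baked into the
# matching authorized_keys line; $SSH_ORIGINAL_COMMAND is what the client asked to run.
# Per-ID access checks would go here before handing the request to git-shell.
case "$1" in
    ID:001|ID:002) exec git-shell -c "$SSH_ORIGINAL_COMMAND" ;;
    *) echo "access denied for key $1" >&2; exit 1 ;;
esac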
Please comment if you don't agree.

Multiple SSH Keys for same user on same host

I want to access a host with a specific user, but I want this user to have multiple SSH keys.
Why? This is the user for deployment on the server and there are multiple developers who have to deploy. I'd like to use a different key for each developer.
(Yes, I could create multiple deployment users, but that's quite costly on this managed server)
Example:
bob@bobs-workstation$ ssh -i ~/.ssh/id_rsa.bob deploy@host.com
alice@alices-workstation$ ssh -i ~/.ssh/id_rsa.alice deploy@host.com
Is this even possible?
In similar questions it's always about different users or different hosts and multiple SSH keys, but in this case it's about the same user and the same host with multiple SSH keys.
Turns out I found no questions about that because it's the most trivial case there is:
Yes, it's possible for a single user to accept multiple public SSH keys.
The text of the public key files all has to be copied into /home/deploy/.ssh/authorized_keys (deploy being the user from the above example).
This is what the content of authorized_keys could look like:
ssh-rsa *bobsunintelligiblepublickeyformultiplelines* bob_at_deploy@host.com
ssh-rsa *alicesunintelligiblepublickeyformultiplelines* alice_at_deploy@host.com
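Rather than editing the file by hand, each developer can install their own key with ssh-copy-id, assuming the deploy account still accepts password logins at that point:

# run once from each developer's workstation
ssh-copy-id -i ~/.ssh/id_rsa.bob.pub deploy@host.com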

Remotely running commands via ssh

I need to automate the copy of a zip file to a remote Linux machine and then the unzipping of that file to a user's home directory.
Let's assume we have user1 and user2, user1 is a real person but has no home directory and user2 is an application user that has a home directory but cannot directly get shell access to a host. The mechanism to gain a shell for user2 is to ssh to the box as user1 and then su to user2. (please do not pass comment on this setup as I work for a large corporation and I am unable to change this aspect, it is decided by IT security and not up for discussion).
I would like to use
scp ziptocopy.zip user1@hostname:/var/tmp/
but as I don't have anywhere on the remote host to store a key file for user1, I cannot use public/private key pairs to perform this. Can anyone suggest a way to do this?
The next piece is even more tricky as I want to ssh as user1 and then su to user2 and run
unzip /var/tmp/ziptocopy.zip
Again, any suggestion on how I can do this? I have done a search and found an example that uses expect; this has potential for the scp, but I cannot get it to work. And how would I get expect to cope with two password prompts?
Thanks
I'd look into using Python and the pexpect/pxssh modules. These days pexpect is a nice alternative for automation tasks that used to be dealt with using Expect.
See this answer for some example code where an SSH session is scripted:
How to make a ssh connection with python?
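For reference, the sequence the script has to drive is just these two commands (names taken from the question); each one raises a password prompt, which is what the pexpect script would answer:

scp ziptocopy.zip user1@hostname:/var/tmp/
# -t forces a tty allocation so that su can prompt for user2's password
ssh -t user1@hostname 'su - user2 -c "unzip /var/tmp/ziptocopy.zip -d ~"'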

Run script as another user without sudo/su privileges

I'm trying to write a script so that it can be called by one user and is executed as another user. I thought that setuid might be able to do this, so I enabled it using chmod u+s with user1 as the owner of the script. I call the script (which only contains whoami right now) as user2 and it still shows user2 instead of user1. How can I make this be user1?
-- My end result is I want one user to be able to call this script and have it ssh into another server and execute a command as another user.
You can copy that user's private key (id_rsa) and pass it to ssh when connecting to the server:
ssh -i user1_id_rsa user1@server
However, this is rather a bad solution, security-wise. Adding the user's key to the authorized keys on the server, as I said in the comment, is the proper way to do it, and you should really look into that.
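A sketch of that proper route, run as the calling user (user and host names from the question):

# generate a key pair for the calling user, if one does not already exist
ssh-keygen -t rsa
# install the public half into user1's authorized_keys on the server
ssh-copy-id user1@server
# this now runs as user1 on the server, with no password prompt
ssh user1@server whoami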
Sounds like you need a third user in your security model, who can run the program, but is otherwise unprivileged. This third user is an assumable identity for a number of users so they can run the process on the remote server.

How to use the ssh command in a shell script?

I know that we should do
ssh user@target
but where do we specify the password?
Hmm thanks for all your replies.
My requirement is that I have to start up some servers on different machines, and all the servers should be started with one shell script. Well, entering the password every time seems a little bad, but I guess I will have to resort to that option. One reason why I don't want to save the public keys is that I may not connect to the same machines every time. It is easy to go back and modify the script to change target addresses, though.
The best way to do this is by generating a private/public key pair and storing your public key on the remote server. This is a secure way to log in without typing a password each time.
This cannot be done with a simple ssh command, for security reasons. If you want to use the password route with ssh, the following link shows some scripts to get around this, if you are insistent:
Scripts to automate password entry
The ssh command will prompt for your password. It is unsafe to specify passwords on the command line, as the full command that is executed is typically world-visible (e.g. via ps aux) and also gets saved in plain text in your command history file. Any well-written program (including ssh) will prompt for the password when necessary, and will disable teletype echoing so that it isn't visible on the terminal.
If you are attempting to execute ssh from cron or from the background, use ssh-agent.
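A typical agent setup looks like this (using the default key path):

# start an agent for this session and load the key once;
# later ssh commands in the same session authenticate through the agent
eval "$(ssh-agent -s)"
ssh-add ~/.ssh/id_rsa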
The way I have done this in the past is just to set up a pair of authentication keys.
That way, you can log in without ever having to specify a password and it works in shell scripts. There is a good tutorial here:
http://linuxproblem.org/art_9.html
SSH keys are the standard/suggested solution. The keys must be set up for the user that the script will run as.
For that script user, see if you have any keys set up in ~/.ssh/ (public key files end with a .pub extension).
If you don't have any keys setup you can run:
ssh-keygen -t rsa
which will generate ~/.ssh/id_rsa and ~/.ssh/id_rsa.pub (the -t option supports other key types as well)
You can then copy the contents of this file to ~(remote-user)/.ssh/authorized_keys on the remote machine.
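ssh-copy-id, which ships with most OpenSSH installations, automates that copy step:

ssh-copy-id remote-user@remote-machine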
As the script user, you can test that it works by:
ssh remote-user#remote-machine
You should be logged in without a password prompt.
Along the same lines, when your script is now run as that user, it can SSH to the remote machine automatically.
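Once the keys are in place, the multi-machine startup script described in the question reduces to a loop; a sketch with placeholder host names and start command:

#!/bin/sh
# start the server process on each target machine in turn
for host in host1.example.com host2.example.com; do
    ssh user@"$host" '/path/to/start-server.sh'
done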
If you really want to use password authentication, you can try expect. See here for an example.