Unable to get public/private key authentication working for Azure Spring Cloud Config Server - spring-cloud-config

I am having difficulty setting up my Config Server in Azure Spring Cloud with a GitHub repo as the backend. I have it working with basic authentication, where I create a token in GitHub; that is fine for my experiments but not suitable for production.
I have set up my public key in Github and tested whether my setup is correct by using the following command, in essence cloning the repo using a specific private key:
GIT_SSH_COMMAND='ssh -i ../azure_id_rsa -o IdentitiesOnly=yes' git clone git@github.com:my-account/azure-config-server.git
Locally this works just fine, which seems to confirm that my setup in GitHub is correct; so you would think that, using the private key, Azure should be able to clone the repo as well.
However, if I then follow the instructions as described here in the official Azure documentation to set up my config server using the GUI, I get the following error:
Failed to update Config Server.
Reason: Fail to update config server due to 'Health check timeout with 10 minutes'.
So I tried the "Import settings" option by uploading a YAML file. I used this Azure template and pasted my private key into the "private-key" section (and yes, the casing for Azure needs to be exactly that: according to the Azure documentation, only hyphenated property names are supported, not camel case), as described here in the Spring documentation.
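For illustration, the relevant part of the YAML I import looks roughly like this (the URI and key shown are placeholders for my real values):
spring:
  cloud:
    config:
      server:
        git:
          uri: git@github.com:my-account/azure-config-server.git
          private-key: |
            -----BEGIN OPENSSH PRIVATE KEY-----
            (key material redacted)
            -----END OPENSSH PRIVATE KEY-----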
But I consistently get the same error, so I assume something is wrong with my setup; however, my options are exhausted. If anyone has any pointers, it would be much appreciated.

OK, I figured it out; I'm posting my answer here in case it helps someone. I noticed one difference between my private key and the one in the Spring example: my private key started with "-----BEGIN OPENSSH PRIVATE KEY-----", whereas the one in the Spring documentation starts with "-----BEGIN RSA PRIVATE KEY-----". In other words, the key is expected to be in PEM format rather than OpenSSH format.
So I got it to work by generating my key as follows (the noteworthy flag is -m pem):
ssh-keygen -t rsa -m pem -b 4096 -C "my@email.com"
When I then set up my public key in GitHub and pasted my private key into the Azure Config Server, it actually worked.
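If your key already exists in OpenSSH format, ssh-keygen should also be able to rewrite it in PEM format in place instead of regenerating it; note this overwrites the key file, so keep a backup (the path is a placeholder):
ssh-keygen -p -m pem -f ~/.ssh/azure_id_rsa
Here -p re-encrypts the private key (you can keep the same passphrase, or none) and -m pem makes ssh-keygen write it back out in PEM format.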
Hope this saves someone some time, as it took me quite a while to figure out.

Related

GitLab: certificate issue, missing SSH public key

I can't wrap my head around how this works and what I need to do.
I have an account with GitLab and successfully generated a private and public key pair to get access to it. I followed all the steps described at https://gitlab.com/help/ssh/README#generating-a-new-ssh-key-pair . Now I have decided to create a new project and synchronize the state between the GitLab project and the one I created locally. Because I have access to the machine I used to create both keys, I simply copied the public key from that machine (located in the ~/.ssh folder) to the current machine I am working on (into its ~/.ssh folder). But it has no effect: I can't even execute the git clone command.
~> git clone git@gitlab.com:[myUser]/[myProject].git
Cloning into 'gate-controller'...
git@gitlab.com: Permission denied (publickey).
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
I tried to figure out the reason and executed
~> ssh -vT git@gitlab.com
but to be honest I can't interpret the response. I don't see any reference in it to my public key file in the ~/.ssh folder.
Could you please help me solve this issue and understand what the problem is?
Many thanks in advance.
You need the private key on any machine you're attempting to pull/push from. When you authenticate with a service that holds your public key (as any git host like GitHub, GitLab, etc. does), it is the private key that proves your identity.
You can read more about SSH (which git uses when you don't use HTTPS auth) and PKI (Public Key Infrastructure) here: https://www.ssh.com/pki/
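For example, assuming the default key name id_rsa, you would copy the private key (not just id_rsa.pub) to the new machine and test it like this (the -i flag is only strictly needed for non-default key paths):
chmod 600 ~/.ssh/id_rsa
ssh -i ~/.ssh/id_rsa -T git@gitlab.com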

Can't connect to SFTP (with private key file) in Copy Data Tool

I am trying to copy data from SFTP to blob storage but got stuck creating the SFTP source.
I have the connection details and can easily connect in FileZilla or WinSCP. However, I am unable to get it to work in Azure Data Factory.
I am not using code but the user interface.
The connection details on the page creating the SFTP source:
Connect via integration runtime: AutoResolveIntegrationRuntime (default)
Host: xyz
Port: 22 (can't remove it as it doesn't like it)
SSH Host Key Validation: Enable SSH Host Key Validation
SSH Host Key Fingerprint: taken from WinSCP (Session > Server/protocol information)
Authentication type: SSH Public Key Authentication (can't use Basic, as the private key holds the security info)
User name:XXX
Private Key Type: Use Key Content
Private key content: loaded the .ppk file; I also tried loading the .pem file and got a different error
Pass Phrase: none
When I set up this SFTP connection in WinSCP or FileZilla, the tool automatically converted the provided .pem file into a .ppk.
When I loaded the .ppk file into ADF I got an error: Invalid Sftp credential provided for 'SshPublicKey' authentication type.
When I loaded the .pem file I got: Meet network issue when connect to Sftp server 'spiderftp.firstgroup.com', SocketErrorCode: 'TimedOut'.
I have also tried 'Disable SSH Host Key Validation' for SSH Host Key Validation, and it made no difference.
I have also opened the .ppk file in PuTTYgen and used the host key fingerprint from there, still no luck.
Only getting these 2 errors depending on which file I load.
Can't find anything about this online so would be grateful for some advice.
Have you read this note in this doc?
https://learn.microsoft.com/en-us/azure/data-factory/connector-sftp#using-ssh-public-key-authentication
SFTP connector supports RSA/DSA OpenSSH key. Make sure your key file content starts with "-----BEGIN [RSA/DSA] PRIVATE KEY-----". If the private key file is a ppk-format file, please use Putty tool to convert from .ppk to OpenSSH format.
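If you have the command-line puttygen (on Linux it ships in the putty-tools package), the conversion that note asks for can be done roughly like this (file names are placeholders); the PuTTYgen GUI offers the same thing via Conversions > Export OpenSSH key:
puttygen mykey.ppk -O private-openssh -o mykey_openssh.pem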
Got this working today. Like you, I could connect using WinSCP but failed when using ADF.
The link Fang Liu shared contains our answers, but my issue was not the private key. I suspect Fang's suggestion resolved your problem, and I'm sharing my answer here to help others who run into something similar.
My issue:
When using private key authentication in ADF, the password field becomes a pass phrase, and you no longer have the ability to supply a password. To overcome the problem, we disabled password authentication for the user, and the SFTP connection started working.
As stated in the documentation, the pass phrase is used to decrypt the private key if it is encrypted.
Also worth noting:
If you store the contents of the private key in Key Vault, you need to base64-encode the entire contents of the exported key and use that string. This includes the "-----BEGIN RSA PRIVATE KEY-----" header and the end marker. The same applies if you want to paste the value into the textbox of the SFTP linked service edit screen.
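On Linux, for instance, the whole file can be encoded in one go with GNU base64, where -w 0 keeps the output on a single line (the file name is a placeholder; on macOS, plain base64 already leaves the output unwrapped):
base64 -w 0 my_private_key.pem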
I did not try manually editing the JSON of the linked service to explicitly provide a password, but this could be a workaround for someone to test if they are unable to disable the password.
I used PuTTYgen to export the .ppk to an OpenSSH private key and had the same fingerprint issue too, so I just disabled host key validation. Funnily enough, you can take the fingerprint quoted in the error message, and it passes validation, so I'm not sure where the bug lies. :-)

SFTP - From WinSCP to Terminal Access

I have been able to set up SSH access to my Google Cloud Platform VM via SFTP using WinSCP, but I now wish to do the same using another VM.
I have tried the ssh-keygen -t rsa, ssh-copy-id demo@198.51.100.0 method but always come up against the "Permission denied (publickey)" error, which from my research seems to be a pretty widespread issue with few reliable fixes (none of the ones I tried worked).
I used PuTTYgen to create the public and private key, and inserted the public key onto the server through the GCP settings, adding it under the SSH settings for my instance.
I am just confused about what to do with the private key when simply trying to SFTP through the terminal on a separate VM; before, I would load the private key into the WinSCP settings. Is there a folder I need to place it in?
Regarding your first issue, the "Permission denied (publickey)" error, please follow the troubleshooting in this link and this.
About your other question of what to do with the private key when simply trying to SFTP through the terminal: that depends on the specific third-party SFTP tool you are using. To find the location of your SSH keys after generating them, please review this document.
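With the stock OpenSSH sftp client, for example, you point it at the private key with -i; a sketch, where the key path and address are placeholders (a PuTTYgen .ppk would first need to be exported to OpenSSH format):
sftp -i ~/.ssh/my_gcp_key demo@198.51.100.0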
Once you have added the public key to the VM, you may need to reboot the VM for the public key to take effect. Try rebooting it and connecting again.

Generate key files to connect to Bitbucket in Vagrant boxes

We use Vagrant boxes for development. For every project or small snippet we simply start a new box and provision it with Ansible. This works fantastically; however, we do get into trouble when connecting to a private Bitbucket repository during a bower install run.
The solution we have now is to generate a new key (ssh-keygen), accept all defaults (pressing <return>, <return>, <return>) and then grab the public key (cat ~/.ssh/id_rsa.pub). Copy it, go to Bitbucket, view your account and add this new ssh key. And repeat for every new box you instantiate.
We have to do this because of some closed-source packages (hosted on Bitbucket) that we install via Bower. We have had a much better experience elsewhere: composer (PHP's package manager) with private GitHub repositories. With that setup, you enter your username/password/2FA token on the command line and an OAuth token is generated for you. This works great.
So, is there a way we can mitigate this bower/bitbucket/ssh issue? For obvious reasons I don't want to provision the boxes with a standard private key, but there has to be another solution?
While I'm not sure that my situation is as complex as yours (I'm not using Ansible or Bower), I solved this problem by using Vagrant's SSH agent forwarding. This blog post gives you the details on how to get it working:
Cloning from GitHub in Vagrant using SSH agent forwarding
So as long as each of the developers has access on their local machines to the bitbucket repos, it should work.
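In practice the pieces look roughly like this (a sketch; the key path is a placeholder, and the forward_agent setting is Ruby in the Vagrantfile, shown here as a comment):
# In the Vagrantfile: config.ssh.forward_agent = true
# On the host, load your key into the agent first:
ssh-add ~/.ssh/id_rsa
# Then, inside the box (vagrant ssh), check that Bitbucket sees the forwarded key:
ssh -T git@bitbucket.org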

(EC2) Launch Windows instance programmatically via command line

I'd like to launch a Windows 2008 (64bits, base install) instance programmatically, kinda like clicking on the Launch Instance link & following the "Create a New Instance" wizard.
I read about the ec2-run-instances command and tried running it in PuTTY with this syntax:
/opt/aws/bin/ec2-run-instances ami-e5784391 -n 1 --availability-zone eu-west-1a --region eu-west-1 --instance-type m1.small --private-key /full/path/MyPrivateKey.pem --group MyRDP
but it always complains:
Required option '-C, --cert CERT' missing (-h for usage)
According to the documentation, this option isn't required!!
Can someone tell me what's wrong anyway? I'm just trying to programmatically launch a fresh Windows install, run some tests in the cloud and shut it down after that.
The error message is correct (just try adding --cert ;) - to what documentation are you referring here?
The requirement is clearly outlined in the Microsoft Windows Guide for Amazon EC2, specifically in Task 4: Set the EC2_PRIVATE_KEY and EC2_CERT Environment Variables:
The command line tools need access to an X.509 certificate and a corresponding private key that are associated with your account. [...] You can either specify your credentials with the --private-key and --cert parameters every time you issue a command or you can create environment variables that point to the credential files on your local system. If the environment variables are properly configured, you can omit the parameters when you issue a command.
[emphasis mine]
Maybe the option of using environment variables has been misleading somewhere?
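So a working invocation would look roughly like one of these sketches (the paths, including the MyCert.pem file name, are placeholders):
# Option 1: pass both credentials explicitly
/opt/aws/bin/ec2-run-instances ami-e5784391 -n 1 --region eu-west-1 --availability-zone eu-west-1a --instance-type m1.small --group MyRDP --private-key /full/path/MyPrivateKey.pem --cert /full/path/MyCert.pem
# Option 2: export both variables once, then omit the parameters
export EC2_PRIVATE_KEY=/full/path/MyPrivateKey.pem
export EC2_CERT=/full/path/MyCert.pem
/opt/aws/bin/ec2-run-instances ami-e5784391 -n 1 --region eu-west-1 --availability-zone eu-west-1a --instance-type m1.small --group MyRDP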
Alternative
Please note that you can ease and speed up working with EC2 considerably by using alternate scripting environments covering the same ground, in particular the excellent boto, which is a Python package that provides interfaces to Amazon Web Services.
Boto uses the nowadays more common authentication scheme based on access keys only rather than X.509 certificates (e.g. an AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY pair), which furthermore can (and should) be managed via AWS Identity and Access Management (IAM) to avoid the risk of exposing your main AWS account credentials in the first place. See my answer to How to download an EC2 X.509 certificate with an IAM User account? for more details on this.
Good luck!