X509Store - How to list certificates located on a USB key

In a web application (aspx/C#) that will sign documents, how can I list the certificates located on the user's USB key (authentication/signing key)?
Here is my code:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Security.Cryptography.X509Certificates;
using System.Web;
using System.Web.UI;
using System.Web.UI.WebControls;

namespace Signature1
{
    public partial class signature : System.Web.UI.Page
    {
        string strTxt = "Certificates : ";

        protected void Page_Load(object sender, EventArgs e)
        {
            X509Store store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
            store.Open(OpenFlags.ReadOnly);
            foreach (X509Certificate2 cert in store.Certificates)
            {
                strTxt += "\nDelivered to " + cert.SubjectName.Name + " by " + cert.IssuerName.Name;
            }
            store.Close();
            myTextbox.Text = strTxt;
        }
    }
}
This code works fine on the local machine (debug mode) but returns an empty list when published on an application server.
Thank you for your help.

If you need to read certificates from the device located on the server:
On an application server your code most likely runs under a service account. In that case the CurrentUser store (which you are referencing with the StoreLocation.CurrentUser parameter) will be empty.
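To confirm this, a quick diagnostic (not part of the original answer) is to log which Windows account the page actually executes as on the server:
// Diagnostic sketch: shows the account the ASP.NET worker process runs under.
string runningAs = System.Security.Principal.WindowsIdentity.GetCurrent().Name;
strTxt += "\nRunning as: " + runningAs;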
Now it depends on your hardware's driver how it maps the certificates from the USB token. If it can map certificates to the LocalMachine store, then you can modify your code to enumerate certificates from LocalMachine. If the driver only maps them to the current user, then most likely you will need to run your code under that user account. It's possible to impersonate a user in Windows (see this SO question), so you can switch to a specific user account for just one thread.
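As an illustration, a minimal sketch of the LocalMachine variant (assuming the token driver maps its certificates into that store) only changes the store location:
// Sketch, assuming the USB token driver maps its certificates into LocalMachine\My.
X509Store store = new X509Store(StoreName.My, StoreLocation.LocalMachine);
store.Open(OpenFlags.ReadOnly | OpenFlags.OpenExistingOnly);
try
{
    foreach (X509Certificate2 cert in store.Certificates)
    {
        strTxt += "\nDelivered to " + cert.SubjectName.Name + " by " + cert.IssuerName.Name;
    }
}
finally
{
    store.Close();
}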
One more alternative is to access the device via the PKCS#11 interface (if the corresponding driver DLL is provided by the hardware vendor and if you have the rights to put it on the server system). In this case you log in to the hardware in code and it doesn't care about the user. The PKCS#11 interface is very different from X509Store, though, and requires third-party libraries (such as our SecureBlackbox) to work with. But this can turn out to be the only option in some cases.
If you need to read certificates from the device located on the remote client:
The only option is to have a client-side module (most often a Java applet) which will have access to the device. Java applets can work with PKCS#11 and with the Windows certificate store on Windows.
The downside is that Java applets don't work on mobile platforms, where your only option would be a client application of some kind (so far this problem has no good general solution).

Related

How to add public key identity from String?

I have a Spring Boot application with Apache SSHD. The application should use SSH Public Key Authentication. Therefore, the application needs a private key. How to provide this private key?
For security reasons, the private key should not be saved in the
source code (in Git)
classpath (in JAR)
image (in Docker Registry)
host/volume (with Docker Mount)
Instead, the private key should be provided as an environment variable (with GitLab).
Documentation
The documentation only gives an example for private keys saved in the filesystem, see Loading key files:
Loading key files
In order to use password-less authentication the user needs to provide one or more KeyPair-s that are used to "prove" the client's identity for the server. The code supports most if not all of the currently used key file formats. See SshKeyDumpMain class for example of how to load files - basically:
KeyPairResourceLoader loader = SecurityUtils.getKeyPairResourceParser();
Collection<KeyPair> keys = loader.loadKeyPairs(null, filePath, passwordProvider);
Research
I could create the KeyPair as described in create java PrivateKey and PublicKey from a String of file, but then I would reimplement an existing part of Apache SSHD. I have to support all of the currently used key file formats.
Question
How to load private key from String instead of filesystem?
I found a way to use a String instead of a file, see KeyPairResourceLoader#loadKeyPairs:
default Collection<KeyPair> loadKeyPairs(SessionContext session,
                                         NamedResource resourceKey,
                                         FilePasswordProvider passwordProvider,
                                         String data)
        throws IOException, GeneralSecurityException
My changed code:
KeyPairResourceLoader loader = SecurityUtils.getKeyPairResourceParser();
Collection<KeyPair> keyPairCollection = loader.loadKeyPairs(null, null, null, pem);

How are people authenticated in their ASP.NET Core Web APIs on Ubuntu/Docker given the bug described below?

I have come across what I think is a bug preventing me from loading an X509Certificate2 on Ubuntu or the Debian-based docker image provided by Microsoft. This means that I can't initialise JwtAuthentication in my web API on these platforms, and I'm looking for help:
Are you successfully using JwtAuthentication on Linux?
If so, how are you initialising an X509Certificate for the JwtBearerOptions?
Can you see a problem with what I'm doing, or suggest a work-around or solution?
I have logged the issue with the corefx team and you can see the full discussion here, but below is the main description of the problem:
I have a Web API running in a docker container. The container is built from the provided 1.1.0 package:
FROM microsoft/aspnetcore:1.1.0
and the Web API binaries are copied in. The API runs fine and returns data as expected until I turn on authentication, at which point it needs an X509SecurityKey to set the TokenValidationParameters.IssuerSigningKey value. It throws an exception when it attempts to initialise an X509Certificate2 from a string value:
string certValue = certificate.Value;
byte[] byteCert = Encoding.ASCII.GetBytes(certValue);
return new X509Certificate2(byteCert);
throws an OpenSslCryptographicException:
Unhandled Exception: System.Exception: Failed to extract the Token Signing certificate from the Federation metadata. --->
Interop+Crypto+OpenSslCryptographicException: error:0D07803A:asn1 encoding routines:ASN1_ITEM_EX_D2I:nested asn1 error
at Internal.Cryptography.Pal.CertificatePal.FromBlob(Byte[] rawData, String password, X509KeyStorageFlags keyStorageFlags)
at System.Security.Cryptography.X509Certificates.X509Certificate..ctor(Byte[] data)
at Mercury.Shared.Rest.Authentication.AdfsFederationMetadata.GetSigningCertificate()
The string value from which the X509Certificate2 is being initialised is:
MIIC4jCCAcqgAwIBAgIQHWt3kGySgJxPtsalC0EoKzANBgkqhkiG9w0BAQsFADAtMSswKQYDVQQDEyJBREZTIFNpZ25pbmcgLSBzdHMuYWxsYW5ncmF5LmNvLnphMB4XDTE2MDkwNzA5MDQyM1oXDTE3MDkwNzA5MDQyM1owLTErMCkGA1UEAxMiQURGUyBTaWduaW5nIC0gc3RzLmFsbGFuZ3JheS5jby56YTCCASIwDQYJKoZIhvcNAQEBBQADggEPADCCAQoCggEBANdq9BEuBRPsTdpngeFyXbfH5lBg5WyENQW0qz2FtDw3AvZhiPdFyvTPZIeZDc4vhg+gPuG8pCxhFa6hPqNIwnLSVuyhUi4/CtZrLghF2wVVcyriijvirzdVp2m56nO31NB5HXbSerTmey1gJsgumr+MiaM2CEI9z5ctwAp66jqM9jVv7kzqIwB33irSck+X97jUa9XVa0/0QPBdrSVUR0i4rmfZ9orRdTKC3IA13bD9duk2Kc9V7t8t/woo80Kbbb3ZseYk5N8AI+7RRw9+oSAm8zZQzBYkNkAMeI1mto1faXsm9Aea4HXbyCbvVOx/JGj5Ki7YK/BtzWAyCgRu0TkCAwEAATANBgkqhkiG9w0BAQsFAAOCAQEAd9rdJ1V9te8njMHiEuvr1DLVUBt2YdpZ8qGFC2rWDKeEW8iARrMfbtrlQovvE1radxoZ2gWR1AaaoEqzmLvF9fEAZ7xjR2P2UlT+ZgntfMn/c+3j7gWnfGNHIjosaYiyF72q4k6bgOx1TV8X08kD2AMBH27utXeeUQTZTd0bUWaWpr76NrDB95k4P6d0t5bkYsguiQjV+2t5/dSvrbTPVbXQmWGyC53IS2OI37AI2bIaeCCDkRHDoxu+L/DtgH8N60k2CLfa+pf0/cxQCR39p4Z+tquVKfYgJIsdZLD6bbrqK9VdpSR2vyUcDLMTGnO0tuDuzBd/xdhJ0GKbnBv3+g==
The same code runs with no problem on Windows, building a certificate from the same string.
Edit: Note that while I initially encountered this problem running a docker image, subsequent testing has shown that it also occurs using Ubuntu 14.04 + .NET Core 1.1
The problem here is that the constructor is being passed the bytes of the Base64 string representation of the certificate, and not the decoded bytes of the certificate itself.
If this code works on Windows then maybe it's a good idea to create an issue in the .NET Core GitHub repository referencing this problem.
Thanks for the answer. For those who would like to copy and paste:
var certificateWithoutHeaderAndFooter = certificateString
    .Replace("\\n", "")  // strip escaped newline sequences, e.g. when the PEM comes from a JSON/env-var string
    .Replace("-----BEGIN CERTIFICATE-----", "")
    .Replace("-----END CERTIFICATE-----", "");

var certificateBytes = Convert.FromBase64String(certificateWithoutHeaderAndFooter);
var certificate = new X509Certificate2(certificateBytes);
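Not part of the original answer, but for completeness: once the X509Certificate2 is built this way, it can back the JWT signing key roughly as sketched below (X509SecurityKey and TokenValidationParameters are from Microsoft.IdentityModel.Tokens):
// Sketch: use the decoded certificate as the issuer signing key for JWT bearer validation.
var signingKey = new X509SecurityKey(certificate);
var tokenValidationParameters = new TokenValidationParameters
{
    ValidateIssuerSigningKey = true,
    IssuerSigningKey = signingKey
};
// tokenValidationParameters would then be assigned to JwtBearerOptions.TokenValidationParameters.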

"Authentication not supported": jgit error when trying to clone tfs hosted git repo

When I try to clone a tfs hosted git repo http://tfstta.com:8080/tfs/DefaultCollection/_git/SampleTFSGit from my linux machine, I face the Authentication not supported error:
org.eclipse.jgit.api.errors.TransportException: http://:@tfstta.int.thomson.com:8080/tfs/DefaultCollection/_git/SampleTFSGit.git: authentication not supported
Enabling basic authentication/alternate credentials does not seem to be an option.
Could someone please tell me a work around for this? I would be very grateful!
Go to the Eclipse menu: Window -> Preferences -> Team -> Git. In the right-side panel update the timeout to 3000 (Connection timeout (seconds): 3000), click the Apply and Close button, then clone again; it will solve your problem.
This issue happens because JGit doesn't fully support NTLM, and instead of falling back to Basic auth or something else, it will stop right there.
Usually TFS answers failed authentication with multiple WWW-Authenticate headers. What happens here is that there is a bug in JGit's org.eclipse.jgit.transport.http.apache.HttpClientConnection, which takes into consideration only the last of the WWW-Authenticate headers, making it give up before even trying other authentication types.
What I suggest is to use your own implementation of org.eclipse.jgit.transport.http.HttpConnection, overriding getHeaderFields like this:
@Override
public Map<String, List<String>> getHeaderFields() {
    Map<String, List<String>> ret = new HashMap<>();
    for (Header hdr : resp.getAllHeaders()) {
        List<String> list;
        if (ret.containsKey(hdr.getName())) {
            list = ret.get(hdr.getName());
        } else {
            list = new LinkedList<>();
            ret.put(hdr.getName(), list);
        }
        for (HeaderElement hdrElem : hdr.getElements())
            list.add(hdrElem.toString());
    }
    return ret;
}
Or if you are lazy (like me), you can just switch to org.eclipse.jgit.transport.http.JDKHttpConnection and be happy, because it uses the native Java connection underneath, which works correctly.
If you are trying to use Spring Cloud Config Server with a TFS Git repository, my choice is just to implement your own ConfigurableHttpConnectionFactory:
/**
 * This will use native Java connections, instead of the crappy Eclipse implementation.
 * There will be no management of the implementation though. I cannot assure
 */
public class SpringJDKConnectionFactory extends JDKHttpConnectionFactory implements ConfigurableHttpConnectionFactory {
    @Override
    public void addConfiguration(MultipleJGitEnvironmentProperties environmentProperties) {
    }
}
And have a configuration that loads it over Spring's default:
@Configuration
public class JGitConnectionFactoryConfiguration {
    @Bean
    @Primary
    public ConfigurableHttpConnectionFactory configurableHttpConnectionFactory() {
        return new SpringJDKConnectionFactory();
    }
}
But beware, TFS will probably not like Basic auth with direct passwords. So create a "Personal Access Token" in TFS, and use that as a password instead.
Simple sample code:
public static void main(String[] args) throws GitAPIException, IOException {
    String url = "http://tfs-url.com/Git-Repo";
    File file = new File("build/git_test");
    if (file.exists())
        FileUtils.delete(file, FileUtils.RECURSIVE);

    CloneCommand cmd = new CloneCommand();
    cmd.setDirectory(file);
    cmd.setURI(url);
    // use a Personal Access Token, since Basic auth only accepts these
    cmd.setCredentialsProvider(new UsernamePasswordCredentialsProvider("UserAccount", "personalaccesstoken"));

    ConfigurableHttpConnectionFactory cf = new SpringJDKConnectionFactory();
    HttpTransport.setConnectionFactory(cf);

    Git git = cmd.call();
}
You might want to try https://www.visualstudio.com/en-us/products/team-explorer-everywhere-vs.aspx since it is Microsoft's cross-platform TFS command-line. The code is posted on GitHub if you want to try and patch the authentication helpers back to jGit.
I would recommend upgrading your TFS server to the latest Update 3 and then using SSH authentication for the Git repository.
SSH Support for Git Repos
With TFS 2015 Update 3, you can now connect to any Team Foundation Server Git repo using an SSH key. This is very helpful if you develop on Linux or Mac. Just upload your personal SSH key and you're ready to go.
I faced this issue with a new PC (configured by someone else). I fixed the error by reinstalling the JDK and running Eclipse with it.
I used a bad approach, but for initial work it's fine for me.
In my case, I switched the project visibility on GitLab from private to public: go to GitLab -> <your project> -> Settings -> General -> Visibility, project features, permissions -> switch to Public.
In application.properties I added only spring.cloud.config.server.git.uri without authentication properties, and also added .git at the end of the GitLab URI:
http://gitlab.com/<your-repo-name>.git
I don't recommend this approach for people working on tasks for a company.
When you use the command below, you'll be prompted to enter the username and password.
git clone http://:@tfstta.int.thomson.com:8080/tfs/DefaultCollection/_git/SampleTFSGit
In my test, when sending the command, you'll be prompted with a Windows Security dialog. It's not necessary to use basic authentication/alternate credentials; simply type domain\username and the password to connect to TFS.

How do I get FiddlerCore programmatic Certificate Installation to 'stick'?

I'm using FiddlerCore to capture HTTP requests. Everything is working including SSL Captures as long as the Fiddler certificate is manually installed. I've been using manual installation through Fiddler's Options menu and that works fine.
However, if I use the FiddlerCore provided CertMaker class static methods to add the Fiddler certificate I find that I can use the certificate added to the cert root only in the current session. As soon as I shut down the application and start back up, CertMaker.rootCertExists() returns false.
I use the following code to install the certificate for the current user (from an explicit menu option at this point):
public static bool InstallCertificate()
{
    if (!CertMaker.rootCertExists())
    {
        if (!CertMaker.createRootCert())
            return false;
        if (!CertMaker.trustRootCert())
            return false;
    }
    return true;
}
The cert gets installed and I see it in the root cert store for the current user. If I capture SSL requests in the currently running application it works fine.
However, if I shut down the running exe, restart it and call CertMaker.rootCertExists(), it returns false, and if I try to capture SSL requests the SSL connection fails in the browser. If I recreate the cert and then re-run the requests in the browser while the app stays running, it works again. I now end up with two certificates in the root store.
After exiting and relaunching, CertMaker.rootCertExists() again returns false. The only way to get it to work is to register the cert once per exe session.
What am I doing wrong to cause the installation to not stick between execution of the same application?
I was able to solve this problem and create persistent certificates that are usable across EXE sessions, by removing the default CertMaker.dll and BcMakeCert.dll assemblies that FiddlerCore installs and using and distributing the makecert.exe executable instead.
makecert.exe appears to create certificates in such a way that they are usable across multiple runs of an application, whereas certificates created with the included assemblies are valid only for the current application's running session.
Update:
If you want to use the CertMaker.dll and BcMakeCert.dll that FiddlerCore installs by default, you have to effectively cache and set the certificate and private key using Fiddler's internal preferences object. There are a couple of keys that hold the certificate after it's been created; you need to capture these values and write them into some sort of configuration storage.
In the following example I have a static configuration object that holds the certificate and key (persisted to a config file when the app shuts down):
public static bool InstallCertificate()
{
    if (!CertMaker.rootCertExists())
    {
        if (!CertMaker.createRootCert())
            return false;
        if (!CertMaker.trustRootCert())
            return false;

        // persist Fiddler's certificate into app-specific config
        App.Configuration.UrlCapture.Cert =
            FiddlerApplication.Prefs.GetStringPref("fiddler.certmaker.bc.cert", null);
        App.Configuration.UrlCapture.Key =
            FiddlerApplication.Prefs.GetStringPref("fiddler.certmaker.bc.key", null);
    }
    return true;
}
public static bool UninstallCertificate()
{
    if (CertMaker.rootCertExists())
    {
        if (!CertMaker.removeFiddlerGeneratedCerts(true))
            return false;
    }

    // clear the cached certificate and key from the app-specific config
    App.Configuration.UrlCapture.Cert = null;
    App.Configuration.UrlCapture.Key = null;
    return true;
}
After installing a certificate, this code captures the certificate and private key into the configuration object, which persists the values later. For uninstallation, the values are cleared.
At the beginning of the application, or at the beginning of the capture process, the keys are set from the configuration values prior to calling CertMaker.rootCertExists(). I do this at the beginning of my capture form:
public FiddlerCapture()
{
    InitializeComponent();

    // read the previously saved Fiddler certificate from app-specific config
    if (!string.IsNullOrEmpty(App.Configuration.UrlCapture.Cert))
    {
        FiddlerApplication.Prefs.SetStringPref("fiddler.certmaker.bc.key",
            App.Configuration.UrlCapture.Key);
        FiddlerApplication.Prefs.SetStringPref("fiddler.certmaker.bc.cert",
            App.Configuration.UrlCapture.Cert);
    }
}
Using this mechanism for saving and then setting the capture settings makes the certificates persist across multiple EXE sessions when using CertMaker.dll.
More detailed info is available in this detailed blog post on FiddlerCore.
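The App.Configuration object above is application-specific and not shown; a minimal sketch of the idea (hypothetical file names, no encryption) would simply write the two preference strings out on shutdown and read them back on startup:
// Hypothetical persistence of the two Fiddler preference strings between runs (requires using System.IO).
// NOTE: a real application should protect the private key (e.g. with DPAPI) instead of plain-text storage.
const string CertFile = "fiddler.cert.txt";
const string KeyFile = "fiddler.key.txt";

static void SaveCertPrefs()
{
    File.WriteAllText(CertFile, FiddlerApplication.Prefs.GetStringPref("fiddler.certmaker.bc.cert", ""));
    File.WriteAllText(KeyFile, FiddlerApplication.Prefs.GetStringPref("fiddler.certmaker.bc.key", ""));
}

static void RestoreCertPrefs()
{
    if (File.Exists(CertFile) && File.Exists(KeyFile))
    {
        FiddlerApplication.Prefs.SetStringPref("fiddler.certmaker.bc.cert", File.ReadAllText(CertFile));
        FiddlerApplication.Prefs.SetStringPref("fiddler.certmaker.bc.key", File.ReadAllText(KeyFile));
    }
}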
If anyone is still interested, I found an easier solution based on the demo that Fiddler provides. This demo simply calls CertMaker.trustRootCert(), and strangely enough, it sticks! The first time it will ask whether you want to install the certificate, but after that, the function just returns true and will not cause the pop-up to show.
Unlike your original program and mine, the certificate sticks without having to go to the trouble of making it stick yourself, so I analysed the differences with the demo. One of the differences I noticed was that the demo didn't have a reference to CertMaker.dll and BCMakeCert.dll. After removing these references from my own solution, I got the same behaviour as the demo.
Unfortunately, I don't have an explanation to why this works, but I hope this still helps some people.

Exchange Web Services authentication problem against Office 365

I'm in the process of developing my first Orchard CMS module, which will interface with Exchange Server for the purpose of adding Exchange Task functionality to Orchard (basically providing web management of personal Tasks). Unfortunately, I don't think Office 365 supports the type of authentication required. This Microsoft document outlines some instructions on setting up a service account with impersonation rights, in order to use Exchange Web Services.
Unfortunately, I need to be able to run the "New-ManagementRoleAssignment" cmdlet, in order to assign the impersonation rights. The error I'm receiving when attempting this cmdlet is:
The term 'New-ManagementRoleAssignment' is not recognized as the name of a cmdlet, function, script file, or operable program.
I'm definitely connected properly, as instructed in that previous URL. Everything I'm reading suggests that this command should be available. Am I missing something? I'm using the Enterprise version of Office 365, in case that matters. The account that I'm using to log in with PowerShell is my global admin account.
Any help and/or insight would be very much appreciated! I have a support in with Microsoft as well, so I'll post anything I get back from them.
Vito
[EDIT]
I've decided to add some code, for those who have an Exchange Server and are interested in trying this out. You'll have to download the Exchange Web Services dll, in order to make use of the namespace Microsoft.Exchange.WebServices.
using Microsoft.Exchange.WebServices.Data;
using Microsoft.Exchange.WebServices.Autodiscover;

private static ExchangeService _service;

private static void ConnectToExchangeService()
{
    _service = new ExchangeService(ExchangeVersion.Exchange2010_SP1);
    _service.TraceEnabled = true;
    _service.Credentials = new System.Net.NetworkCredential("me@domain.com", "password");

    AutodiscoverService ads = new AutodiscoverService();
    ads.EnableScpLookup = false;
    ads.RedirectionUrlValidationCallback = delegate { return true; };

    GetUserSettingsResponse grResp = ads.GetUserSettings("me@domain.com", UserSettingName.ExternalEwsUrl);
    Uri casURI = new Uri(grResp.Settings[UserSettingName.ExternalEwsUrl].ToString());
    _service.Url = casURI;

    ControllerContext ctx = new ControllerContext();
    ctx.HttpContext.Response.Write("Server Info: " + _service.ServerInfo.VersionString);
    ctx.HttpContext.Response.Flush();
}
AFAIK, the cmdlet New-ManagementRoleAssignment is not available for the Small Business Plan (P1) on Office 365. However, the administrator is assigned impersonation rights by default so you have to connect with the administrator credentials.
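For what it's worth, once impersonation rights are in place (or when connecting with the admin account as described), EWS impersonation is applied to the service object roughly as sketched below; the mailbox address is a placeholder:
// Sketch: impersonate a specific mailbox once ApplicationImpersonation rights are granted.
_service.ImpersonatedUserId =
    new ImpersonatedUserId(ConnectingIdType.SmtpAddress, "user@domain.com");
// Subsequent EWS calls on _service then act against that mailbox (e.g. reading its Tasks folder).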