Send certificate in http request header - c++ - authentication

I have a certificate that I need to send in the header of an http request. This is how I acquired the cert:
PCCERT_CONTEXT cert = nullptr;
wstring store = // store name
wstring subjectName = // subject name

HCERTSTORE hStoreHandle = CertOpenStore(
    CERT_STORE_PROV_SYSTEM,
    0,
    NULL,
    CERT_SYSTEM_STORE_CURRENT_USER,
    store.c_str());

cert = CertFindCertificateInStore(
    hStoreHandle,
    X509_ASN_ENCODING,
    0,
    CERT_FIND_SUBJECT_STR,
    subjectName.c_str(),
    NULL);
I need to send it as a custom header, because the load balancer that sits in front of my service strips off the certificate header ["X-ARR-CLIENTCERT"] before forwarding the request. I believe I need to send cert->pbCertEncoded, but on the server I can't decode it and convert it back to an X509Certificate2.
This is what I tried on the client:
request.headers().add("client-cert", cert->pbCertEncoded);
On the server:
var headerCert = Request.Headers["client-cert"];
byte[] certdata = Convert.FromBase64String(headerCert);
X509Certificate2 cert = new X509Certificate2(certdata);
The request header arrives non-null on the server, but it cannot be parsed back into an X509Certificate2.
I tried another approach on the client. After getting the cert, I converted it to a string:
DWORD size = 0;
CryptBinaryToString(cert->pbCertEncoded, cert->cbCertEncoded, CRYPT_STRING_BASE64, NULL, &size);
LPWSTR outstring = new TCHAR[size];
CryptBinaryToString(cert->pbCertEncoded, cert->cbCertEncoded, CRYPT_STRING_BASE64, outstring, &size);
If I try to send outstring in the header, it complains:
WinHttpAddRequestHeaders: 87: The parameter is incorrect.
But when I take the contents of outstring and try to parse it on the server, it decodes back to the right certificate. This tells me that I'm not doing something right when passing cert->pbCertEncoded in the header. Maybe I need to re-encode it or transform it somehow so the server can correctly parse it? I'd appreciate any help. Thanks!
My client is in C++ and my server is in .NET. I'm using cpprestsdk to send the certificate in the HTTP request.

pbCertEncoded is the ASN.1/DER-encoded representation of the certificate (see the CERT_CONTEXT documentation).
So you must Base64-encode the bytes, for instance like this:
#include <Wincrypt.h>
#pragma comment (lib, "Crypt32.lib")

int ToBase64Crypto(const BYTE* pSrc, int nLenSrc, char* pDst, int nLenDst)
{
    DWORD nLenOut = nLenDst;
    // CRYPT_STRING_NOCRLF matters here: the default Base64 output is broken
    // into CRLF-terminated lines, and an embedded CRLF in a header value is
    // exactly what makes WinHttpAddRequestHeaders fail with error 87.
    // The A variant is used explicitly because pDst is a char buffer.
    BOOL fRet = CryptBinaryToStringA(
        pSrc,
        nLenSrc,
        CRYPT_STRING_BASE64 | CRYPT_STRING_NOCRLF,
        pDst,
        &nLenOut);
    if (!fRet) {
        nLenOut = 0; // failed
    }
    return nLenOut;
}
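For completeness, here is how the helper might be wired up on the client. This is only a sketch based on the question's code: it assumes the cpprestsdk http_request object named request and the "client-cert" header name from the question, and sizes the buffer with the usual two-call pattern.

// Sketch: Base64-encode the DER bytes and send them as a request header.
// First call with a NULL output buffer to learn the required size.
DWORD size = 0;
CryptBinaryToStringA(cert->pbCertEncoded, cert->cbCertEncoded,
                     CRYPT_STRING_BASE64 | CRYPT_STRING_NOCRLF, NULL, &size);

std::string base64(size, '\0');
int written = ToBase64Crypto(cert->pbCertEncoded, cert->cbCertEncoded,
                             &base64[0], size);
base64.resize(written);  // the sizing call counts the terminating NUL

request.headers().add(U("client-cert"),
                      utility::conversions::to_string_t(base64));

With the line breaks gone from the Base64 output, WinHttpAddRequestHeaders should no longer reject the value with error 87, and the question's server-side Convert.FromBase64String / X509Certificate2 code should then parse it unchanged.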

Related

Validate EC SHA 256 signature in .net without bouncy castle

I am implementing Apple's App Attestation service.
As part of the process, I receive an EC key and a signature.
Sample key:
-----BEGIN PUBLIC KEY-----
MFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAEd34IR9wYL76jLyZ148O/hjXo9iaF
z/q/xEMXCwYPy6yxbxYzWDZPegG4FH+snXaXQPYD6QIzZNY/kcMjIGtUTg==
-----END PUBLIC KEY-----
Sample signature:
MEUCIQDXR/22YAi90PUdKrtTHwigrDxWFoiCqPLB/Of1bZPCKQIgNLxFAeUU2x+FSWfhRGX0SOKUIDxPRoigsCHpJxgGXXU=
Sample sha256 hash:
S3i6LAEzew5SDjQbq59/FraEAvGDg9y7fRIfbnhHPf4=
If I put these into a couple of files like so:
System.IO.File.WriteAllBytes("/wherever/sig", Convert.FromBase64String(sampleSignature));
System.IO.File.WriteAllBytes("/wherever/hash", Convert.FromBase64String(sampleSha256Hash));
Then I can validate the signature with OpenSSL like so:
openssl dgst -sha256 -verify sampleKey.pem -signature /wherever/sig /wherever/hash
(the above outputs)
Verified OK
I can verify the signature using Bouncy Castle like so:
var bouncyCert = DotNetUtilities.FromX509Certificate(certificate);
var bouncyPk = (ECPublicKeyParameters)bouncyCert.GetPublicKey();
var verifier = SignerUtilities.GetSigner("SHA-256withECDSA");
verifier.Init(false, bouncyPk);
verifier.BlockUpdate(sha256HashByteArray, 0, sha256HashByteArray.Length);
var valid = verifier.VerifySignature(signature); // Happy days, this is true
Since I don't want to share my whole certificate here, the same result can be reproduced as follows:
// these are the values from the sample key shared at the start of the post
// as returned by BC. Note that .Net's Y byte array is completely different.
Org.BouncyCastle.Math.BigInteger x = new Org.BouncyCastle.Math.BigInteger(Convert.FromBase64String("d34IR9wYL76jLyZ148O/hjXo9iaFz/q/xEMXCwYPy6w="));
Org.BouncyCastle.Math.BigInteger y = new Org.BouncyCastle.Math.BigInteger(Convert.FromBase64String("ALFvFjNYNk96AbgUf6yddpdA9gPpAjNk1j+RwyMga1RO"));
X9ECParameters nistParams = NistNamedCurves.GetByName("P-256");
ECDomainParameters domainParameters = new ECDomainParameters(nistParams.Curve, nistParams.G, nistParams.N, nistParams.H, nistParams.GetSeed());
var G = nistParams.G;
Org.BouncyCastle.Math.EC.ECCurve curve = nistParams.Curve;
Org.BouncyCastle.Math.EC.ECPoint q = curve.CreatePoint(x, y);
ECPublicKeyParameters pubkeyParam = new ECPublicKeyParameters(q, domainParameters);
var verifier = SignerUtilities.GetSigner("SHA-256withECDSA");
verifier.Init(false, pubkeyParam);
verifier.BlockUpdate(sha256HashByteArray, 0, sha256HashByteArray.Length);
var valid = verifier.VerifySignature(signature); // again, happy days.
However, I really want to avoid using Bouncy Castle.
So I am trying to use the ECDsa class available in .NET Core:
using System.Security.Cryptography;
using System.Security.Cryptography.X509Certificates;
var certificate = new X509Certificate2(cert);
var publicKey = certificate.GetECDsaPublicKey();
var valid = publicKey.VerifyHash(sha256HashByteArray, signature); // FALSE :(
If you want to try running the above, here's a sample that creates the key without the whole certificate:
using System.Security.Cryptography;
var ecParams = new ECParameters();
ecParams.Curve = ECCurve.CreateFromValue("1.2.840.10045.3.1.7");
ecParams.Q.X = Convert.FromBase64String("d34IR9wYL76jLyZ148O/hjXo9iaFz/q/xEMXCwYPy6w=");
// I KNOW that this is different from the BC sample - I got the respective
// values from certificates in the respective libraries, and it seems the
// way they format the coordinates is different.
ecParams.Q.Y = Convert.FromBase64String("sW8WM1g2T3oBuBR/rJ12l0D2A+kCM2TWP5HDIyBrVE4=");
var ecDsa = ECDsa.Create(ecParams);
var isValid = ecDsa.VerifyHash(sha256HashByteArray, signature); // FALSE :(
I tried using VerifyData() instead, feeding it the raw data and HashAlgorithmName.SHA256, with no luck.
I found a response here (https://stackoverflow.com/a/49449863/2057955) that seems to suggest that .NET expects the signature as an r,s concatenation, so I pulled them out of the DER sequence that I get back from my device (see the sample signature), but that had no luck at all; I just can't get that 'true' back.
Question: how can I verify this EC signature using .NET Core on Linux/macOS (so I'm unable to use the ECDsaCng class)?
SignerUtilities.GetSigner() hashes implicitly, i.e. sha256HashByteArray is hashed again. Therefore, instead of ECDsa.VerifyHash() (which does not hash implicitly), the method ECDsa.VerifyData() (which hashes implicitly) must be used.
Also, SignerUtilities.GetSigner() returns a signature in ASN.1 format, while ECDsa.VerifyData() expects a signature in r|s format (as you already figured out).
If both are taken into account, the verification is successful:
byte[] publicKey = Convert.FromBase64String("MFkwEwYHKoZIzj0CAQYIKoZIzj0DAQcDQgAEd34IR9wYL76jLyZ148O/hjXo9iaFz/q/xEMXCwYPy6yxbxYzWDZPegG4FH+snXaXQPYD6QIzZNY/kcMjIGtUTg==");
byte[] sha256HashByteArray = Convert.FromBase64String("S3i6LAEzew5SDjQbq59/FraEAvGDg9y7fRIfbnhHPf4=");
byte[] signatureRS = Convert.FromBase64String("10f9tmAIvdD1HSq7Ux8IoKw8VhaIgqjywfzn9W2Twik0vEUB5RTbH4VJZ+FEZfRI4pQgPE9GiKCwIeknGAZddQ==");
var ecDsa = ECDsa.Create();
ecDsa.ImportSubjectPublicKeyInfo(publicKey, out _);
var isValid = ecDsa.VerifyData(sha256HashByteArray, signatureRS, HashAlgorithmName.SHA256);
Console.WriteLine(isValid); // True
Regarding the signature formats:
The posted signature in ASN.1 format
MEUCIQDXR/22YAi90PUdKrtTHwigrDxWFoiCqPLB/Of1bZPCKQIgNLxFAeUU2x+FSWfhRGX0SOKUIDxPRoigsCHpJxgGXXU=
is hex encoded
3045022100d747fdb66008bdd0f51d2abb531f08a0ac3c56168882a8f2c1fce7f56d93c229022034bc4501e514db1f854967e14465f448e294203c4f4688a0b021e92718065d75
From this, the signature in r|s format can be derived by concatenating the two INTEGER values, each stripped of its leading zero sign byte and left-padded to 32 bytes:
d747fdb66008bdd0f51d2abb531f08a0ac3c56168882a8f2c1fce7f56d93c22934bc4501e514db1f854967e14465f448e294203c4f4688a0b021e92718065d75
or Base64 encoded:
10f9tmAIvdD1HSq7Ux8IoKw8VhaIgqjywfzn9W2Twik0vEUB5RTbH4VJZ+FEZfRI4pQgPE9GiKCwIeknGAZddQ==
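For reference, here is a minimal sketch of that DER-to-r|s conversion. It is in plain C++ (even though this thread's code is C#) with no library dependencies, and it assumes a well-formed ECDSA-Sig-Value (a SEQUENCE of two INTEGERs) and a 32-byte field size as used by P-256.

#include <cstdint>
#include <stdexcept>
#include <vector>

// Convert a DER-encoded ECDSA signature into the raw r|s concatenation
// that ECDsa.VerifyData()/VerifyHash() expect. fieldSize is 32 for P-256.
std::vector<uint8_t> DerSignatureToRS(const std::vector<uint8_t>& der,
                                      size_t fieldSize = 32)
{
    size_t pos = 0;
    auto expect = [&](uint8_t tag) {
        if (pos >= der.size() || der[pos++] != tag)
            throw std::runtime_error("unexpected DER tag");
    };
    auto readLen = [&]() -> size_t {
        size_t len = der.at(pos++);
        if (len & 0x80) {                        // long form length
            size_t n = len & 0x7F, v = 0;
            while (n--) v = (v << 8) | der.at(pos++);
            len = v;
        }
        return len;
    };

    expect(0x30);                                // SEQUENCE
    (void)readLen();

    std::vector<uint8_t> out;
    for (int i = 0; i < 2; ++i) {                // r, then s
        expect(0x02);                            // INTEGER
        size_t len = readLen();
        if (pos + len > der.size())
            throw std::runtime_error("truncated DER signature");
        size_t start = pos;
        pos += len;
        // Drop the 0x00 sign byte DER adds when the high bit is set.
        while (len > fieldSize && der[start] == 0x00) { ++start; --len; }
        if (len > fieldSize)
            throw std::runtime_error("integer larger than field size");
        out.insert(out.end(), fieldSize - len, uint8_t{0});  // left-pad
        out.insert(out.end(), der.begin() + start, der.begin() + start + len);
    }
    return out;                                  // 2 * fieldSize bytes
}

Applied to the posted ASN.1 signature, this strips the single 0x00 prefix on r and yields exactly the Base64 r|s value used in the verification snippet above.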

TLS 1.2 ECDHE_RSA signature

I'm currently working on a Java TLS server. I'm trying to get the following cipher suite to work: TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA
When I test it using openssl s_client, I get the following error after the ServerKeyExchange message:
140735242416208:error:1414D172:SSL
routines:tls12_check_peer_sigalg:wrong signature type:t1_lib.c:1130:
Here is the TLS message as seen in Wireshark.
The handshake fails with a fatal decode_error.
So I guess the client doesn't like the chosen signature algorithm.
But I am only using the default SignatureAndHashAlgorithm for now, as per RFC 5246 Section 7.4.1.4.1:
If the negotiated key exchange algorithm is one of (RSA, DHE_RSA,
DH_RSA, RSA_PSK, ECDH_RSA, ECDHE_RSA), behave as if client had sent
the value {sha1,rsa}.
(I'm still checking whether the client does offer these default values, though.)
Since I'm doing ECDHE_RSA, I believe I should hash and sign the ServerECDHParams as per RFC 4492 Section 5.4:
ServerKeyExchange.signed_params.sha_hash
    SHA(ClientHello.random + ServerHello.random + ServerKeyExchange.params);

struct {
    select (KeyExchangeAlgorithm) {
        case ec_diffie_hellman:
            ServerECDHParams params;
            Signature signed_params;
    };
} ServerKeyExchange;
And I should do this as per RFC 2246 Section 7.4.3:

select (SignatureAlgorithm) {
    case rsa:
        digitally-signed struct {
            opaque md5_hash[16];
            opaque sha_hash[20];
        };
} Signature;

md5_hash
    MD5(ClientHello.random + ServerHello.random + ServerParams);
sha_hash
    SHA(ClientHello.random + ServerHello.random + ServerParams);
My Java code for signing the server params:

private byte[] getSignedParams(ChannelBuffer params)
        throws NoSuchAlgorithmException, DigestException,
               SignatureException, InvalidKeyException {
    byte[] signedParams = null;
    ChannelBuffer signAlg = ChannelBuffers.buffer(2);
    MessageDigest md5 = MessageDigest.getInstance("MD5");
    MessageDigest sha = MessageDigest.getInstance("SHA-1");
    switch (session.cipherSuite.sign) {
    case rsa:
        signAlg.writeByte(2); // 2 = sha1 in the HashAlgorithm registry
        sha.update(clientRandom);
        sha.update(serverRandom);
        sha.update(params.toByteBuffer());
        md5.update(clientRandom);
        md5.update(serverRandom);
        md5.update(params.toByteBuffer());
        signedParams = concat(md5.digest(), sha.digest());
        break;
    }
    signAlg.writeByte(session.cipherSuite.sign.value); // for RSA the byte is 1
    ChannelBuffer signLength = ChannelBuffers.buffer(2);
    signLength.writeShort(signedParams.length);
    return concat(signAlg.array(), concat(signLength.array(), signedParams));
}
So my question is basically : Am I wrong about all this ? and if so, what am I doing wrong ?
Thank you for your time ! :)
It's me again; I seem to have fixed my particular problem. Two things I noted:
Regarding my Java code: the MessageDigest class only does hashing, not signing, so I now use the Signature class instead.
It seems that in TLS 1.2 I only need to sign using SHA-1 (or whatever hash was negotiated); the MD5+SHA-1 concatenation is gone, so I don't need to do MD5 at all.
The second item is what I should have found in the RFC but didn't (maybe it is written somewhere, I don't know). I think this could be useful for people even if they're not doing Java ;)
How my code looks now :
private byte[] getSignedParams(ChannelBuffer params)
        throws NoSuchAlgorithmException, DigestException,
               SignatureException, InvalidKeyException {
    byte[] signedParams = null;
    Signature signature = Signature.getInstance(selectedSignAndHash.toString());
    ChannelBuffer signAlg = ChannelBuffers.buffer(2);
    signAlg.writeByte(selectedSignAndHash.hash.value);
    signature.initSign(privateKey);
    signature.update(clientRandom);
    signature.update(serverRandom);
    signature.update(params.toByteBuffer());
    signedParams = signature.sign();
    signAlg.writeByte(session.cipherSuite.sign.value);
    ChannelBuffer signLength = ChannelBuffers.buffer(2);
    signLength.writeShort(signedParams.length);
    return concat(signAlg.array(), concat(signLength.array(), signedParams));
}
The code is different because in the meantime I added a function that chooses the SignatureAndHashAlgorithm to use from the list the client provides. But you could modify this to always respond with SHA1withRSA, as this seems to be the default SignatureAndHashAlgorithm; a sketch of that selection step follows below.
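The post doesn't show that selection function, but its shape is roughly the following. This is a hedged sketch in C++ rather than the answer's Java, and the "supported" set is purely illustrative; the registry byte values are the ones defined in RFC 5246.

#include <cstdint>
#include <set>
#include <utility>
#include <vector>

// {hash, signature} byte pairs as they appear in the ClientHello's
// signature_algorithms extension (RFC 5246: sha1 = 2, sha256 = 4; rsa = 1).
using SigAndHash = std::pair<uint8_t, uint8_t>;

SigAndHash chooseSignatureAndHash(const std::vector<SigAndHash>& clientList)
{
    // Illustrative server-supported set, not from the original post.
    static const std::set<SigAndHash> supported = {
        {4, 1},   // sha256 + rsa
        {2, 1},   // sha1 + rsa
    };
    for (const auto& pair : clientList)
        if (supported.count(pair))
            return pair;
    // RFC 5246 7.4.1.4.1: absent the extension, behave as if the client
    // had sent {sha1, rsa}.
    return {2, 1};
}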

Get the CERT_RDN information from the certificate object

I'm opening the certificate store using the "CertOpenStore" API and get the certificates using the "CertEnumCertificatesInStore" API.
The CERT_CONTEXT data returned by the API gives the issuer name as a CERT_NAME_BLOB.
How do I get the CERT_RDN or CERT_NAME_INFO from the certificate?
My requirement is to get the issuer name attributes (O, OU, etc.). I do not want to parse the string returned by the CertNameToStr API.
The comment above is correct: you do need to decode the ASN.1-encoded data in the CERT_NAME_BLOB. However, the CryptoAPI has a function to do this for you - CryptDecodeObject.
If you have a PCCERT_CONTEXT handle pCertContext, you can decode the issuer into a CERT_NAME_INFO structure as follows:
DWORD dwNameInfoSize = 0;
BOOL success = CryptDecodeObject(
    X509_ASN_ENCODING,
    X509_NAME,
    pCertContext->pCertInfo->Issuer.pbData,
    pCertContext->pCertInfo->Issuer.cbData,
    0,
    NULL,
    &dwNameInfoSize);
// (check that CryptDecodeObject succeeded)

PCERT_NAME_INFO pCertNameInfo = (PCERT_NAME_INFO) malloc(dwNameInfoSize);
// (check that malloc succeeded)

CryptDecodeObject(
    X509_ASN_ENCODING,
    X509_NAME,
    pCertContext->pCertInfo->Issuer.pbData,
    pCertContext->pCertInfo->Issuer.cbData,
    0,
    pCertNameInfo,
    &dwNameInfoSize);
Now you can loop through the different components of the RDN like this:
for (DWORD i = 0; i < pCertNameInfo->cRDN; i++)
{
    for (DWORD j = 0; j < pCertNameInfo->rgRDN[i].cRDNAttr; j++)
    {
        CERT_RDN_ATTR& rdnAttribute = pCertNameInfo->rgRDN[i].rgRDNAttr[j];
        //
        // Do stuff with the RDN attribute
        //
    }
}
With each iteration, rdnAttribute will be set to a different component of the issuer name like you want.
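For example, to pick out just the O and OU attributes the question asks about, you can compare each attribute's OID and convert its value with CertRDNValueToStr. This is a sketch; the fixed buffer size and the printing are illustrative only.

// Inside the inner loop above: match the OIDs for O (2.5.4.10) and
// OU (2.5.4.11) and convert the value blob to a printable string.
if (strcmp(rdnAttribute.pszObjId, szOID_ORGANIZATION_NAME) == 0 ||
    strcmp(rdnAttribute.pszObjId, szOID_ORGANIZATIONAL_UNIT_NAME) == 0)
{
    WCHAR szValue[256];
    CertRDNValueToStrW(rdnAttribute.dwValueType, &rdnAttribute.Value,
                       szValue, ARRAYSIZE(szValue));
    wprintf(L"%S = %s\n", rdnAttribute.pszObjId, szValue);
}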
Finally, free the memory when you're done:
free(pCertNameInfo);

Why is my Destination IP Address seen as my Source IP Address when attempting to connect from a Handheld Device?

I am trying to call a REST method on a server from a handheld device with this code:
public static void WriteIt2(string fileName, string data)
{
    // "fileName" is what the file to save will be named; "data" is the contents of that file
    if (File.Exists(fileName))
    {
        MessageBox.Show(String.Format("{0} exists - deleting", fileName));
        File.Delete(fileName);
    }
    string justFileName = Path.GetFileNameWithoutExtension(fileName);
    String uri = String.Format("http://192.168.125.50:21608/api/inventory/sendXML/duckbilled/platypus/{0}", justFileName);
    SendXMLFile2(uri, data);
}

public static void SendXMLFile2(string uri, string data)
{
    HttpWebRequest req = (HttpWebRequest)WebRequest.Create(uri);
    req.Method = "POST"; // method names are case-sensitive; "Post" is sent verbatim
    req.ContentType = "text/plain; charset=utf-8";
    byte[] encodedBytes = Encoding.UTF8.GetBytes(data);
    req.ContentLength = encodedBytes.Length;
    Stream requestStream = req.GetRequestStream();
    requestStream.Write(encodedBytes, 0, encodedBytes.Length);
    requestStream.Close();
    WebResponse result = req.GetResponse();
    MessageBox.Show(result.ToString());
}
The breakpoint in my server code is not getting reached; I'm trying to find out why, using RawCap and Wireshark to see exactly what's going on. After running RawCap and opening the .pcap file it creates in Wireshark, and then searching for any appearance of the port I'm trying to access (via Edit > Find Packet > Packet Bytes with "21608" as the search string), I found this in the Data for the only packet that contains that string:
SBCMYReportProviderStatusMessage#bgM
%<NetworkShield`http://192.168.125.50:21608/api/inventory/sendXML/duckbilled/platypus/INV_0000003.0916201413022
6z )
...so the code running on the handheld device is being picked up by Wireshark, but Wireshark shows 192.168.125.50 as the "Source" and 192.168.125.87 as the "Destination" (Protocol == TCP, where I would rather expect HTTP).
192.168.125.50 is my PC's IP address (it should be the Destination, not the Source, right?)
192.168.125.87, the Destination, according to "nbtstat -a 192.168.125.87", is "BUCK, UNIQUE". I don't know what "BUCK" is (obviously, a computer on the local network).
The IP Address of the handheld is 192.168.55.101
Why does Wireshark not show 192.168.55.101 as the Source and 192.168.125.50 as the Destination? Is it possible to determine the reason for the failure (the REST method not getting hit) from this Wireshark data?
UPDATE
By right-clicking the packet record in Wireshark and selecting "Follow TCP Stream", I get the following:
SBCM.................w.......R.e.p.o.r.t.P.r.o.v.i.d.e.r.S.t.a.t.u.s.M.e.s.s.a.g.e.#...bg.M.....%.<....F.i.l.e.S.y.s.t.e.m.S.h.i.e.l.d.....l...C.:.\.W.i.n.d.o.w.s.\.a.s.s.e.m.b.l.y.\.N.a.t.i.v.e.I.m.a.g.e.s._.v.2...0...5.0.7.2.7._.3.2.. . . . [ much more of the same type of thing elided ]
.................SBCM.................Y.......R.e.p.o.r.t.P.r.o.v.i.d.e.r.S.t.a.t.u.s.M.e.s.s.a.g.e.#...bg.M.....%.<
...N.e.t.w.o.r.k.S.h.i.e.l.d.....`...h.t.t.p.:././.1.9.2...1.6.8...1.2.5...5.0.:.2.1.6.0.8./.a.p.i./.i.n.v.e.n.t.o.r.y./.s.e.n.d.X.M.L./.d.u.c.k.b.i.l.l.e.d./.p.l.a.t.y.p.u.s./.I.N.V._.0.0.0.0.0.0.3...0.9.1.6.2.0.1.4.1.3.0.2.2.6.....z ......)...SBCM.........................R.e.p.o.r.t.P.r.o.v.i.d.e.r.S.t.a.t.u.s.M.e.s.s.a.g.e.#...bg.M.....%.<....W.e.b.R.e.p........................JSBCM.........................R.e.p.o.r.t.M.a.i.n.S.t.a.t.u.s.M.e.s.s.a.g.e.#...bg.M.....%.<............................$.....O6.........R8e....E.......C.......C....B_.................SBCM.........................R.e.p.o.r.t.P.r.o.v.i.d.e.r.S.t.a.t.u.s.M.e.s.s.a.g.e.#...bg.M.....%.<....W.e.b.R.e.p........................JSBCM.........................R.e.p.o.r.t.M.a.i.n.S.t.a.t.u.s.M.e.s.s.a.g.e.#...bg.M.....%.<............................$.....O6.........R8e....E.......C.......C....B_.................
I cannot make heads or tails of this; I don't know what I should expect to see after my URI... I don't see any "ack" of either success or failure...

Implementing mutual authentication with LDAP API and SSPI

I would like to ask you a question about implementing mutual authentication with Kerberos, using SSPI and LDAP API.
I am using the guidelines described in: ldap_sasl_bind_s(GSSAPI) - What should be provided in the credentials BERVAL structure.
Here is the algorithm I am using:
//--------------------------------------------------------------------------------------------
// client side
AcquireCredentialsHandle(NULL, "Kerberos", SECPKG_CRED_BOTH, NULL, &secIdent,
                         NULL, NULL, &kClientCredential, &kClientTimeOut);
// AcquireCredentialsHandle returns SEC_E_OK

// begin validation
unsigned long ulClientFlags = ISC_REQ_CONNECTION | ISC_REQ_MUTUAL_AUTH | ISC_REQ_DELEGATE;
int iCliStatus = InitializeSecurityContext(&kClientCredential,
    isContextNull(kClientContext) ? NULL : &kClientContext,
    pacTargetName, ulClientFlags, 0, SECURITY_NATIVE_DREP, pkServerToken,
    0, &kClientContext, &kClientToken, &ulContextAttr, NULL);
// InitializeSecurityContext returns SEC_I_CONTINUE_NEEDED
// InitializeSecurityContext returns SEC_I_CONTINUE_NEEDED
//--------------------------------------------------------------------------------------------
// server side
// ldap_init returns ok
ldap_set_option(ld, LDAP_OPT_SIGN, LDAP_OPT_OFF);
ldap_set_option(ld, LDAP_OPT_ENCRYPT, LDAP_OPT_OFF);
unsigned long ulVersion = LDAP_VERSION3;
ldap_set_option(ld, LDAP_OPT_VERSION, &ulVersion);
// ldap_connect returns LDAP_SUCCESS

// build the credentials based on what InitializeSecurityContext returned
BERVAL creds;
creds.bv_len = kClientToken.pBuffers[0].cbBuffer;
creds.bv_val = reinterpret_cast<char*>(kClientToken.pBuffers[0].pvBuffer);
BERVAL* pServerCreds = NULL;
int iError = ldap_sasl_bind_s(ld, "", "GSSAPI", &creds, NULL, NULL, &pServerCreds);
// ldap_sasl_bind_s returns LDAP_SUCCESS

unsigned long ulError = 0;
ldap_get_option(ld, LDAP_OPT_ERROR_NUMBER, &ulError);
// ulError is equal to LDAP_SASL_BIND_IN_PROGRESS
And here is the problem: both LDAP error codes are OK, but pServerCreds points to an empty BERVAL structure (not NULL, but with bv_len equal to 0), when it should contain the server credentials I have to pass to the next InitializeSecurityContext call. If I use that data to build the SecBufferDesc structure for the following call, it returns SEC_E_INVALID_TOKEN.
Is ldap_sasl_bind_s supposed to return an empty BERVAL or am I doing something wrong?
I have tested the authentication using full SSPI calls (AcceptSecurityContext for the server) and it works just as expected. The problem is that I need the server to be cross-platform, so I cannot use SSPI.
Thanks for taking the time to answer!
Juan
I found the problem.
According to a thread on the subject, there is a bug that makes ldap_sasl_bind_s return empty server credentials on Windows XP. I have tested my application under Windows Server 2008 and the credentials are properly returned.
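For anyone hitting this on a platform where the server credentials do come back, the overall shape of the multi-round bind is sketched below. It reuses the question's variable names (kClientCredential, kClientContext, pacTargetName, ulClientFlags, kClientToken, ld, and the isContextNull helper), assumes ld is already connected as in the question, and elides all error handling; it is not the poster's actual code.

// Sketch of the GSSAPI SASL bind loop: keep feeding the server's reply
// token back into InitializeSecurityContext until the bind stops
// returning LDAP_SASL_BIND_IN_PROGRESS.
BERVAL creds;
BERVAL* pServerCreds = nullptr;
SecBuffer serverToken;
SecBufferDesc serverTokenDesc;
PSecBufferDesc pkServerToken = nullptr;   // no input token on the first round

for (;;) {
    InitializeSecurityContext(&kClientCredential,
        isContextNull(kClientContext) ? NULL : &kClientContext,
        pacTargetName, ulClientFlags, 0, SECURITY_NATIVE_DREP,
        pkServerToken, 0, &kClientContext, &kClientToken,
        &ulContextAttr, NULL);

    creds.bv_len = kClientToken.pBuffers[0].cbBuffer;
    creds.bv_val = reinterpret_cast<char*>(kClientToken.pBuffers[0].pvBuffer);

    ldap_sasl_bind_s(ld, "", "GSSAPI", &creds, NULL, NULL, &pServerCreds);

    unsigned long ulError = 0;
    ldap_get_option(ld, LDAP_OPT_ERROR_NUMBER, &ulError);
    if (ulError != LDAP_SASL_BIND_IN_PROGRESS)
        break;                            // LDAP_SUCCESS here means we are bound

    // Wrap the server's reply so the next InitializeSecurityContext call
    // consumes it as its input token. (Each pServerCreds should eventually
    // be released with ber_bvfree.)
    serverToken.BufferType = SECBUFFER_TOKEN;
    serverToken.cbBuffer   = pServerCreds->bv_len;
    serverToken.pvBuffer   = pServerCreds->bv_val;
    serverTokenDesc.ulVersion = SECBUFFER_VERSION;
    serverTokenDesc.cBuffers  = 1;
    serverTokenDesc.pBuffers  = &serverToken;
    pkServerToken = &serverTokenDesc;
}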