Windows 8 Metro RSA Encryption: AsymmetricKeyAlgorithmProvider ImportPublicKey Fails - wcf

I am attempting to pass some encrypted data between a Win 8 Metro app and a RESTful WCF service. Initially the Metro app requests a public key and the WCF service returns it as a raw Stream so as to avoid any pesky formatting issues. The Base64-encoded public key is decoded in the Metro app into a byte array. Here is where the problem occurs: when I attempt to call AsymmetricKeyAlgorithmProvider.ImportPublicKey I get the error "ASN1 bad tag value met".
I am using RSA PKCS1 for the encryption. Here is the relevant code:
WCF Service
string keyName = "This is passed in via a parameter";
var key = !CngKey.Exists(keyName) ? CngKey.Create(CngAlgorithm2.Rsa, keyName) : CngKey.Open(keyName);
// Create the RSA container to get keys and then dispose
using (var rsaCng = new RSACng(key) { EncryptionPaddingMode = AsymmetricPaddingMode.Pkcs1, KeySize = 2048 })
{
byte[] publicBlob = rsaCng.Key.Export(CngKeyBlobFormat.GenericPublicBlob);
publicKey = Convert.ToBase64String(publicBlob);
}
Metro App
public static string Encrypt(IBuffer dataBuffer, string publicKeyString)
{
var asymmAlg = AsymmetricKeyAlgorithmProvider.OpenAlgorithm(AsymmetricAlgorithmNames.RsaPkcs1);
// The next line fails with ASN1 bad tag value met
var publicKey = asymmAlg.ImportPublicKey(CryptographicBuffer.DecodeFromBase64String(publicKeyString), CryptographicPublicKeyBlobType.Pkcs1RsaPublicKey);
var encryptedData = CryptographicEngine.Encrypt(publicKey, dataBuffer, null);
return CryptographicBuffer.EncodeToBase64String(encryptedData);
}
EDIT 1: More information below
Exporting the public key from a 2048-bit key pair on the WCF service yields a 283-byte key blob, while exporting the same type of public key from the Metro app yields only 270 bytes. When I import the Metro-generated public key it succeeds. Any idea why the WCF service's public key has 13 extra bytes? I think those extra 13 bytes are causing the failure.
Here is the Metro code that yields the shorter public key blob:
var provider = AsymmetricKeyAlgorithmProvider.OpenAlgorithm(AsymmetricAlgorithmNames.RsaPkcs1);
CryptographicKey standardKeyPair = provider.CreateKeyPair(2048);
byte[] standardKey = standardKeyPair.ExportPublicKey(CryptographicPublicKeyBlobType.Pkcs1RsaPublicKey).ToArray();

Quite late, but maybe it will help you or save someone some time...
Change the blob type during import. It's really weird, but after experimenting I had success with it.
Your code in WCF may stay as it is.
Change just the Metro code:
public static string Encrypt(IBuffer dataBuffer, string publicKeyString)
{
var asymmAlg = AsymmetricKeyAlgorithmProvider.OpenAlgorithm(AsymmetricAlgorithmNames.RsaPkcs1);
// Import using BCryptPublicKey instead of Pkcs1RsaPublicKey; this no longer fails
var publicKey = asymmAlg.ImportPublicKey(CryptographicBuffer.DecodeFromBase64String(publicKeyString), CryptographicPublicKeyBlobType.BCryptPublicKey);
var encryptedData = CryptographicEngine.Encrypt(publicKey, dataBuffer, null);
return CryptographicBuffer.EncodeToBase64String(encryptedData);
}
So the only change is using BCryptPublicKey as the blob type during import. Then it works. But do not ask me why :-).
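My guess as to why (I have not verified this against the original code): CngKey.Export with CngKeyBlobFormat.GenericPublicBlob produces a CNG BCRYPT_RSAPUBLIC_BLOB (a 24-byte header followed by the 3-byte exponent and 256-byte modulus, 283 bytes in total for a 2048-bit key), whereas Pkcs1RsaPublicKey expects a DER-encoded ASN.1 RSAPublicKey structure (about 270 bytes), hence the "ASN1 bad tag value met" error and the 13-byte difference noted in the question. If you want to check which format a blob actually has before importing, a small diagnostic helper (the name and shape are mine, Windows.Security.Cryptography.Core assumed) could peek at the first bytes:
private static CryptographicPublicKeyBlobType GuessBlobType(byte[] blob)
{
    // A CNG BCRYPT_RSAPUBLIC_BLOB starts with the magic "RSA1" (0x31415352, little-endian on the wire).
    if (blob.Length > 4 && blob[0] == 0x52 && blob[1] == 0x53 && blob[2] == 0x41 && blob[3] == 0x31)
        return CryptographicPublicKeyBlobType.BCryptPublicKey;
    // A DER-encoded PKCS#1 RSAPublicKey starts with an ASN.1 SEQUENCE tag (0x30).
    if (blob.Length > 0 && blob[0] == 0x30)
        return CryptographicPublicKeyBlobType.Pkcs1RsaPublicKey;
    throw new ArgumentException("Unrecognized public key blob format.");
}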

Related

How do I know if my data was decrypted correctly?

I'm using the Java Cipher class (javax.crypto) to encrypt and decrypt data. I'm specifically using RSA encryption to encrypt the AES parameters and the message byte length; the AES cipher is what I'm using to encrypt the message itself. Currently I have a problem: I can't be sure whether the provided encrypted AES parameters and message length were encrypted with my RSA key, or whether they were encrypted at all and aren't just random gibberish.
val rsaCipher = Cipher.getInstance("RSA")
rsaCipher.init(Cipher.DECRYPT_MODE, privateKey)
val rsaDecoded = rsaCipher.doFinal(encryptedAesParametersAndMessageLength)
val aesParameters = ByteArray(48)
val messageLength = ByteArray(4)
for (i in aesParameters.indices) {
aesParameters[i] = rsaDecoded[i]
}
for (i in messageLength.indices) {
messageLength[i] = rsaDecoded[i + aesParameters.size]
}
val length = ByteBuffer.wrap(messageLength).order(ByteOrder.LITTLE_ENDIAN).int
val aesEncryptor = AesEncryptor(aesParameters)
//bytes of the encryptedMessage
//TODO Make sure this doesn't result in an error
val encryptedMessage = ByteArray(length)
When trying to decrypt the AES parameters and message length using the RSA key, no exception is thrown if the data wasn't decrypted correctly. Eventually this leads to an out-of-memory error because the message length is too large. Is it safe to put 4 bytes (holding some constant int value) after these parameters that I can check to make sure the data was decrypted correctly, or would that risk my app's safety?

How do I form the SSH private key signature in JavaScript?

I am using an in-browser library, SSHy, for SSHing to devices. I'm currently working on adding support for publickey authentication, but I keep getting an error from the server about an invalid signature. I'm able to send the first SSH_MSG_USERAUTH_REQUEST without the signature and get back a SSH_MSG_USERAUTH_PK_OK. But when I send the next message with the signature, I always get a SSH_MSG_USERAUTH_FAILURE.
I'm doing the signing with another library (sshpk-browser) and forming the signature below using SSHy based on the SSH schema.
Can anyone see any potential issues with how I am forming the signature?
const decodedPublicKey = config.privateKey.toPublic().toString('ssh', { hashAlgo: 'sha512' }).split(' ')[1];
const publicKey = atob(decodedPublicKey);
var m = new SSHyClient.Message();
m.add_bytes(String.fromCharCode(SSHyClient.MSG_USERAUTH_REQUEST));
m.add_string(this.termUsername);
m.add_string('ssh-connection');
m.add_string('publickey');
m.add_boolean(true); // has signature
m.add_string('rsa-sha2-512'); // public key algorithm name
m.add_string(publicKey); // public key
// Create signature
var sigMsg = new SSHyClient.Message();
sigMsg.add_string(SSHyClient.kex.sessionId);
sigMsg.add_bytes(String.fromCharCode(SSHyClient.MSG_USERAUTH_REQUEST));
sigMsg.add_string(this.termUsername);
sigMsg.add_string('ssh-connection');
sigMsg.add_string('publickey');
sigMsg.add_boolean(true); // has signature
sigMsg.add_string('rsa-sha2-512');
sigMsg.add_string(publicKey);
const sigMsgString = sigMsg.toString();
// Sign signature
const sign = config.privateKey.createSign('sha512');
sign.update(sigMsgString);
const signature = sign.sign();
m.add_string(atob(signatureToString)); // signature
this.parceler.send(m);

itextsharp signing pdf with signed hash

I'm trying to sign a pdf through a signing service. This service requires me to send a hex-encoded SHA256 digest, and in return I receive a hex-encoded signatureValue. Besides that, I also receive a signing certificate, an intermediate certificate, an OCSP response, and a TimeStampToken. However, I already get stuck trying to sign the pdf with the signatureValue.
I have read Bruno's white paper, browsed the internet excessively, and tried many different ways, but the signature keeps coming up as invalid.
My latest attempt:
First, prepare pdf
PdfReader reader = new PdfReader(src);
FileStream os = new FileStream(dest, FileMode.Create);
PdfStamper stamper = PdfStamper.CreateSignature(reader, os, '\0');
PdfSignatureAppearance appearance = stamper.SignatureAppearance;
appearance.Certificate = signingCertificate;
IExternalSignatureContainer external = new ExternalBlankSignatureContainer(PdfName.ADOBE_PPKLITE, PdfName.ADBE_PKCS7_DETACHED);
MakeSignature.SignExternalContainer(appearance, external, 8192);
string hashAlgorithm = "SHA-256";
PdfPKCS7 sgn = new PdfPKCS7(null, chain, hashAlgorithm, false);
PdfSignatureAppearance appearance2 = stamper.SignatureAppearance;
Stream stream = appearance2.GetRangeStream();
byte[] hash = DigestAlgorithms.Digest(stream, hashAlgorithm);
byte[] sh = sgn.getAuthenticatedAttributeBytes(hash, null, null, CryptoStandard.CMS);
Then I hash byte[] sh and convert it to a string as follows
private static String sha256_hash(Byte[] value)
{
using (SHA256 hash = SHA256.Create())
{
return String.Concat(hash.ComputeHash(value).Select(item => item.ToString("x2"))).ToUpper();
}
}
and send it to the signing service. I then convert the received hex-encoded signatureValue to bytes
private static byte[] StringToByteArray(string hex)
{
return Enumerable.Range(0, hex.Length).Where(x => x % 2 == 0).Select(x => Convert.ToByte(hex.Substring(x, 2), 16)).ToArray();
}
Finally, create signature
private void CreateSignature(string src, string dest, byte[] sig)
{
PdfReader reader = new PdfReader(src); // src is now prepared pdf
FileStream os = new FileStream(dest, FileMode.Create);
IExternalSignatureContainer external = new MyExternalSignatureContainer(sig);
MakeSignature.SignDeferred(reader, "Signature1", os, external);
reader.Close();
os.Close();
}
private class MyExternalSignatureContainer : IExternalSignatureContainer
{
protected byte[] sig;
public MyExternalSignatureContainer(byte[] sig)
{
this.sig = sig;
}
public byte[] Sign(Stream s)
{
return sig;
}
public void ModifySigningDictionary(PdfDictionary signDic) { }
}
What am I doing wrong? Help is very much appreciated. Thanks!
Edit: Current state
Thanks to help from mkl and following Bruno's deferred signing example, I've gotten past the invalid signature message. Apparently I don't receive a full chain from the signing service, just an intermediate certificate, which is what caused the invalid-signature message. Unfortunately, the signature still has flaws.
I build the chain like this:
List<X509Certificate> certificateChain = new List<X509Certificate>
{
signingCertificate,
intermediateCertificate
};
In the sign method of MyExternalSignatureContainer I now construct and return the signature container:
public byte[] Sign(Stream s)
{
string hashAlgorithm = "SHA-256";
PdfPKCS7 sgn = new PdfPKCS7(null, chain, hashAlgorithm, false);
byte[] ocspResponse = Convert.FromBase64String("Base64 encoded DER representation of the OCSP response received from signing service");
byte[] hash = DigestAlgorithms.Digest(s, hashAlgorithm);
byte[] sh = sgn.getAuthenticatedAttributeBytes(hash, ocspResponse, null, CryptoStandard.CMS);
string messageDigest = Sha256_hash(sh);
// messageDigest sent to signing service
byte[] signatureAsByte = StringToByteArray("Hex encoded SignatureValue received from signing service");
sgn.SetExternalDigest(signatureAsByte, null, "RSA");
ITSAClient tsaClient = new MyITSAClient();
return sgn.GetEncodedPKCS7(hash, tsaClient, ocspResponse, null, CryptoStandard.CMS);
}
public class MyITSAClient : ITSAClient
{
public int GetTokenSizeEstimate()
{
return 0;
}
public IDigest GetMessageDigest()
{
return new Sha256Digest();
}
public byte[] GetTimeStampToken(byte[] imprint)
{
string hashedImprint = HexEncode(imprint);
// Hex encoded Imprint sent to signing service
return Convert.FromBase64String("Base64 encoded DER representation of TimeStampToken received from signing service");
}
}
I still get these messages:
"The signer's identity is unknown because it has not been included in the list of trusted identities and none of its parent certificates are trusted identities"
"The signature is timestamped, but the timestamp could not be verified"
Further help is very much appreciated again!
"What am I doing wrong?"
The problem is that on one hand you start constructing a CMS signature container using a PdfPKCS7 instance
PdfPKCS7 sgn = new PdfPKCS7(null, chain, hashAlgorithm, false);
and for the calculated document digest hash retrieve the signed attributes to be
byte[] sh = sgn.getAuthenticatedAttributeBytes(hash, null, null, CryptoStandard.CMS);
to send them for signing.
So far so good.
But then you ignore the CMS container you started constructing but instead inject the naked signature bytes you got from your service into the PDF.
This cannot work as your signature bytes don't sign the document directly but instead they sign these signed attributes (and, therefore, indirectly the document as the document hash is one of the signed attributes). Thus, by ignoring the CMS container under construction you dropped the actually signed data...
Furthermore, the subfilter ADBE_PKCS7_DETACHED you use promises that the embedded signature is a full CMS signature container, not a few naked signature bytes, so the format also is wrong.
How to do it instead?
Instead of injecting the naked signature bytes you got from your service into the PDF as is, you have to set them as external digest in the PdfPKCS7 instance in which you originally started constructing the signature container:
sgn.SetExternalDigest(sig, null, ENCRYPTION_ALGO);
(ENCRYPTION_ALGO must be the encryption part of the signature algorithm, I assume in your case "RSA".)
and then you can retrieve the generated CMS signature container:
byte[] encodedSig = sgn.GetEncodedPKCS7(hash, null, null, null, CryptoStandard.CMS);
Now this is the signature container to inject into the document using MyExternalSignatureContainer:
IExternalSignatureContainer external = new MyExternalSignatureContainer(encodedSig);
MakeSignature.SignDeferred(reader, "Signature1", os, external);
Remaining issues
Having corrected your code Adobe Reader still warns about your signatures:
"The signer's identity is unknown because it has not been included in the list of trusted identities and none or its parent certificates are trusted identities"
This warning is to be expected and correct!
The signer's identity is unknown because your signature service uses merely a demo certificate, not a certificate for production use: the certificate is issued by "GlobalSign Non-Public HVCA Demo", and non-public demo issuers for obvious reasons must not be trusted (unless you manually add them to your trust store for testing purposes).
"The signature is timestamped, but the timestamp could not be verified"
There are two reasons why Adobe does not approve of your timestamp:
On one hand, just like above, the timestamp certificate is a non-public, demo certificate ("DSS Non-Public Demo TSA Responder"). Thus, there is no reason for the verifier to trust your timestamp.
On the other hand, though, there is an actual error in your timestamping code: you apply the hashing algorithm twice. In your MyITSAClient class you have
public byte[] GetTimeStampToken(byte[] imprint)
{
string hashedImprint = Sha256_hash(imprint);
// hashedImprint sent to signing service
return Convert.FromBase64String("Base64 encoded DER representation of TimeStampToken received from signing service");
}
The imprint parameter of your GetTimeStampToken implementation is already hashed, so you only have to hex encode these bytes and send them for timestamping. But you apply your method Sha256_hash, which hashes them again and then hex encodes that new hash.
Thus, instead of applying Sha256_hash, merely hex encode the imprint.
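In other words, a short sketch (mirroring the HexEncode call already used in your updated question code):
public byte[] GetTimeStampToken(byte[] imprint)
{
    // imprint is already the SHA-256 digest of the data to be timestamped,
    // so only hex encode it before sending it to the timestamp service.
    string hexImprint = String.Concat(imprint.Select(item => item.ToString("x2"))).ToUpper();
    // hexImprint sent to signing service
    return Convert.FromBase64String("Base64 encoded DER representation of TimeStampToken received from signing service");
}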

AES encryption and decryption resulting in file different than original

I've decided to implement encryption for file transfers in my service. File transfers prior to this were not encrypted, and they were sent and received flawlessly with the exact same number of bytes.
Now I've introduced asymmetrical and symmetrical encryption into the mix to encrypt the data as it passes over the TCP protocol. I use asymmetrical to do an initial handshake passing the symmetrical key to the other party encrypted by the asymmetric public key. From then on out, the receiver of the file calls the sender periodically, and the sender generates a new initialization vector, encrypts the data with the symmetric key, and sends it over to be decrypted by the receiver using the IV and same symmetric key.
The chunk size I'm using is 2 MB, such that the byte size of the generated chunks, with the exception of the last chunk which varies, is 2097152. When AES encrypts such a chunk with PaddingMode.PKCS7 and CipherMode.CBC, the resulting byte size is 2097168. It has gained exactly 16 bytes during the encryption process.
Now initially I thought this is where my problem was, but when I decrypt the data on the receiving end, it goes back to the 2097152 byte length and I write it to the file. I've proven to myself that it does indeed encrypt and decrypt the data.
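For reference, I believe the 16-byte growth per chunk itself is expected: with CBC and PKCS7 padding, a plaintext whose length is an exact multiple of the 16-byte AES block size always gains one full padding block. A minimal standalone sketch (separate from my service code) reproduces the numbers:
using (SymmetricAlgorithm aes = new AesManaged())
{
    aes.Padding = PaddingMode.PKCS7;
    aes.Mode = CipherMode.CBC;
    byte[] chunk = new byte[2097152];        // exact multiple of the 16-byte block size
    using (ICryptoTransform encryptor = aes.CreateEncryptor())
    {
        byte[] encrypted = encryptor.TransformFinalBlock(chunk, 0, chunk.Length);
        Console.WriteLine(encrypted.Length); // 2097168 = 2097152 + one 16-byte padding block
    }
}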
On a small enough file, the original and the transferred file seem to be exactly the same size. However, as I step up to larger file sizes, there exists a discrepancy. On a video file (Wildlife.wmv from the Windows 7 install) of 26,246,026 bytes, I am instead receiving a finished transfer of 26,246,218 bytes.
Why is there this size difference? What am I doing wrong here?
Here's some of my code.
For my encryption I am using the following class to encrypt or decrypt, returning a result in the form of a byte array.
public class AesCryptor
{
public byte[] Encrypt(byte[] data, byte[] key, byte[] iv)
{
using (SymmetricAlgorithm aes = new AesManaged())
{
aes.Key = key;
aes.IV = iv;
aes.Padding = PaddingMode.PKCS7;
aes.Mode = CipherMode.CBC;
using (ICryptoTransform encryptor = aes.CreateEncryptor(key, iv))
{
return Crypt(data, key, iv, encryptor);
}
}
}
public byte[] Decrypt(byte[] data, byte[] key, byte[] iv)
{
using (SymmetricAlgorithm aes = new AesManaged())
{
aes.Key = key;
aes.IV = iv;
aes.Padding = PaddingMode.PKCS7;
aes.Mode = CipherMode.CBC;
using (ICryptoTransform decryptor = aes.CreateDecryptor(key, iv))
{
return Crypt(data, key, iv, decryptor);
}
}
}
private byte[] Crypt(byte[] data, byte[] key, byte[] iv, ICryptoTransform transform)
{
using (MemoryStream memoryStream = new MemoryStream())
{
using (CryptoStream cryptoStream = new CryptoStream(memoryStream, transform, CryptoStreamMode.Write))
{
cryptoStream.Write(data, 0, data.Length);
cryptoStream.FlushFinalBlock();
}
return memoryStream.ToArray();
}
}
}
The sender of the file encrypts the data (after the handshake of the private symmetric key) with this code (plus a lot more that doesn't pertain to the actual encryption process). Note the chunkedFile.NextChunk() call: it invokes a method on the class that does the file chunking for me, returning 2 MB chunks unless the final chunk is smaller.
byte[] buffer;
byte[] iv = new byte[symmetricEncryptionBitSize / 8];
using (var rngCrypto = new RNGCryptoServiceProvider())
rngCrypto.GetBytes(iv);
AesCryptor cryptor = new AesCryptor();
buffer = cryptor.Encrypt(chunkedFile.NextChunk(), symmetricPrivateKey, iv);
The code below is what the receiver of the file uses (not all of it, just the part that pertains to decrypting the data). The data is being written to a file stream (writer).
FileMessage message = hostChannel.ReceiveFile();
moreChunks = message.FileMetaData.MoreChunks;
UpdateTotalBytesTransferred(message);
writer.BaseStream.Position = filePosition;
byte[] decryptedStream;
// Copy the message stream out to a memory stream so we can work on it afterwards.
using (var memoryStream = new MemoryStream())
{
message.ChunkData.CopyTo(memoryStream);
decryptedStream = cryptor.Decrypt(memoryStream.ToArray(), symmetricPrivateKey, message.FileMetaData.InitializationVector);
}
writer.Write(decryptedStream);
By the way, in case it is needed, NextChunk is a very simple method.
public byte[] NextChunk()
{
if (MoreChunks) // If there are more chunks, proceed with the next chunking operation; otherwise throw an exception.
{
byte[] buffer;
using (BinaryReader reader = new BinaryReader(File.OpenRead(FilePath)))
{
reader.BaseStream.Position = CurrentPosition;
buffer = reader.ReadBytes((int)MaximumChunkSize);
}
CurrentPosition += buffer.LongLength; // Sets the stream position to be used for the next call.
return buffer;
}
else
throw new InvalidOperationException("The last chunk of the file has already been returned.");
}
EDIT: It seems that for every chunk transferred, and thus every encryption, I am gaining 16 bytes in file size. This does not happen with extremely small file sizes.
Well I solved the issue.
It turns out I was sending, in the message data, the chunkLength of the encrypted chunk data. So for every chunk I sent, even though I decrypted and wrote the correct file data, I was advancing the stream position by the length of the encrypted data. This means that every time I decrypted a chunk, when transferring more than one chunk (this is why small files of only one chunk didn't have problems), I was adding 16 bytes to the file size.
People helping me probably wouldn't have been able to figure this out, because I didn't include all of the data in the client side or the server side to see this. But thankfully I managed to answer it myself.
On the sender side, I was creating my FileMessage like this.
FileMessage message = new FileMessage();
message.FileMetaData = new FileMetaData(chunkedFile.MoreChunks, chunkedFile.ChunkLength, chunkedFile.CurrentPosition, iv);
message.ChunkData = new MemoryStream(buffer);
If you see the second parameter of FileMetaData constructor, I'm passing in chunkedFile.ChunkLength which is supposed to be the length of the chunk. I was doing this on the encrypted chunk data, which resulted in sending the incorrect chunk length.
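One way to avoid the mismatch (a sketch only; the local variable names here are new) is to capture the plaintext chunk and its length before encrypting, and put that length in the metadata while still sending the encrypted bytes:
byte[] plainChunk = chunkedFile.NextChunk();
long plainChunkLength = plainChunk.LongLength;   // length of the unencrypted data
buffer = cryptor.Encrypt(plainChunk, symmetricPrivateKey, iv);
FileMessage message = new FileMessage();
message.FileMetaData = new FileMetaData(chunkedFile.MoreChunks, plainChunkLength, chunkedFile.CurrentPosition, iv);
message.ChunkData = new MemoryStream(buffer);    // the encrypted bytes still go over the wire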
The client, on the other hand, was receiving this extra information. If you look near the end, you'll see the line filePosition += message.FileMetaData.ChunkLength;. I was using that erroneous chunkLength to advance the file position. It turns out that setting the stream position was not even necessary.
using (BinaryWriter writer = new BinaryWriter(File.OpenWrite(fileWritePath)))
{
writer.BaseStream.SetLength(0);
while (moreChunks)
{
FileMessage message = hostChannel.ReceiveFile();
moreChunks = message.FileMetaData.MoreChunks;
UpdateTotalBytesTransferred(message);
writer.BaseStream.Position = filePosition;
byte[] decryptedStream;
// Copy the message stream out to a memory stream so we can work on it afterwards.
using (var memoryStream = new MemoryStream())
{
message.ChunkData.CopyTo(memoryStream);
Debug.WriteLine("Received Encrypted buffer Length: " + memoryStream.Length);
decryptedStream = cryptor.Decrypt(memoryStream.ToArray(), symmetricPrivateKey, message.FileMetaData.InitializationVector);
Debug.WriteLine("Received Decrypted buffer Length: " + decryptedStream.Length);
}
writer.Write(decryptedStream);
TotalBytesTransferred = message.FileMetaData.FilePosition;
filePosition += message.FileMetaData.ChunkLength;
}
OnTransferComplete(this, EventArgs.Empty);
StopSession();
}
Such a simple bug, but one that wasn't leaping out at me quickly at all.

Security with RijndaelManaged and ServicePointManager

I have a security question about RijndaelManaged and
ServicePointManager.
I have implemented a system where a C# application encrypts data, such as user credentials and some XML data. Then I use WebClient to send the encrypted user credentials, along with an encrypted XML document containing instructions, to my Tomcat Java web application. The job of the Java application is to decrypt the user credentials and XML instructions, perform the instructions, and respond back to C# with an encrypted XML result.
All connections from my C# application to Tomcat server are with SSL enabled (Self signed certificate for now).
First Question: Given that my C# application by default always connects to my server (only) with SSL enabled, can I simply implement the callback function as:
ServicePointManager.ServerCertificateValidationCallback = delegate { return true; };
As I understand it, the callback function is used to identify and validate the certificate used by the server I'm connecting to. If I were to give that application to, say, one of my clients to connect to my server (with SSL enabled), is the code above satisfactory? If a client uses my application to connect to another server that is not known, and I have no idea about its SSL certificate status, the code above should be replaced with an actual certificate validation function. Does my question make sense?
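For example, I assume such a validation function could simply pin the one self-signed certificate I deployed myself, something like the sketch below (the thumbprint string is a placeholder):
ServicePointManager.ServerCertificateValidationCallback =
    (sender, certificate, chain, sslPolicyErrors) =>
    {
        // Accept only our own known self-signed certificate.
        const string expectedThumbprint = "PUT-YOUR-CERTIFICATE-THUMBPRINT-HERE";
        if (certificate == null)
            return false;
        var cert2 = new X509Certificate2(certificate);
        return string.Equals(cert2.Thumbprint, expectedThumbprint, StringComparison.OrdinalIgnoreCase);
    };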
Second Question: I have encryption/decryption implemented using RijndaelManaged in my C# application, but the key I'm using is embedded in the application (which is obfuscated). As I understand it, this is not secure.
Is there a reliable way for the C# application to receive the encryption/decryption key from my web application? Or is there a way for the key to be generated in the C# application that the web application can then use to decrypt the data? If so, how do I generate that key and, most importantly, how do I send it to the server in a reliably secure way? Since the connection is SSL, can the key simply be part of the encrypted stream?
Here is code that I’m using for encryption in my C# app.
private const string KEY = "samplekey";
private const int KEY_SIZE = 128;
private const int KEY_BITS = 16;
private string Encrypt(string textToEncrypt)
{
RijndaelManaged rijndaelCipher = new RijndaelManaged();
rijndaelCipher.Mode = CipherMode.CBC;
rijndaelCipher.Padding = PaddingMode.PKCS7;
rijndaelCipher.KeySize = KEY_SIZE;
rijndaelCipher.BlockSize = KEY_SIZE;
byte[] pwdBytes = Encoding.UTF8.GetBytes(KEY);
byte[] keyBytes = new byte[KEY_BITS];
int len = pwdBytes.Length;
if (len > keyBytes.Length)
{
len = keyBytes.Length;
}
Array.Copy(pwdBytes, 0, keyBytes, 0, len);
rijndaelCipher.Key = keyBytes;
rijndaelCipher.IV = keyBytes;
ICryptoTransform transform = rijndaelCipher.CreateEncryptor();
byte[] plainText = Encoding.UTF8.GetBytes(textToEncrypt);
return System.Convert.ToBase64String(transform.TransformFinalBlock(plainText, 0, plainText.Length));
}
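For the second question, I imagine generating the key randomly would look something like the sketch below (illustrative only; note the IV is generated separately per message rather than reusing the key), and since the connection is SSL the Base64-encoded key could then travel to the web application inside that protected stream:
RijndaelManaged rijndael = new RijndaelManaged();
rijndael.KeySize = KEY_SIZE;
rijndael.BlockSize = KEY_SIZE;
rijndael.GenerateKey();                                   // cryptographically random key
string keyForServer = Convert.ToBase64String(rijndael.Key); // send to the web app over SSL
rijndael.GenerateIV();                                    // fresh IV for every message
byte[] iv = rijndael.IV;                                  // send alongside the ciphertext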