Getting public key from DNSKEY RR public key field using Python - cryptography

I am trying to parse and validate DNSSEC responses without using any DNS-specific libraries. I am able to get the hex-string representation of an RSA key from the public key field value present in the DNSKEY RRs. According to RFC 8017, an RSA public key is represented with the ASN.1 type RSAPublicKey, which has a modulus and an exponent. However, it doesn't specify anything more.
The hex string (the same value shown in Wireshark) is
"03010001ac1f62b7f85d62c550211fd70ddbbca7326cde13dca235f26f76a5dd5872db601d775ecd189955ed96e749fd8e8e6af3e133e8a8eb8b8afc25730c6318f949de9436fde6ea280b5ccbc09a43ee357617905690fdc09cda06bc5ad3bcd1bc4e43de9a4769ff83453e96a74642b23daabae00398539cfca56b04c200776c4841724cb09674b519eb7e3506a3e08e4f96b5a733425a1c55eecb1613552c022b246b27141652d907cdbc6e30b5f3341a1ba5dfbb503edddbd01e85f1c4206642cfb312e14f2772fe8b66143ba847382e95fb86ba215342ae9cca803655bccadef1123e06f3cf1626840e11200b1acda118c50805c6eacfd271d930b93f2e332d9521"
I saw other similar posts and tried to follow the solutions. Most of the solutions get the key from a PEM file, binary data, or a base64-encoded form. When I try to convert the hex to those forms and use the solution, I get errors like 'RSA key format not supported'.
Is there any way I can get the public key from the hex? I would really appreciate any input! Thanks!

I finally managed to find the solution.
According to RFC 3110 section 2, we can split the given value into an exponent length, the exponent, and the modulus. I split them as specified and converted the hexadecimal to integers. The relevant text from the RFC is below:
The structure of the algorithm specific portion
of the RDATA part of such RRs is as shown below.

      Field             Size
      -----             ----
      exponent length   1 or 3 octets (see text)
      exponent          as specified by length field
      modulus           remaining space
For interoperability, the exponent and modulus are each limited to
4096 bits in length. The public key exponent is a variable length
unsigned integer. Its length in octets is represented as one octet
if it is in the range of 1 to 255 and by a zero octet followed by a
two octet unsigned length if it is longer than 255 bytes. The public
key modulus field is a multiprecision unsigned integer. The length
of the modulus can be determined from the RDLENGTH and the preceding
RDATA fields including the exponent.
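Following the RFC text above, the split can be done in a few lines of plain Python (a sketch using only the standard library; the parsed integers can then be handed to the cryptography package, as noted in the comment at the end):

```python
def parse_rsa_dnskey(data: bytes):
    """Split an RFC 3110 RSA public key field into (exponent, modulus)."""
    # The first octet is the exponent length if it is 1..255; a zero octet
    # means a two-octet length follows (exponents longer than 255 bytes).
    if data[0] != 0:
        exp_len, offset = data[0], 1
    else:
        exp_len, offset = int.from_bytes(data[1:3], "big"), 3
    exponent = int.from_bytes(data[offset:offset + exp_len], "big")
    modulus = int.from_bytes(data[offset + exp_len:], "big")
    return exponent, modulus

# The hex value from the question: 0x03 says the exponent occupies 3 bytes
# (0x010001 = 65537); everything after that is the modulus.
hex_key = (
    "03010001ac1f62b7f85d62c550211fd70ddbbca7326cde13dca235f26f76a5dd"
    "5872db601d775ecd189955ed96e749fd8e8e6af3e133e8a8eb8b8afc25730c63"
    "18f949de9436fde6ea280b5ccbc09a43ee357617905690fdc09cda06bc5ad3bc"
    "d1bc4e43de9a4769ff83453e96a74642b23daabae00398539cfca56b04c20077"
    "6c4841724cb09674b519eb7e3506a3e08e4f96b5a733425a1c55eecb1613552c"
    "022b246b27141652d907cdbc6e30b5f3341a1ba5dfbb503edddbd01e85f1c420"
    "6642cfb312e14f2772fe8b66143ba847382e95fb86ba215342ae9cca803655bc"
    "cadef1123e06f3cf1626840e11200b1acda118c50805c6eacfd271d930b93f2e"
    "332d9521"
)
e, n = parse_rsa_dnskey(bytes.fromhex(hex_key))

# With the cryptography package these integers become a usable key object:
#   from cryptography.hazmat.primitives.asymmetric import rsa
#   public_key = rsa.RSAPublicNumbers(e, n).public_key()
```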

Related

how to restore rsa public key from modulus and exponent in Excel VBA

Problem: as title
What I got from the web server to restore the public key:
modulus: 9a91586b02a923d79302c5be83f25861452b78e59bd1d383045addc9debad1db9675726276a10f90bf0d0ae4880dbe4a54c821fffdb2f1394faf9df56d87408bb97398dcb2319fab7f53ee59fdb58def6f55ca91dbd9f2af65a4a36779f5353ec212d4bf99ba9197108acb2337d31d3efae038018dcb29665510641f32ac99e8152f297e2056ea14d9bd62350797b2da8edc23574326f57e1563952006dbbb133e2b15d2a4dd6a55aa7debfb28ba610d7e637022957b063e605985c402be41dd8dc3c3852645034f0b29f4fad0f45419f03f1bbb71c0dc54c1069a081d3edb73fd93a204edd0a99459b38e6486d41171328c5f53913696f8f2b1718019db9b65
exponent: 10001
What I have tried:
First I base64-encoded the modulus and exponent, then dumped them into XML format; here's the result:
<RSAKeyValue><Modulus>OWE5MTU4NmIwMmE5MjNkNzkzMDJjNWJlODNmMjU4NjE0NTJiNzhlNTliZDFkMzgzMDQ1YWRk
YzlkZWJhZDFkYjk2NzU3MjYyNzZhMTBmOTBiZjBkMGFlNDg4MGRiZTRhNTRjODIxZmZmZGIy
ZjEzOTRmYWY5ZGY1NmQ4NzQwOGJiOTczOThkY2IyMzE5ZmFiN2Y1M2VlNTlmZGI1OGRlZjZm
NTVjYTkxZGJkOWYyYWY2NWE0YTM2Nzc5ZjUzNTNlYzIxMmQ0YmY5OWJhOTE5NzEwOGFjYjIz
MzdkMzFkM2VmYWUwMzgwMThkY2IyOTY2NTUxMDY0MWYzMmFjOTllODE1MmYyOTdlMjA1NmVh
MTRkOWJkNjIzNTA3OTdiMmRhOGVkYzIzNTc0MzI2ZjU3ZTE1NjM5NTIwMDZkYmJiMTMzZTJi
MTVkMmE0ZGQ2YTU1YWE3ZGViZmIyOGJhNjEwZDdlNjM3MDIyOTU3YjA2M2U2MDU5ODVjNDAy
YmU0MWRkOGRjM2MzODUyNjQ1MDM0ZjBiMjlmNGZhZDBmNDU0MTlmMDNmMWJiYjcxYzBkYzU0
YzEwNjlhMDgxZDNlZGI3M2ZkOTNhMjA0ZWRkMGE5OTQ1OWIzOGU2NDg2ZDQxMTcxMzI4YzVm
NTM5MTM2OTZmOGYyYjE3MTgwMTlkYjliNjU=</Modulus>
<Exponent>MTAwMDE=</Exponent>
</RSAKeyValue>
Then I invoked RSACryptoServiceProvider.FromXmlString(publickey)
and this error occurred:
Run-time error -2146893819(80090005)
Automation error
Bad data
Then I tried generating an XML-format public key with RSACryptoServiceProvider.ToXmlString(False)
and got
<RSAKeyValue><Modulus>ph0JbRrKHFY5sfmVa9cDPICAtYfT6OKF4KcjgBIKIuFRz3azyCCiE12qP0ZbuHqwb6YQxg6778NJK8S0Xvft6Fu9s0FCO7zUxVRaIw6gumOAV2ih/s+S9pFuxMf3k5w2v5iMA6TFjxS72kCa4O8iIXhOG4u05+o2fRC2cwEYVSk=</Modulus><Exponent>AQAB</Exponent></RSAKeyValue>
Amazingly, this key could be recognized by RSACryptoServiceProvider.FromXmlString.
I didn't see much difference between the former and the latter, so why does the error occur in the first case?
Is there any other way to restore the public key given the modulus and exponent?
You received a hex (base 16) version of the integer modulus and public exponent. You then Base64-encoded the hex text itself, i.e. the ASCII characters of the hex digits. However, the XML is supposed to contain the Base64 encoding of the integers as byte arrays in big-endian format. You can easily produce that from the hex by converting every two hex characters into one byte value. The result should look something like
<RSAKeyValue><Modulus>mpFYawKpI9eTAsW+g/JYYUUreOWb0dODBFrdyd660duWdXJidqEPkL8NCuSIDb5KVMgh//2y8TlPr531bYdAi7lzmNyyMZ+rf1PuWf21je9vVcqR29nyr2Wko2d59TU+whLUv5m6kZcQissjN9MdPvrgOAGNyylmVRBkHzKsmegVLyl+IFbqFNm9YjUHl7LajtwjV0Mm9X4VY5UgBtu7Ez4rFdKk3WpVqn3r+yi6YQ1+Y3AilXsGPmBZhcQCvkHdjcPDhSZFA08LKfT60PRUGfA/G7txwNxUwQaaCB0+23P9k6IE7dCplFmzjmSG1BFxMoxfU5E2lvjysXGAGdubZQ==</Modulus><Exponent>AQAB</Exponent></RSAKeyValue>
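To illustrate the fix (a Python sketch rather than VBA, since the conversion is language-independent): decode pairs of hex digits into bytes first, then Base64-encode those bytes. Note that the exponent 10001 has an odd number of hex digits and must be left-padded with a zero before decoding.

```python
import base64

def int_hex_to_b64(h: str) -> str:
    """Base64-encode the big-endian bytes of a hex-encoded integer."""
    if len(h) % 2:              # pad to a whole number of octets
        h = "0" + h
    return base64.b64encode(bytes.fromhex(h)).decode("ascii")

print(int_hex_to_b64("10001"))                    # the exponent: AQAB
# The first bytes of the modulus from the question encode to "mpFYawKp...",
# matching the corrected <Modulus> element above.
print(int_hex_to_b64("9a91586b02a923d793"))
```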

computing the exchange hash for ecdsa-sha2-nistp256

I am writing code for an SSH server and cannot get past the Elliptic Curve Diffie-Hellman Key Exchange Reply part of the connection. The client closes the connection and says "Host key does not match the signature supplied".
I am using PuTTY as the client, and a PIC microcontroller is running the server code.
From RFC 5656 [SSH ECC Algorithm Integration] :
"The hash H is formed by applying the algorithm HASH on a
concatenation of the following:
string V_C, client's identification string (CR and LF excluded)
string V_S, server's identification string (CR and LF excluded)
string I_C, payload of the client's SSH_MSG_KEXINIT
string I_S, payload of the server's SSH_MSG_KEXINIT
string K_S, server's public host key
string Q_C, client's ephemeral public key octet string
string Q_S, server's ephemeral public key octet string
mpint K, shared secret
"
The host key algorithm and key exchange algorithm are ecdsa-sha2-nistp256 and ecdh-sha2-nistp256 respectively.
Referring to RFC 4251 for the data type representations, as well as the source code of OpenSSH (OpenBSD), this is what I have concatenated:
4 bytes for the length of V_C, followed by V_C
4 bytes for the length of V_S, followed by V_S
4 bytes for the length of I_C, followed by I_C (the payload runs from the Message Code to the start of the Random Padding)
4 bytes for the length of I_S, followed by I_S (the payload runs from the Message Code to the start of the Random Padding)
4 bytes for the length of K_S, followed by K_S (for K_S I used the same group of bytes that is used to calculate the fingerprint)
4 bytes for the length of Q_C, followed by Q_C (I used the uncompressed string, which has a length of 65: 04 || X-coordinate || Y-coordinate)
4 bytes for the length of Q_S, followed by Q_S
4 bytes for the length of K, followed by K (the length is 32 or 33 depending on whether the leading bit is set; if it is set, K is preceded by a 00 byte)
Once everything is concatenated, I hash it with SHA-256 because I'm using nistp256. SHA-256 outputs 32 bytes, which is the size of the curve, so I take the whole SHA-256 output and perform the signature algorithm on it.
I can never get the correct signature from my message concatenation.
I know my signature algorithm is correct because given the message hash output I can get the correct signature.
I know my shared secret is correct because I get the same output as online shared secret calculators.
I know the SHA256 is correct because I get the same result using online calculators.
This leads me to assume the error is in the concatenation of the exchange hash.
Any help is greatly appreciated, thanks.
ECDSA signature generation is non-deterministic: part of the input is the hash, and part of it consists of random bytes. So whatever you do, you will always get a different signature. That is fine, because signature verification will still work.
The only way to get a repeatable signature is to mess with the random number generator, and then only during testing: if you ever sign two different values using the same random number, you expose the private key!
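For reference, the RFC 4251 string and mpint encodings used in the concatenation described in the question can be sketched in Python. This is a sanity-check sketch, not the poster's actual code; the argument values in exchange_hash are whatever your key exchange produced.

```python
import hashlib

def ssh_string(b: bytes) -> bytes:
    # RFC 4251 "string": 4-byte big-endian length prefix, then the bytes
    return len(b).to_bytes(4, "big") + b

def ssh_mpint(n: int) -> bytes:
    # RFC 4251 "mpint": minimal big-endian encoding; prepend a 0x00 byte
    # if the high bit of the first byte is set (keeps the value positive)
    if n == 0:
        return ssh_string(b"")
    b = n.to_bytes((n.bit_length() + 7) // 8, "big")
    if b[0] & 0x80:
        b = b"\x00" + b
    return ssh_string(b)

def exchange_hash(V_C, V_S, I_C, I_S, K_S, Q_C, Q_S, K):
    # H = SHA-256 over the concatenation listed in RFC 5656
    blob = (ssh_string(V_C) + ssh_string(V_S) + ssh_string(I_C) +
            ssh_string(I_S) + ssh_string(K_S) + ssh_string(Q_C) +
            ssh_string(Q_S) + ssh_mpint(K))
    return hashlib.sha256(blob).digest()
```

Note how ssh_mpint reproduces the "32 or 33 bytes" behaviour from the question: a shared secret whose leading bit is set gets a 00 byte prepended before the length prefix is computed.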

Encryption and decryption with same length of characters in sql server

I want to encrypt a string and get an encrypted string of the same character length, and decrypt it back to a string of the same length, using SQL Server. For example:
Encryption
Input: Encrypt("002581") -- 6 characters
Result: a&pE12 -- encrypted output with the same 6 characters
Decryption
Input: Decrypt("a&pE12") -- 6 characters
Result: 002581 -- decrypted output with the same 6 characters
Short answer: there is no such secure encryption scheme.
Longer answer: any encryption scheme obfuscates the content of a plaintext so that it is indistinguishable from the other messages in the same message space. To achieve this, all ciphertexts produced should ideally be of the same length regardless of the input plaintext; at the very least, the ciphertext length should differ from the plaintext length, since otherwise it leaks the length of the message.
So please don't even consider such an encryption technique. It's insecure by definition.

Is AES encrypted + base64 encoded value still unique?

I encrypt a UTF-8 string plus the current timestamp using AES-128 in CTR mode, with a 4-byte random initialization vector generated by Node.js's crypto.randomBytes().
Finally, I base64-encode the whole output using a URL-friendly Base64 variant.
Question: the AES output should be unique due to the timestamp + random data. But is the final Base64 string also guaranteed to be unique?
Thanks in advance!
Yes. Base64 is a reversible transformation, so if the input is unique, then the output will also be unique.
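A quick sanity check of that claim (a Python sketch using the URL-safe variant, analogous to the Node.js setup): since Base64 decoding recovers the original bytes exactly, two distinct ciphertexts can never encode to the same string.

```python
import base64
import os

samples = {os.urandom(16) for _ in range(1000)}        # stand-ins for ciphertexts
encoded = {base64.urlsafe_b64encode(s) for s in samples}

# Injective: as many distinct encodings as distinct inputs
assert len(encoded) == len(samples)

# Reversible: decoding returns the original bytes
assert all(base64.urlsafe_b64decode(base64.urlsafe_b64encode(s)) == s
           for s in samples)
```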

Hexadecimal numbers vs. hexadecimal encoding (with base64 as well)

Encoding with hexadecimal numbers seems to be different from using hexadecimals to represent numbers. For example, the hex number 0x40 should, to my mind, be equal to 64, or BA_{64}, but when I put it through this hex-to-base64 converter, I get the output QA==, which to me is equal to some number times 64. Why is this?
Also, when I check the integer value of the hex string deadbeef I get 3735928559, but when I check it in other places I get 222 173 190 239. Why is this?
Addendum: So I guess it is because it is easier to break the number into bit chunks than to treat it as a whole number when encoding? That is pretty confusing to me, but I guess I get it.
You may wish to read this:
http://en.wikipedia.org/wiki/Base64
In summary, Base64 specifies a particular encoding, which assigns letters different values than their ASCII codes.
For the second part, one source is treating the entire string as a 32-bit integer, and the other is dividing it into bytes and giving the value of each byte.
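The two readings can be shown side by side in Python: one interprets the hex digits as a single integer, the other as a byte sequence, and Base64 always operates on the bytes.

```python
import base64

# Read "deadbeef" as one big number...
assert int("deadbeef", 16) == 3735928559
# ...or as four separate bytes
assert list(bytes.fromhex("deadbeef")) == [222, 173, 190, 239]

# Base64 encodes bytes, not integer values: 0x40 is the single byte
# 0b01000000, which regroups into the 6-bit digits 010000 00(0000),
# i.e. "Q" and "A", plus "==" padding.
assert base64.b64encode(bytes.fromhex("40")) == b"QA=="
```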