What's the difference between C# SHA256Managed and cryptopp::SHA256 - cryptography

I'm trying to replace the MS SHA256Managed function with cryptopp::SHA256.
Here's the C# code:
internal byte[] GenerateKey(byte[] keySeed, Guid keyId)
{
byte[] truncatedKeySeed = new byte[30];
Array.Copy(keySeed, truncatedKeySeed, truncatedKeySeed.Length);
Console.WriteLine("Key Seed");
foreach (byte b in truncatedKeySeed)
{
Console.Write("0x" + Convert.ToString(b, 16) + ",");
}
Console.WriteLine();
//
// Get the keyId as a byte array
//
byte[] keyIdAsBytes = keyId.ToByteArray();
SHA256Managed sha_A = new SHA256Managed();
sha_A.TransformBlock(truncatedKeySeed, 0, truncatedKeySeed.Length, truncatedKeySeed, 0);
sha_A.TransformFinalBlock(keyIdAsBytes, 0, keyIdAsBytes.Length);
byte[] sha_A_Output = sha_A.Hash;
Console.WriteLine("sha_a:" + sha_A_Output.Length);
foreach (byte b in sha_A_Output)
{
Console.Write("0x" + Convert.ToString(b, 16) + ",");
}
Console.WriteLine();
.....
}
The output result:
Key Seed
0x5d,0x50,0x68,0xbe,0xc9,0xb3,0x84,0xff,0x60,0x44,0x86,0x71,0x59,0xf1,0x6d,0x6b,0x75,0x55,0x44,0xfc,0xd5,0x11,0x69,0x89,0xb1,0xac,0xc4,0x27,0x8e,0x88
Key ID
0x39,0x68,0xe1,0xb6,0xbd,0xee,0xf6,0x4f,0xab,0x76,0x8d,0x48,0x2d,0x8d,0x2b,0x6a,
sha_a:32
0x7b,0xec,0x8f,0x1b,0x60,0x4e,0xb4,0xab,0x3b,0xb,0xbd,0xb8,0x71,0xd6,0xba,0x71,0xb1,0x26,0x41,0x7d,0x99,0x55,0xdc,0x8e,0x64,0x76,0x15,0x23,0x1b,0xab,0x76,0x62,
The replacement function using Crypto++ is as follows:
byte key_seed[] = { 0x5D, 0x50, 0x68, 0xBE, 0xC9, 0xB3, 0x84, 0xFF, 0x60, 0x44, 0x86, 0x71, 0x59, 0xF1, 0x6D, 0x6B, 0x75, 0x55, 0x44, 0xFC,0xD5, 0x11, 0x69, 0x89, 0xB1, 0xAC, 0xC4, 0x27, 0x8E, 0x88 };
byte key_id[] = { 0x39,0x68,0xe1,0xb6,0xbd,0xee,0xf6,0x4f,0xab,0x76,0x8d,0x48,0x2d,0x8d,0x2b,0x6a };
byte truncated_key_seed[sizeof(key_seed)];
memset( truncated_key_seed,0,sizeof(truncated_key_seed));
memcpy( key_seed, truncated_key_seed, sizeof(key_seed) );
byte output[SHA256::DIGESTSIZE];
memset(output,0,sizeof(output));
SHA256 sha_a;
sha_a.Update(truncated_key_seed,sizeof(key_seed));
sha_a.Update(key_id,sizeof(key_id));
sha_a.Final(output);
printf("size:%lu\n",sizeof(output));
PrintHex(output,sizeof(output));
But the output hash value is
DB 36 C9 F6 F7 29 6D 6F 52 21 DA 9F 55 1D AE BC 3E 5A 15 DF E1 37 07 EE 8F BC 73 61 5F D6 E1 C3
It's different from the sha_a result in C#.
According to MSDN and the Crypto++ reference, SHA256Managed::TransformBlock and SHA256Managed::TransformFinalBlock do the same thing as Crypto++'s Update and Final.
What difference between SHA256Managed and cryptopp::SHA256 causes this result?

Seems like a bug in your code to me.
sha_a.Update(truncated_key_seed,sizeof(key_seed));
Make sure that truncated_key_seed is identical in both versions, especially the bytes not included in the original key_seed... Note that memcpy takes the destination first, so memcpy(key_seed, truncated_key_seed, sizeof(key_seed)) copies the zeroed truncated_key_seed over key_seed instead of the other way around, and you end up hashing 30 zero bytes.
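The streaming/one-shot equivalence can be checked independently of either library. The sketch below uses Python's hashlib as a neutral stand-in (it is not the poster's code): TransformBlock/TransformFinalBlock in C# and Update/Final in Crypto++ both just feed bytes into the same SHA-256 stream, so identical input bytes must produce identical digests. If the digests differ, the input preparation differs, e.g. hashing zeroes instead of the key seed.

```python
import hashlib

# Key seed (30 bytes) and key ID (16 bytes) from the question.
key_seed = bytes([0x5D, 0x50, 0x68, 0xBE, 0xC9, 0xB3, 0x84, 0xFF, 0x60, 0x44,
                  0x86, 0x71, 0x59, 0xF1, 0x6D, 0x6B, 0x75, 0x55, 0x44, 0xFC,
                  0xD5, 0x11, 0x69, 0x89, 0xB1, 0xAC, 0xC4, 0x27, 0x8E, 0x88])
key_id = bytes([0x39, 0x68, 0xE1, 0xB6, 0xBD, 0xEE, 0xF6, 0x4F,
                0xAB, 0x76, 0x8D, 0x48, 0x2D, 0x8D, 0x2B, 0x6A])

# Streaming, like TransformBlock/TransformFinalBlock or Update/Final:
h = hashlib.sha256()
h.update(key_seed)
h.update(key_id)
streamed = h.digest()

# One-shot over the concatenation -- must be identical to the streamed digest:
oneshot = hashlib.sha256(key_seed + key_id).digest()
assert streamed == oneshot

# Hashing 30 zero bytes instead of the seed (what a zeroed buffer causes)
# gives a completely different digest:
wrong = hashlib.sha256(bytes(30) + key_id).digest()
print(streamed.hex())
print(wrong.hex())
```

If the C++ version still disagrees with the C# version, dump the exact bytes fed into Update on both sides and compare them first.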

Related

Mifare DESFire EV1 4K AES authentication issue

Please, can someone tell me what I'm doing wrong? I have to AES-authenticate my card. The card is a Mifare DESFire EV1 4K and the reader is an Omnikey 5121. I followed some examples here on Stack Overflow, but I always fail at the last step, where the card's rotated RndA is not equal to my rotated RndA. Is something wrong with the AES configuration?
public static byte[] Authenticate_AES(this SCardReader reader, byte[] key)
{
using (var aes = Aes.Create())
{
aes.Key = key;
aes.Padding = PaddingMode.None;
aes.Mode = CipherMode.CBC;
aes.BlockSize = 128;
aes.IV = SCardUtils.StringToByteArray("00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00");
var decryptor = aes.CreateDecryptor();
var encryptor = aes.CreateEncryptor();
var rnd = new Random();
//Get encrypted RandB from the card
var rAPDU = reader.Transmit(0x90, 0xAA, 0, 0, SCardUtils.StringToByteArray("00"));
if (!rAPDU.HasData)
throw new Exception("RandB_enc is null");
//Encrypted RndB
var RndB_enc = new byte[16];
rAPDU.Data.CopyTo(RndB_enc, 0);
SCardUtils.ShowBytes(RndB_enc, "RndB_enc");
//Decrypt encrypted RndB
var RndB = decryptor.TransformFinalBlock(RndB_enc, 0, RndB_enc.Length);
SCardUtils.ShowBytes(RndB, "RndB");
//Rotate RndB 1 byte to the left
var RndB_rot = SCardUtils.RotateLeft(RndB);
SCardUtils.ShowBytes(RndB_rot, "RndB_rot");
//Generate random RndA
var RndA = new byte[16];
rnd.NextBytes(RndA);
SCardUtils.ShowBytes(RndA, "RndA");
//Concatenate RndA and RndB_rot
var RndAB_rot = RndA.Concat(RndB_rot).ToArray();
SCardUtils.ShowBytes(RndAB_rot, "RndAB_rot");
//Encypt RndAB_rot
var dataToSend = encryptor.TransformFinalBlock(RndAB_rot, 0, RndAB_rot.Length);
SCardUtils.ShowBytes(dataToSend, "Encrypted data");
rAPDU = reader.Transmit(0x90, 0xAF, 0, 0, dataToSend);
if (!rAPDU.HasData)
throw new Exception("rAPDU data is null");
//Encrypted RndA_rot
var RndA_rot_enc = new byte[16];
rAPDU.Data.CopyTo(RndA_rot_enc, 0);
SCardUtils.ShowBytes(RndA_rot_enc, "RndA_rot_enc");
//Decrypt encrypted RndA_rot
var RndA_rot_dec = decryptor.TransformFinalBlock(RndA_rot_enc, 0, RndA_rot_enc.Length);
SCardUtils.ShowBytes(RndA_rot_dec, "RndA_rot_dec");
var RndA_rot = SCardUtils.RotateLeft(RndA);
SCardUtils.ShowBytes(RndA_rot, "RndA_rot");
//Compare RndA_rot with RndA rotated to the left
if (!SCardUtils.IsEqualTo(RndA_rot_dec, RndA_rot))
throw new Exception($"Error authenticating card. The values are not equal.");
var sessionKey = new byte[16];
Array.Copy(RndA, 0, sessionKey, 0, 4);
Array.Copy(RndB, 0, sessionKey, 4, 4);
Array.Copy(RndA, 12, sessionKey, 8, 4);
Array.Copy(RndB, 12, sessionKey, 12, 4);
return sessionKey;
}
}
Here is the output:
Reader name: OMNIKEY CardMan 5x21-CL 0
RndB_enc: 08 DD A2 12 57 43 6C F7 75 98 78 9E 6C 0A A7 06
RndB: 16 C8 35 7A 4A 36 29 D2 F0 86 26 AD FA CA 81 9F
RndB_rot: C8 35 7A 4A 36 29 D2 F0 86 26 AD FA CA 81 9F 16
RndA: 77 93 A5 8D 0E 0D 88 88 22 C3 40 9C 26 67 95 35
RndAB_rot: 77 93 A5 8D 0E 0D 88 88 22 C3 40 9C 26 67 95 35 C8 35 7A 4A 36 29 D2 F0 86 26 AD FA CA 81 9F 16
Data: 4D BD 7A E8 B8 6C 00 5F E4 B5 B5 42 7B AE 51 39 25 77 CB 60 83 6A E8 15 B9 9D FD A9 FD A7 75 9F
RndA_rot_enc: D6 CB CF 08 5F 8A E8 6C 30 95 34 6F DD CF 4F FA
RndA_rot_dec: 6B 70 54 39 CD 8E 97 42 E2 A5 FF E3 90 95 46 E0
RndA_rot: 93 A5 8D 0E 0D 88 88 22 C3 40 9C 26 67 95 35 77
So you can see RndA_rot_dec and RndA_rot are not equal, and I can't figure out why.
Thank you all in advance.
I finally got the solution. I also modified the code a bit so it is easier to understand for everyone who runs into the problem I struggled with for several weeks.
public static byte[] Authenticate_AES(this SCardReader reader, byte[] key, byte[] IV)
{
//Get encrypted RndB from the tag
var rAPDU = reader.Transmit(0x90, 0xAA, 0, 0, SCardUtils.StringToByteArray("00"));
if (!rAPDU.HasData)
throw new Exception("RandB_enc is null");
var aes = Aes.Create();
aes.Mode = CipherMode.CBC;
aes.KeySize = 128;
aes.BlockSize = 128;
aes.Padding = PaddingMode.None;
aes.Key = key;
aes.IV = IV; //16 bytes of zeros
//Encrypted RndB from the tag
var RndB_enc = rAPDU.Data.ToArray();
SCardUtils.ShowBytes(RndB_enc, "RndB_enc");
var decryptor = aes.CreateDecryptor();
//Decrypt encrypted RndB
var RndB = decryptor.TransformFinalBlock(RndB_enc, 0, RndB_enc.Length);
SCardUtils.ShowBytes(RndB, "RndB");
//Rotate RndB 1 byte to the left
var RndB_rot = SCardUtils.RotateLeft(RndB);
SCardUtils.ShowBytes(RndB_rot, "RndB_rot");
var rnd = new Random();
//Generate random RndA
var RndA = new byte[16];
rnd.NextBytes(RndA);
SCardUtils.ShowBytes(RndA, "RndA");
//Concatenate RndA and RndB_rot
var RndAB_rot = RndA.Concat(RndB_rot).ToArray();
SCardUtils.ShowBytes(RndAB_rot, "RndAB_rot");
//IV is now encrypted RndB received from the tag
aes.IV = RndB_enc;
var encryptor = aes.CreateEncryptor();
//Encypt RndAB_rot
var RndAB_rot_enc = encryptor.TransformFinalBlock(RndAB_rot, 0, RndAB_rot.Length);
SCardUtils.ShowBytes(RndAB_rot_enc, "RndAB_rot_enc");
rAPDU = reader.Transmit(0x90, 0xAF, 0, 0, RndAB_rot_enc);
if (!rAPDU.HasData)
throw new Exception("rAPDU data is null");
//Encrypted RndA_rot from the tag
var RndA_rot_enc = rAPDU.Data.ToArray();
SCardUtils.ShowBytes(RndA_rot_enc, "RndA_rot_enc");
//IV is now the last 16 bytes of RndAB_rot_enc
aes.IV = RndAB_rot_enc.Skip(16).Take(16).ToArray();
decryptor = aes.CreateDecryptor();
//Decrypt encrypted RndA_rot
var RndA_rot = decryptor.TransformFinalBlock(rAPDU.Data, 0, rAPDU.Data.Length);
SCardUtils.ShowBytes(RndA_rot, "RndA_rot");
//Compare RndA_rot_dec with RndA_rot
if (!SCardUtils.IsEqualTo(RndA_rot, SCardUtils.RotateLeft(RndA)))
throw new Exception($"Error authenticating card. The values are not equal.");
var sessionKey = new byte[16];
Array.Copy(RndA, 0, sessionKey, 0, 4);
Array.Copy(RndB, 0, sessionKey, 4, 4);
Array.Copy(RndA, 12, sessionKey, 8, 4);
Array.Copy(RndB, 12, sessionKey, 12, 4);
aes.Clear();
return sessionKey;
}
The problem was with the initialization vector (IV): it changes throughout the three-pass authentication.
Here is the final output:
Reader name: OMNIKEY CardMan 5x21-CL 0
RndB_enc: 6F 40 6D 9D 51 7A 2C 9E 88 C9 2C 84 80 94 E3 F7
RndB: FB FB A7 5F 54 97 D7 CA 4B 15 07 F1 A0 D1 A2 68
RndB_rot: FB A7 5F 54 97 D7 CA 4B 15 07 F1 A0 D1 A2 68 FB
RndA: 87 7B BA 48 4D 01 14 CE 4D 8E 33 A5 1B 0F 00 E9
RndAB_rot: 87 7B BA 48 4D 01 14 CE 4D 8E 33 A5 1B 0F 00 E9 FB A7 5F 54 97 D7 CA 4B 15 07 F1 A0 D1 A2 68 FB
RndAB_rot_enc: 0B 8B 4E D9 BF 40 51 F0 83 FC 44 E8 B7 A7 21 26 DE A9 B9 CE E5 05 F7 A5 46 FE 91 0F 59 2B 90 E7
RndA_rot_enc: 9F 3D E8 90 37 2E 7B F6 1E BA AC 29 6E 94 1C 9E
RndA_rot: 7B BA 48 4D 01 14 CE 4D 8E 33 A5 1B 0F 00 E9 87
As you can see, RndA (line 5 of the output) rotated one byte to the left is equal to RndA_rot (the last line). With this, authentication is complete.
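The rotate and session-key layout used above can be sketched without any card or AES library. Python stands in here as a neutral sketch; rotate_left and session_key mirror the SCardUtils.RotateLeft call and the four Array.Copy calls from the answer, they are not an official API:

```python
def rotate_left(data: bytes) -> bytes:
    """Rotate a byte string one byte to the left (first byte moves to the end)."""
    return data[1:] + data[:1]

def session_key(rnd_a: bytes, rnd_b: bytes) -> bytes:
    """Session key layout from the answer: RndA[0:4] + RndB[0:4] + RndA[12:16] + RndB[12:16]."""
    return rnd_a[:4] + rnd_b[:4] + rnd_a[12:16] + rnd_b[12:16]

# Values taken from the final output above:
rnd_a = bytes.fromhex("877BBA484D0114CE4D8E33A51B0F00E9")
rnd_b = bytes.fromhex("FBFBA75F5497D7CA4B1507F1A0D1A268")

# Rotations match the RndB_rot and RndA_rot lines of the output:
assert rotate_left(rnd_b).hex().upper() == "FBA75F5497D7CA4B1507F1A0D1A268FB"
assert rotate_left(rnd_a).hex().upper() == "7BBA484D0114CE4D8E33A51B0F00E987"
print(session_key(rnd_a, rnd_b).hex())
```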

Compute a signature with private key in Server Key Exchange

On https://tls.ulfheim.net/ there is an example showing how to compute the signature in the "Server Key Exchange" section.
https://i.ibb.co/Y7fbkDw/1.jpg (This image shows the Server Key Exchange section on the website that I refer to. Could not embed the image in this post.)
Whatever I try, I don't get the same output as the one on that website, and I don't understand why.
I've tried storing the same data in two different ways, then using the same openssl command that they use in their example. Neither method gave the same output.
Method 1.
char hex[] = { 0x00, 0x01, 0x02, 0x03, 0x04, 0x05, 0x06, 0x07, 0x08, 0x09, 0x0a, 0x0b, 0x0c, 0x0d, 0x0e, 0x0f, 0x10, 0x11, 0x12, 0x13, 0x14, 0x15, 0x16, 0x17, 0x18, 0x19, 0x1a, 0x1b, 0x1c, 0x1d, 0x1e, 0x1f, 0x70, 0x71, 0x72, 0x73, 0x74, 0x75, 0x76, 0x77, 0x78, 0x79, 0x7a, 0x7b, 0x7c, 0x7d, 0x7e, 0x7f, 0x80, 0x81, 0x82, 0x83, 0x84, 0x85, 0x86, 0x87, 0x88, 0x89, 0x8a, 0x8b, 0x8c, 0x8d, 0x8e, 0x8f, 0x20, 0x9f, 0xd7, 0xad, 0x6d, 0xcf, 0xf4, 0x29, 0x8d, 0xd3, 0xf9, 0x6d, 0x5b, 0x1b, 0x2a, 0xf9, 0x10, 0xa0, 0x53, 0x5b, 0x14, 0x88, 0xd7, 0xf8, 0xfa, 0xbb, 0x34, 0x9a, 0x98, 0x28, 0x80, 0xb6, 0x15 };
ofstream myfile("c:/hex1.txt", ios::binary);
myfile.write(hex, sizeof hex);
then:
openssl dgst -hex -sign server.key -sha256 hex1.txt
Method 2.
I had this data stored in hex2.txt (as ASCII):
\x00\x01\x02\x03\x04\x05\x06\x07\x08\x09\x0a\x0b\x0c\x0d\x0e\x0f\x10\x11\x12\x13\x14\x15\x16\x17\x18\x19\x1a\x1b\x1c\x1d\x1e\x1f\x70\x71\x72\x73\x74\x75\x76\x77\x78\x79\x7a\x7b\x7c\x7d\x7e\x7f\x80\x81\x82\x83\x84\x85\x86\x87\x88\x89\x8a\x8b\x8c\x8d\x8e\x8f\x20\x9f\xd7\xad\x6d\xcf\xf4\x29\x8d\xd3\xf9\x6d\x5b\x1b\x2a\xf9\x10\xa0\x53\x5b\x14\x88\xd7\xf8\xfa\xbb\x34\x9a\x98\x28\x80\xb6\x15
then:
openssl dgst -hex -sign server.key -sha256 hex2.txt
Method 1
You left out curve_info. The applicable RFC 4492 section 5.4, updated for TLS 1.2 by RFC 5246 appendix A.7, actually defines the signature to be computed over, in effect, client_random + server_random + SKX_params, where SKX_params is of type ServerECDHParams and consists of ECParameters curve_params and ECPoint public -- these are what ulfheim labels Curve Info and Public Key.
With the correct data I get the correct result:
$ od -tx1 70148855.bin
0000000 00 01 02 03 04 05 06 07 08 09 0a 0b 0c 0d 0e 0f
0000020 10 11 12 13 14 15 16 17 18 19 1a 1b 1c 1d 1e 1f
0000040 70 71 72 73 74 75 76 77 78 79 7a 7b 7c 7d 7e 7f
0000060 80 81 82 83 84 85 86 87 88 89 8a 8b 8c 8d 8e 8f
0000100 03 00 1d 20 9f d7 ad 6d cf f4 29 8d d3 f9 6d 5b
0000120 1b 2a f9 10 a0 53 5b 14 88 d7 f8 fa bb 34 9a 98
0000140 28 80 b6 15
$ openssl sha256 <70148855.bin -sign $privkey -hex
(stdin)= 0402b661f7c191ee59be45376639bdc3d4bb81e115ca73c8348b525b0d2338aa144667ed9431021412cd9b844cba29934aaacce873414ec11cb02e272d0ad81f767d33076721f13bf36020cf0b1fd0ecb078de1128beba0949ebece1a1f96e209dc36e4fffd36b673a7ddc1597ad4408e485c4adb2c873841249372523809e4312d0c7b3522ef983cac1e03935ff13a8e96ba681a62e40d3e70a7ff35866d3d9993f9e26a634c81b4e71380fcdd6f4e835f75a6409c7dc2c07410e6f87858c7b94c01c2e32f291769eacca71643b8b98a963df0a329bea4ed6397e8cd01a110ab361ac5bad1ccd840a6c8a6eaa001a9d7d87dc3318643571226c4dd2c2ac41fb
Added: BTW this only works because the signature scheme used for RSA in TLS1.2 and below, namely the scheme that was 'block type 1' in PKCS1v1 and now is RSASSA-PKCS1-v1_5 in PKCS1v2, is deterministic. Most digital signature schemes are not, including the RSA-PSS scheme used in TLS1.3, and you cannot check or test a signature by comparing it to another signature. You can only use the verification method provided by the scheme.
Method 2
is completely incorrect. \x00 etc. is a notation used in source code for C and C++ (as you did in Method 1) and a few other languages like Java, JS/ES, and Python, as well as certain tools like the printf command in Unix and the awk program. But it does not work elsewhere, and in particular it does not work in files read by OpenSSL (at least for data; it might work in the config file, I'd have to check). Your Method 2 (hashes and) signs the bytes 0x5c 0x78 0x30 0x30 0x5c 0x78 0x30 0x31, which represent the characters \ x 0 0 \ x 0 1 etc., not the bytes 0x00 0x01 etc., and unsurprisingly this is completely different and wrong.
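The difference can be made concrete with a neutral Python sketch (not the poster's setup): writing the literal text \x00\x01 into a file stores the characters backslash, x, 0, 0, and so on, not the bytes 0x00 0x01, and the SHA-256 digests differ accordingly.

```python
import hashlib

real_bytes = bytes([0x00, 0x01, 0x02, 0x03])
# What Method 2's hex2.txt actually contains: the escape notation as ASCII text.
literal_text = r"\x00\x01\x02\x03".encode("ascii")

# The text file starts with backslash (0x5c), 'x' (0x78), '0' (0x30), '0' (0x30):
assert literal_text[:4] == b"\x5c\x78\x30\x30"
# Hashing the text is nothing like hashing the intended bytes:
assert hashlib.sha256(real_bytes).digest() != hashlib.sha256(literal_text).digest()
```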

objdump showing wrong start and end address for functions

For testing purposes, I modified the PLT stub that is generated by the llvm linker, lld.
The stub before was:
0xff, 0x25, 0x00, 0x00, 0x00, 0x00, // jmpq *got(%rip)
0x68, 0x00, 0x00, 0x00, 0x00, // pushq <relocation index>
0xe9, 0x00, 0x00, 0x00, 0x00 // jmpq plt[0]
Linking a program with this (original) stub and inspecting it with objdump yields something like this:
00000000002012d0 <printf@plt>:
2012d0: ff 25 62 0d 00 00 jmpq *0xd62(%rip) # 202038 <__TMC_END__+0x28>
2012d6: 68 02 00 00 00 pushq $0x2
2012db: e9 c0 ff ff ff jmpq 2012a0 <_fini+0x10>
I modified the PLT stub by simply adding a NOP at the end:
0xff, 0x25, 0x00, 0x00, 0x00, 0x00, // jmpq *got(%rip)
0x68, 0x00, 0x00, 0x00, 0x00, // pushq <relocation index>
0xe9, 0x00, 0x00, 0x00, 0x00, // jmpq plt[0]
0x0f, 0x1f, 0x40, 0x00 // nop
I made sure to modify the PltEntrySize variable so that it reflects the change in size. Linking and running programs with this modification seems to work just fine.
However, when I try to inspect the disassembly of a linked program with objdump, I see something strange:
00000000002012d0 <printf@plt>:
2012d0: cc int3
2012d1: ff (bad)
2012d2: ff (bad)
2012d3: ff 0f decl (%rdi)
2012d5: 1f (bad)
2012d6: 40 00 ff add %dil,%dil
2012d9: 25 5a 0d 00 00 and $0xd5a,%eax
2012de: 68 02 00 00 00 pushq $0x2
2012e3: e9 b8 ff ff ff jmpq 2012a0 <_fini+0x10>
2012e8: 0f 1f 40 00 nopl 0x0(%rax)
The PLT stub's address is interpreted by objdump to be 0x2012d0, but the real printf@plt address is 0x2012d8! This is confirmed by readelf -s:
Symbol table '.dynsym' contains 7 entries:
Num: Value Size Type Bind Vis Ndx Name
...
6: 00000000002012d8 0 FUNC GLOBAL DEFAULT UND printf@GLIBC_2.2.5 (2)
Where does objdump get its information from? It could very well be that I forgot to modify something in the linker.
For testing purposes, I modified the PLT stub that is generated by the llvm linker, lld.
The size and layout of the PLT entry are set in stone by the ABI (see p. 79) and cannot be changed.
Linking and running programs with this modification seems to work just fine.
I doubt any non-trivial program will run correctly with your modification -- the dynamic loader assumes the ABI PLT layout, and should crash and burn when given a bogus PLT.
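The address discrepancy is consistent with objdump synthesizing the printf@plt label under the ABI's 16-byte-entry assumption while the modified linker actually emitted 20-byte entries. A sketch of the arithmetic, using the addresses from the listings above (the 16-byte reserved PLT header is an assumption):

```python
plt_start = 0x2012A0   # plt[0], the target of the final jmpq in each stub
header = 16            # size of the reserved first PLT entry (assumed)
index = 2              # relocation index pushed by the printf stub (pushq $0x2)

# Label objdump synthesizes, assuming the ABI's 16-byte entries:
assert plt_start + header + index * 16 == 0x2012D0
# Actual entry start with the modified 20-byte entries, as readelf -s reports:
assert plt_start + header + index * 20 == 0x2012D8
```

Disassembling from the 16-byte-aligned guess lands 8 bytes before the real entry, which is why the listing begins with padding bytes decoded as int3 and (bad).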

Custom HID device HID report descriptor report count

I simply want to send data from the device to the host using two report IDs. The reports must have different report counts (the first report ID has a report count of 4, the second of 40). This is what I have done so far:
//14 bytes
0x06, 0x00, 0xff, // USAGE_PAGE (Vendor Defined Page 1)
0x09, 0x01, // USAGE (Vendor Usage 1)
0xa1, 0x01, // COLLECTION (Application)
// -------- common global items ---------
0x15, 0x00, // LOGICAL_MINIMUM (0)
0x26, 0xff, 0x00, // LOGICAL_MAXIMUM (255)
0x75, 0x08, // REPORT_SIZE (8)
// 10 bytes | Input message 1 (sent from device to host)
0x85, 5, // Global Report ID (cannot be 0)
0x95, 4, // Global Report Count (number of Report Size fields)
0x19, 0x01, // USAGE_MINIMUM (Vendor Usage 1)
0x29, 5, // USAGE_MAXIMUM (Vendor Usage 64)
0x81, 0x02, // Main Input (data, array, absolute)
// 10 bytes | Input message 1 (sent from device to host)
0x85, 6, // Global Report ID (cannot be 0)
0x95, 40, // Global Report Count (number of Report Size fields)
0x19, 0x01, // USAGE_MINIMUM (Vendor Usage 1)
0x29, 41, // USAGE_MAXIMUM (Vendor Usage 64)
0x81, 0x02, // Main Input (data, array, absolute)
0xC0
But the first report ID is sending 40 bytes. Where is my mistake?
HID Terminal output:
R 02 0C 16 20 2A 34 3E 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
R 01 0B 15 1F 29 34 3E 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
I believe @Nipo has given you the correct answer: the report descriptor indicates what each report should look like, but it is still the responsibility of your code to send reports with the correct length.
For report id 5 that would be 5 bytes (1 for the report id + 4 for the payload), and
for report id 6 it would be 41 bytes (1 for the report id + 40 for the payload).
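A minimal sketch of what the firmware must hand to the HID send routine for each report ID; Python stands in for the firmware code, and build_report is a hypothetical helper, not part of any HID stack:

```python
def build_report(report_id: int, payload: bytes, payload_len: int) -> bytes:
    """Prefix the report ID, then pad/truncate the payload to the declared count."""
    return bytes([report_id]) + payload[:payload_len].ljust(payload_len, b"\x00")

# Report ID 5 declares REPORT_COUNT 4, report ID 6 declares REPORT_COUNT 40:
report5 = build_report(5, b"\x0b\x15\x1f\x29", 4)
report6 = build_report(6, b"\x0c\x16\x20\x2a\x34\x3e", 40)
assert len(report5) == 5    # 1 (report id) + 4 (payload)
assert len(report6) == 41   # 1 (report id) + 40 (payload)
```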
BTW, your report descriptor may need a little tweaking. As it stands, it decodes as:
//--------------------------------------------------------------------------------
// Decoded Application Collection
//--------------------------------------------------------------------------------
PROGMEM char usbHidReportDescriptor[] =
{
0x06, 0x00, 0xFF, // (GLOBAL) USAGE_PAGE 0xFF00 Vendor-defined
0x09, 0x01, // (LOCAL) USAGE 0xFF000001 <-- Warning: Undocumented usage
0xA1, 0x01, // (MAIN) COLLECTION 0x00000001 Application (Usage=0xFF000001: Page=Vendor-defined, Usage=, Type=)
0x15, 0x00, // (GLOBAL) LOGICAL_MINIMUM 0x00 (0) <-- Redundant: LOGICAL_MINIMUM is already 0 <-- Info: Consider replacing 15 00 with 14
0x26, 0xFF, 0x00, // (GLOBAL) LOGICAL_MAXIMUM 0x00FF (255)
0x75, 0x08, // (GLOBAL) REPORT_SIZE 0x08 (8) Number of bits per field
0x85, 0x05, // (GLOBAL) REPORT_ID 0x05 (5)
0x95, 0x04, // (GLOBAL) REPORT_COUNT 0x04 (4) Number of fields
0x19, 0x01, // (LOCAL) USAGE_MINIMUM 0xFF000001 <-- Warning: Undocumented usage
0x29, 0x05, // (LOCAL) USAGE_MAXIMUM 0xFF000005 <-- Warning: Undocumented usage
0x81, 0x02, // (MAIN) INPUT 0x00000002 (4 fields x 8 bits) 0=Data 1=Variable 0=Absolute 0=NoWrap 0=Linear 0=PrefState 0=NoNull 0=NonVolatile 0=Bitmap
0x85, 0x06, // (GLOBAL) REPORT_ID 0x06 (6)
0x95, 0x28, // (GLOBAL) REPORT_COUNT 0x28 (40) Number of fields
0x19, 0x01, // (LOCAL) USAGE_MINIMUM 0xFF000001 <-- Warning: Undocumented usage
0x29, 0x29, // (LOCAL) USAGE_MAXIMUM 0xFF000029 <-- Warning: Undocumented usage
0x81, 0x02, // (MAIN) INPUT 0x00000002 (40 fields x 8 bits) 0=Data 1=Variable 0=Absolute 0=NoWrap 0=Linear 0=PrefState 0=NoNull 0=NonVolatile 0=Bitmap
0xC0, // (MAIN) END_COLLECTION Application
};
//--------------------------------------------------------------------------------
// Vendor-defined inputReport 05 (Device --> Host)
//--------------------------------------------------------------------------------
typedef struct
{
uint8_t reportId; // Report ID = 0x05 (5)
uint8_t VEN_VendorDefined0001; // Usage 0xFF000001: , Value = 0 to 255
uint8_t VEN_VendorDefined0002; // Usage 0xFF000002: , Value = 0 to 255
uint8_t VEN_VendorDefined0003; // Usage 0xFF000003: , Value = 0 to 255
uint8_t VEN_VendorDefined0004; // Usage 0xFF000004: , Value = 0 to 255
// Usage 0xFF000005 Value = 0 to 255 <-- Ignored: REPORT_COUNT (4) is too small
} inputReport05_t;
//--------------------------------------------------------------------------------
// Vendor-defined inputReport 06 (Device --> Host)
//--------------------------------------------------------------------------------
typedef struct
{
uint8_t reportId; // Report ID = 0x06 (6)
uint8_t VEN_VendorDefined0001; // Usage 0xFF000001: , Value = 0 to 255
uint8_t VEN_VendorDefined0002; // Usage 0xFF000002: , Value = 0 to 255
uint8_t VEN_VendorDefined0003; // Usage 0xFF000003: , Value = 0 to 255
uint8_t VEN_VendorDefined0004; // Usage 0xFF000004: , Value = 0 to 255
uint8_t VEN_VendorDefined0005; // Usage 0xFF000005: , Value = 0 to 255
uint8_t VEN_VendorDefined0006; // Usage 0xFF000006: , Value = 0 to 255
uint8_t VEN_VendorDefined0007; // Usage 0xFF000007: , Value = 0 to 255
uint8_t VEN_VendorDefined0008; // Usage 0xFF000008: , Value = 0 to 255
uint8_t VEN_VendorDefined0009; // Usage 0xFF000009: , Value = 0 to 255
uint8_t VEN_VendorDefined000A; // Usage 0xFF00000A: , Value = 0 to 255
uint8_t VEN_VendorDefined000B; // Usage 0xFF00000B: , Value = 0 to 255
uint8_t VEN_VendorDefined000C; // Usage 0xFF00000C: , Value = 0 to 255
uint8_t VEN_VendorDefined000D; // Usage 0xFF00000D: , Value = 0 to 255
uint8_t VEN_VendorDefined000E; // Usage 0xFF00000E: , Value = 0 to 255
uint8_t VEN_VendorDefined000F; // Usage 0xFF00000F: , Value = 0 to 255
uint8_t VEN_VendorDefined0010; // Usage 0xFF000010: , Value = 0 to 255
uint8_t VEN_VendorDefined0011; // Usage 0xFF000011: , Value = 0 to 255
uint8_t VEN_VendorDefined0012; // Usage 0xFF000012: , Value = 0 to 255
uint8_t VEN_VendorDefined0013; // Usage 0xFF000013: , Value = 0 to 255
uint8_t VEN_VendorDefined0014; // Usage 0xFF000014: , Value = 0 to 255
uint8_t VEN_VendorDefined0015; // Usage 0xFF000015: , Value = 0 to 255
uint8_t VEN_VendorDefined0016; // Usage 0xFF000016: , Value = 0 to 255
uint8_t VEN_VendorDefined0017; // Usage 0xFF000017: , Value = 0 to 255
uint8_t VEN_VendorDefined0018; // Usage 0xFF000018: , Value = 0 to 255
uint8_t VEN_VendorDefined0019; // Usage 0xFF000019: , Value = 0 to 255
uint8_t VEN_VendorDefined001A; // Usage 0xFF00001A: , Value = 0 to 255
uint8_t VEN_VendorDefined001B; // Usage 0xFF00001B: , Value = 0 to 255
uint8_t VEN_VendorDefined001C; // Usage 0xFF00001C: , Value = 0 to 255
uint8_t VEN_VendorDefined001D; // Usage 0xFF00001D: , Value = 0 to 255
uint8_t VEN_VendorDefined001E; // Usage 0xFF00001E: , Value = 0 to 255
uint8_t VEN_VendorDefined001F; // Usage 0xFF00001F: , Value = 0 to 255
uint8_t VEN_VendorDefined0020; // Usage 0xFF000020: , Value = 0 to 255
uint8_t VEN_VendorDefined0021; // Usage 0xFF000021: , Value = 0 to 255
uint8_t VEN_VendorDefined0022; // Usage 0xFF000022: , Value = 0 to 255
uint8_t VEN_VendorDefined0023; // Usage 0xFF000023: , Value = 0 to 255
uint8_t VEN_VendorDefined0024; // Usage 0xFF000024: , Value = 0 to 255
uint8_t VEN_VendorDefined0025; // Usage 0xFF000025: , Value = 0 to 255
uint8_t VEN_VendorDefined0026; // Usage 0xFF000026: , Value = 0 to 255
uint8_t VEN_VendorDefined0027; // Usage 0xFF000027: , Value = 0 to 255
uint8_t VEN_VendorDefined0028; // Usage 0xFF000028: , Value = 0 to 255
// Usage 0xFF000029 Value = 0 to 255 <-- Ignored: REPORT_COUNT (40) is too small
} inputReport06_t;
You may want to try the following instead:
//--------------------------------------------------------------------------------
// Decoded Application Collection
//--------------------------------------------------------------------------------
PROGMEM char usbHidReportDescriptor[] =
{
0x06, 0x00, 0xFF, // (GLOBAL) USAGE_PAGE 0xFF00 Vendor-defined
0x09, 0x01, // (LOCAL) USAGE 0xFF000001 <-- Warning: Undocumented usage
0xA1, 0x01, // (MAIN) COLLECTION 0x00000001 Application (Usage=0xFF000001: Page=Vendor-defined, Usage=, Type=)
0x15, 0x00, // (GLOBAL) LOGICAL_MINIMUM 0x00 (0) <-- Redundant: LOGICAL_MINIMUM is already 0 <-- Info: Consider replacing 15 00 with 14
0x26, 0xFF, 0x00, // (GLOBAL) LOGICAL_MAXIMUM 0x00FF (255)
0x75, 0x08, // (GLOBAL) REPORT_SIZE 0x08 (8) Number of bits per field
0x85, 0x05, // (GLOBAL) REPORT_ID 0x05 (5)
0x95, 0x04, // (GLOBAL) REPORT_COUNT 0x04 (4) Number of fields
0x19, 0x01, // (LOCAL) USAGE_MINIMUM 0xFF000001 <-- Warning: Undocumented usage
0x29, 0x04, // (LOCAL) USAGE_MAXIMUM 0xFF000004 <-- Warning: Undocumented usage
0x81, 0x00, // (MAIN) INPUT 0x00000000 (4 fields x 8 bits) 0=Data 0=Array 0=Absolute 0=Ignored 0=Ignored 0=PrefState 0=NoNull
0x85, 0x06, // (GLOBAL) REPORT_ID 0x06 (6)
0x95, 0x28, // (GLOBAL) REPORT_COUNT 0x28 (40) Number of fields
0x19, 0x01, // (LOCAL) USAGE_MINIMUM 0xFF000001 <-- Warning: Undocumented usage
0x29, 0x28, // (LOCAL) USAGE_MAXIMUM 0xFF000028 <-- Warning: Undocumented usage
0x81, 0x00, // (MAIN) INPUT 0x00000000 (40 fields x 8 bits) 0=Data 0=Array 0=Absolute 0=Ignored 0=Ignored 0=PrefState 0=NoNull
0xC0, // (MAIN) END_COLLECTION Application
};
//--------------------------------------------------------------------------------
// Vendor-defined inputReport 05 (Device --> Host)
//--------------------------------------------------------------------------------
typedef struct
{
uint8_t reportId; // Report ID = 0x05 (5)
uint8_t VEN_VendorDefined[4]; // Value = 0 to 255
} inputReport05_t;
//--------------------------------------------------------------------------------
// Vendor-defined inputReport 06 (Device --> Host)
//--------------------------------------------------------------------------------
typedef struct
{
uint8_t reportId; // Report ID = 0x06 (6)
uint8_t VEN_VendorDefined[40]; // Value = 0 to 255
} inputReport06_t;

Unable to verify signed data

Using a tool, I am signing the data "Hello" through a smart card.
The output I am getting is in hex format:
14 5F 65 CE 7C 2D 8A 0A FA B0 FB 86 CE 28 90 84
37 2D 04 63 B2 35 FA 40 4A B6 35 C8 90 AF 55 7F
B1 CA FE FD 5B F9 1B 7C DB 74 63 BF 16 5B B3 6D
E8 2D B6 D7 2E 90 AF 0A 5E CF 78 73 E3 37 02 C2
97 0E F9 B3 40 4C 67 CD E4 7C D9 4B D3 C9 86 51
8E 1E 84 81 B4 30 AC 68 96 59 CB 63 E5 C8 28 48
C7 1D E8 E9 FC E8 C9 BE 36 33 0A F0 A9 35 C4 D4
BF 60 66 21 5C 41 8F 48 91 D4 BB AF 75 75 7A B3
2A 8A 28 B8 30 D1 B4 6B 69 23 82 2D 28 77 30 05
D5 C9 AB 41 17 C1 68 6D D9 80 0F F2 C1 FC 32 6E
22 61 27 97 9C DD C3 50 33 AA DB F4 BA 98 29 FA
4F E2 B4 BC C5 9E 90 34 F3 BC 3D 78 01 47 AF 96
20 06 6F F9 41 30 D7 35 52 D3 DE 85 E3 FE 0B B7
15 4D 1A 73 B8 36 F4 A1 59 A2 7E 05 50 8B 52 AC
B4 EF 2D D9 29 9D D9 BB C8 DF F3 67 C5 D1 D9 C0
0C 65 68 A8 12 9B 24 92 4E EB 98 D8 B0 D9 2E 6A
I have saved the corresponding signed data, as a string, in the file signedData.txt:
._eÎ|-Š.ú°û†Î(„7-.c²5ú#J¶5ȯU±Êþý[ù.|Ûtc¿.[³mè-¶×.¯.^Ïxsã7.—.ù³#LgÍä|ÙKÓɆQŽ.„´0¬h– YËcåÈ(HÇ.èéüèɾ63.ð©5ÄÔ¿`f!\AH‘Ô»¯uuz³*Š(¸0Ñ´ki#‚-(w0.ÕÉ«A.ÁhmÙ€.òÁü2n"a'—œÝÃP3ªÛôº˜)úOâ ´¼Åž4ó¼=x.G¯– .oùA0×5RÓÞ…ãþ.·.M.s¸6ô¡Y¢~.P‹R¬´ï-Ù)Ù»ÈßógÅÑÙÀ.eh¨.›$’Në˜Ø°Ù.j
I am using the Windows CryptoAPI to verify this signed data.
I open the file signedData.txt and read the data into the buffer "signedBuffer".
Then I find my certificate in the store and take a handle to its public key using
CryptImportPublicKeyInfo(hProv, ENCODING_TYPE, &pCertContext->pCertInfo->SubjectPublicKeyInfo, &hPubKey);
Then I create a hash object using CALG_SHA_256:
CryptCreateHash(hProv, CALG_SHA_256, 0, 0, &hHash);
Then I add my data to the hash object:
CryptHashData(hHash, Buffer, BufferLen, 0); // char Buffer[] = "Hello";
// int BufferLen = strlen(Buffer);
In the last step I verify the signature using
CryptVerifySignature(hHash, signedBuffer, signedBufferBytes, hPubKey, NULL, 0); /* signedBuffer contains the signed data in string format */
/* signedBufferBytes is the number of bytes in the buffer */
but the signature does not verify. The error code I am getting is 0x80090006 (NTE_BAD_SIGNATURE, "Invalid Signature").
The public key in hex format is:
30 82 01 0a 02 82 01 01 00 b8 f8 dc 2c a5 03 84
ba 72 c6 0e 03 89 51 6f 39 a8 41 e3 49 b3 f7 14
31 d3 43 b7 fc 1f 61 c2 43 b0 77 9e 19 af f4 8b
02 99 72 c1 17 21 1d 23 da ab 53 54 74 33 e4 ab
9d 82 d2 68 33 9a b5 9c 99 cb f0 12 e0 f8 44 4f
e8 91 3f 60 ed ca fa 3b 40 bd 64 50 92 d3 c2 c1
48 ad 24 3e ca 64 2c 50 a9 01 b5 9f f4 a4 46 e5
84 e9 a4 87 41 86 a1 7a 7f fc a6 f0 e0 b1 de f0
c1 f2 5d c8 84 16 15 4d e4 df 43 43 3a cd ad ec
eb af 1b 9c a7 5c 40 dc ae 1f 71 6e a4 c6 0f dd
3e 3c c8 0d 25 4c 61 74 df aa ed b5 d5 b9 06 6a
8e b0 b7 c0 e6 c9 bf db b1 07 2e a2 76 aa e7 28
1c 8d 32 4e b3 58 1d 34 89 96 ed 3e da 29 e0 1e
c9 c2 2e 18 19 a6 ba 91 32 b7 85 97 87 92 16 c5
01 b4 4f 57 5c 56 1b f5 f4 6a 29 6b 2e 51 8b f5
4c 6f b8 fd cb 09 d9 fd 66 09 04 49 b6 ba 7e d0
af 70 3a 51 41 5a a5 04 bf 02 03 01 00 01
The signature buffer I am now using is:
BYTE bSignatureBuf[] = {
0x6A, 0x2E, 0xD9, 0xB0, 0xD8, 0x98, 0xEB, 0x4E, 0x92, 0x24, 0x9B, 0x12, 0xA8, 0x68, 0x65, 0x0C,
0xC0, 0xD9, 0xD1, 0xC5, 0x67, 0xF3, 0xDF, 0xC8, 0xBB, 0xD9, 0x9D, 0x29, 0xD9, 0x2D, 0xEF, 0xB4,
0xAC, 0x52, 0x8B, 0x50, 0x05, 0x7E, 0xA2, 0x59, 0xA1, 0xF4, 0x36, 0xB8, 0x73, 0x1A, 0x4D, 0x15,
0xB7, 0x0B, 0xFE, 0xE3, 0x85, 0xDE, 0xD3, 0x52, 0x35, 0xD7, 0x30, 0x41, 0xF9, 0x6F, 0x06, 0x20,
0x96, 0xAF, 0x47, 0x01, 0x78, 0x3D, 0xBC, 0xF3, 0x34, 0x90, 0x9E, 0xC5, 0xBC, 0xB4, 0xE2, 0x4F,
0xFA, 0x29, 0x98, 0xBA, 0xF4, 0xDB, 0xAA, 0x33, 0x50, 0xC3, 0xDD, 0x9C, 0x97, 0x27, 0x61, 0x22,
0x6E, 0x32, 0xFC, 0xC1, 0xF2, 0x0F, 0x80, 0xD9, 0x6D, 0x68, 0xC1, 0x17, 0x41, 0xAB, 0xC9, 0xD5,
0x05, 0x30, 0x77, 0x28, 0x2D, 0x82, 0x23, 0x69, 0x6B, 0xB4, 0xD1, 0x30, 0xB8, 0x28, 0x8A, 0x2A,
0xB3, 0x7A, 0x75, 0x75, 0xAF, 0xBB, 0xD4, 0x91, 0x48, 0x8F, 0x41, 0x5C, 0x21, 0x66, 0x60, 0xBF,
0xD4, 0xC4, 0x35, 0xA9, 0xF0, 0x0A, 0x33, 0x36, 0xBE, 0xC9, 0xE8, 0xFC, 0xE9, 0xE8, 0x1D, 0xC7,
0x48, 0x28, 0xC8, 0xE5, 0x63, 0xCB, 0x59, 0x96, 0x68, 0xAC, 0x30, 0xB4, 0x81, 0x84, 0x1E, 0x8E,
0x51, 0x86, 0xC9, 0xD3, 0x4B, 0xD9, 0x7C, 0xE4, 0xCD, 0x67, 0x4C, 0x40, 0xB3, 0xF9, 0x0E, 0x97,
0xC2, 0x02, 0x37, 0xE3, 0x73, 0x78, 0xCF, 0x5E, 0x0A, 0xAF, 0x90, 0x2E, 0xD7, 0xB6, 0x2D, 0xE8,
0x6D, 0xB3, 0x5B, 0x16, 0xBF, 0x63, 0x74, 0xDB, 0x7C, 0x1B, 0xF9, 0x5B, 0xFD, 0xFE, 0xCA, 0xB1,
0x7F, 0x55, 0xAF, 0x90, 0xC8, 0x35, 0xB6, 0x4A, 0x40, 0xFA, 0x35, 0xB2, 0x63, 0x04, 0x2D, 0x37,
0x84, 0x90, 0x28, 0xCE, 0x86, 0xFB, 0xB0, 0xFA, 0x0A, 0x8A, 0x2D, 0x7C, 0xCE, 0x65, 0x5F, 0x14
};
First of all, you should not treat your binary data as text, as already noted in the comments.
Even more important is that your signature was generated using PKCS#1 with SHA-1, while your hash object specifies SHA-256 - presumably also using PKCS#1 [^1]. You have to use the identical hash function during signature generation and verification.
(In case you are wondering how to find out which hash function was used: look up the OID in the ASN.1 structure you obtain after "decrypting" the signature with the public key.)
Also note this little remark from the API documentation:
The native cryptography API uses little-endian byte order (ed: for the signature bytes)
Note that the non-Microsoft world uses big-endian byte order.
[^1]: As usual, Microsoft does not specify the exact protocols they are applying. [^2]
[^2]: This should render correctly once Stack Overflow makes the tiny effort to implement markdown footnotes.
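The endianness point is directly visible in the question's own data: the bSignatureBuf that is passed to CryptVerifySignature is exactly the tool's hex dump reversed byte for byte. A quick check (Python sketch, comparing the last 16 bytes of the tool output to the first 16 bytes of the buffer):

```python
# Last 16 bytes of the tool's (big-endian) signature dump above:
tool_tail = bytes.fromhex("0C6568A8129B24924EEB98D8B0D92E6A")
# First 16 bytes of bSignatureBuf (little-endian, as CryptoAPI expects):
buf_head = bytes.fromhex("6A2ED9B0D898EB4E92249B12A868650C")

# Reversing the tool's byte order yields the CryptoAPI buffer:
assert tool_tail[::-1] == buf_head
```

So before calling CryptVerifySignature, a signature produced by a big-endian tool has to be byte-reversed in full, not just re-encoded from text.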