Verify ECDSA signature with MbedTLS 3.X - signing

A client sends me a message signed with an ECDSA secp256r1 private key. I'm in possession of a leaf certificate, in DER format, provided by the client. In addition, I also have the raw message and a SHA-256 digest of it.
I have created a struct to store all the info required for the verification, with the idea of providing a public API in my application:
struct SignatureVerifyData {
    unsigned char *msg;
    unsigned char *hash;      // SHA-256 digest of msg
    unsigned char *cert;      // leaf cert in DER
    unsigned char *signature;
    size_t msg_len;
    size_t hash_len;
    size_t cert_len;
    size_t signature_len;
};
I'm reading the ecdsa.c example from MbedTLS, but there the cert is generated in the example itself. I can use mbedtls_x509_crt_parse_der() to load my leaf cert, but then should I move it into an mbedtls_ecdsa_context object to use with mbedtls_ecdsa_read_signature()?
Should I load the leaf cert some other way?
I'm also confused about how to use the group and point objects, or whether I need to use them at all.
#define MBEDTLS_HAVE_ASM
#define MBEDTLS_HAVE_TIME
#define MBEDTLS_ALLOW_PRIVATE_ACCESS
#define MBEDTLS_PLATFORM_C
#define MBEDTLS_ECP_DP_SECP256R1_ENABLED
#define MBEDTLS_KEY_EXCHANGE_ECDHE_ECDSA_ENABLED
#define MBEDTLS_SSL_PROTO_TLS1_2
#define MBEDTLS_AES_C
#define MBEDTLS_ASN1_PARSE_C
#define MBEDTLS_ASN1_WRITE_C
#define MBEDTLS_BIGNUM_C
#define MBEDTLS_CIPHER_C
#define MBEDTLS_CTR_DRBG_C
#define MBEDTLS_ECDH_C
#define MBEDTLS_ECDSA_C
#define MBEDTLS_ECP_C
#define MBEDTLS_ENTROPY_C
#define MBEDTLS_GCM_C
#define MBEDTLS_MD_C
#define MBEDTLS_NET_C
#define MBEDTLS_OID_C
#define MBEDTLS_PK_C
#define MBEDTLS_PK_PARSE_C
#define MBEDTLS_SHA224_C
#define MBEDTLS_SHA256_C
#define MBEDTLS_SHA384_C
#define MBEDTLS_SHA512_C
#define MBEDTLS_SSL_CLI_C
#define MBEDTLS_SSL_SRV_C
#define MBEDTLS_SSL_TLS_C
#define MBEDTLS_X509_CRT_PARSE_C
#define MBEDTLS_X509_USE_C
#define MBEDTLS_BASE64_C
#define MBEDTLS_PEM_PARSE_C
#define MBEDTLS_AES_ROM_TABLES
#define MBEDTLS_MPI_MAX_SIZE 48 // 384-bit EC curve = 48 bytes
#define MBEDTLS_ECP_WINDOW_SIZE 2
#define MBEDTLS_ECP_FIXED_POINT_OPTIM 0
#define MBEDTLS_ECP_NIST_OPTIM
#define MBEDTLS_ENTROPY_MAX_SOURCES 2
#define MBEDTLS_SSL_CIPHERSUITES \
MBEDTLS_TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384, \
MBEDTLS_TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256
#define MBEDTLS_SSL_IN_CONTENT_LEN 1024
#define MBEDTLS_SSL_OUT_CONTENT_LEN 1024
#include "mbedtls/check_config.h"

mbedtls_x509_crt_parse_der constructs an object of type mbedtls_x509_crt. This structure has a field called pk which contains the public key. Call mbedtls_pk_verify to verify the signature.
Here's the general idea of the code to parse the certificate, calculate the hash, and verify the signature. Untested code, typed directly into my browser; error checking is omitted, so be sure to check that all the function calls succeed.
#include <stdlib.h>
#include <mbedtls/md.h>
#include <mbedtls/pk.h>
#include <mbedtls/x509_crt.h>

mbedtls_x509_crt crt;
mbedtls_x509_crt_init(&crt);
mbedtls_x509_crt_parse_der(&crt, cert, cert_len);

// Hash the raw message with SHA-256.
const mbedtls_md_info_t *md_info = mbedtls_md_info_from_type(MBEDTLS_MD_SHA256);
hash_len = mbedtls_md_get_size(md_info);
hash = malloc(hash_len);
mbedtls_md(md_info, msg, msg_len, hash);

// Verify the signature against the public key found in the certificate.
mbedtls_pk_verify(&crt.pk, MBEDTLS_MD_SHA256, hash, hash_len, signature, signature_len);

mbedtls_x509_crt_free(&crt);
free(hash);
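
Tying this back to the SignatureVerifyData struct from the question, a wrapper for the public API could look roughly like the sketch below. It is untested; the name verify_signature and the choice to recompute the SHA-256 digest (rather than trusting the stored hash field) are just illustrative.

#include <mbedtls/md.h>
#include <mbedtls/pk.h>
#include <mbedtls/x509_crt.h>

// Sketch: verify the signature using the data collected in the struct above.
// Returns 0 when the signature is valid, otherwise the first failing MbedTLS error code.
int verify_signature(const struct SignatureVerifyData *data)
{
    mbedtls_x509_crt crt;
    unsigned char hash[32]; // SHA-256 output size
    int ret;

    mbedtls_x509_crt_init(&crt);

    ret = mbedtls_x509_crt_parse_der(&crt, data->cert, data->cert_len);

    if (ret == 0)
        ret = mbedtls_md(mbedtls_md_info_from_type(MBEDTLS_MD_SHA256),
                         data->msg, data->msg_len, hash);

    if (ret == 0)
        ret = mbedtls_pk_verify(&crt.pk, MBEDTLS_MD_SHA256, hash, sizeof(hash),
                                data->signature, data->signature_len);

    mbedtls_x509_crt_free(&crt);
    return ret;
}

With this in place there is no need to touch mbedtls_ecdsa_context, group, or point objects directly: mbedtls_pk_verify dispatches to the ECDSA code through the pk layer of the parsed certificate.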

Related

Redefinition of z_stream_s, gz_header_s, gzfile_s in zlib.h for objective c after upgrade to xcode13

After upgrading to Xcode 13.4, I get redefinition errors for struct z_stream_s, gz_header_s and gzFile_s in zlib.h.
I added an include guard at the start and end of the file to avoid the redefinition:
#ifndef ZLIB_H
#define ZLIB_H
/* ... contents of zlib.h ... */
#endif
The code where the error occurs:
typedef struct z_stream_s {
    z_const Bytef *next_in; /* next input byte */
    uInt  avail_in;   /* number of bytes available at next_in */
    uLong total_in;   /* total number of input bytes read so far */
    Bytef *next_out;  /* next output byte will go here */
    uInt  avail_out;  /* remaining free space at next_out */
    uLong total_out;  /* total number of bytes output so far */
    z_const char *msg; /* last error message, NULL if no error */
    struct internal_state FAR *state; /* not visible by applications */
    alloc_func zalloc; /* used to allocate the internal state */
    free_func  zfree;  /* used to free the internal state */
    voidpf     opaque; /* private data object passed to zalloc and zfree */
    int   data_type;  /* best guess about the data type: binary or text
                         for deflate, or the decoding state for inflate */
    uLong adler;      /* Adler-32 or CRC-32 value of the uncompressed data */
    uLong reserved;   /* reserved for future use */
} z_stream;

typedef struct z_stream_s FAR *z_streamp;

typedef struct gz_header_s {
    int   text;       /* true if compressed data believed to be text */
    uLong time;       /* modification time */
    int   xflags;     /* extra flags (not used when writing a gzip file) */
    int   os;         /* operating system */
    Bytef *extra;     /* pointer to extra field or Z_NULL if none */
    uInt  extra_len;  /* extra field length (valid if extra != Z_NULL) */
    uInt  extra_max;  /* space at extra (only when reading header) */
    Bytef *name;      /* pointer to zero-terminated file name or Z_NULL */
    uInt  name_max;   /* space at name (only when reading header) */
    Bytef *comment;   /* pointer to zero-terminated comment or Z_NULL */
    uInt  comm_max;   /* space at comment (only when reading header) */
    int   hcrc;       /* true if there was or will be a header crc */
    int   done;       /* true when done reading gzip header (not used
                         when writing a gzip file) */
} gz_header;

struct gzFile_s {
    unsigned have;
    unsigned char *next;
    z_off64_t pos;
};
I also replaced zlib.h with the latest version, but the same redefinition error occurs again.
These files are used to generate the QR code, and the redefinition issue occurs in all of them.
When I click on the error, it jumps to the line where struct z_stream_s is defined:
typedef struct z_stream_s {
The error says it has already been defined in the file unzip.c.
Looking into unzip.c, zlib.h is included in both unzip.h and unzip.c.
In unzip.h:
#ifndef _unz_H
#define _unz_H
#ifdef __cplusplus
extern "C" {
#endif
#ifndef _ZLIB_H
#include "zlib.h"
#endif
#ifndef _ZLIBIOAPI_H
#include "ioapi.h"
#endif
In unzip.c:
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include "zlib.h"
#include "unzip.h"
#ifdef STDC
# include <stddef.h>
# include <string.h>
# include <stdlib.h>
#endif
#ifdef NO_ERRNO_H
extern int errno;
#else
# include <errno.h>
#endif
The redefinition errors continue file by file: zip.c, zip.h, pngpriv.h, pngstruct.h, and so on.
Note: the same project works fine in Xcode 12.4, but the latest Xcode 13 shows the redefinition errors.
Please help me to resolve this issue. Thanks in advance.

CMake with crypt(3)

I'm trying to build a crypt(3) sample with CMake.
#define _GNU_SOURCE
#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <crypt.h>

/* To compile: $ gcc check.c -lcrypt -o check */
int main(void) {
    /* Hashed form of "GNU libc manual". */
    char *pass = "$1$/iSaq7rB$EoUw5jJPPvAPECNaaWzMK/";

    /* Read in the user’s password and encrypt it,
       passing the expected password in as the salt. */
    char *result = crypt(getpass("Password:"), pass);

    /* Test the result. */
    int ok = strcmp(result, pass) == 0;
    puts(ok ? "Access granted." : "Access denied.");
    return ok ? 0 : 1;
}
To build it, the -lcrypt option has to be passed to gcc.
My CMakeLists.txt looks like this:
project(cryptexample)
set(SOURCE_FILES check.c)
add_executable(check ${SOURCE_FILES})
How can I pass this option and build it?
Something like:
target_link_libraries(check crypt)
Source: https://cmake.org/cmake/help/latest/command/target_link_libraries.html
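
For reference, the complete CMakeLists.txt would then look roughly like this (a sketch; the cmake_minimum_required line and the explicit C language declaration are additions not shown in the question, so adapt them to your setup):

cmake_minimum_required(VERSION 3.5)
project(cryptexample C)

set(SOURCE_FILES check.c)
add_executable(check ${SOURCE_FILES})

# Equivalent of passing -lcrypt to gcc: link the check target against libcrypt.
target_link_libraries(check crypt)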

CocoaLumberjack's Log Level switches to verbose

I'm using the CocoaLumberjack logging framework 2.0.0 for logging with different levels. In my Prefix.pch (I know that this file is deprecated, but it should work nevertheless) I include CocoaLumberjack and set the global log level as suggested here:
#ifdef DEBUG
static const DDLogLevel ddLogLevel = DDLogLevelDebug;
#else
static const DDLogLevel ddLogLevel = DDLogLevelWarn;
#endif
I have DDLogVerbose statements in a few methods that should not be logged by default. Problem: they are getting logged anyway.
Inspecting ddLogLevel in an init function shows 00001111, which equals DDLogLevelDebug. Nevertheless, a verbose logging statement directly after this is executed. (1)
Preprocessing the line DDLogVerbose(@"I AM VERBOSE") shows this code:
do {
    if(DDLogLevelVerbose & DDLogFlagVerbose)
        [DDLog log : __objc_yes
            level : DDLogLevelVerbose
             flag : DDLogFlagVerbose
          context : 0
             file : "....m"
         function : __PRETTY_FUNCTION__
             line : 59
              tag : ((void *)0)
           format : (@"I AM VERBOSE")];
} while(0);
which means that the log level after preprocessing is Verbose (2). I found out that this level is the default in CocoaLumberjack in case no log level is defined:
#ifndef LOG_LEVEL_DEF
#ifdef ddLogLevel
#define LOG_LEVEL_DEF ddLogLevel
#else
#define LOG_LEVEL_DEF DDLogLevelVerbose
#endif
#endif
But: Debugging this shows that the first path is executed, i.e. LOG_LEVEL_DEF (which is checked against the level of the statement to determine if it should be logged or not) is assigned the correct level (Debug).
Question: I can't figure out why (1) shows the log level Debug while, after preprocessing, it has switched to Verbose (2). Could this be a matter of the order in which headers are included? Or am I missing some important point?
I didn't solve this issue, so I wrote my own header file for logging:
// Create Logging Messages by calling the functions:
// * DDLogFatal(...)
// * DDLogError(...)
// * DDLogWarn(...)
// * DDLogInfo(...)
// * DDLogDebug(...)
// * DDLogTrace(...)
// * DDLogEntry()
// Only the functions that match the log level defined below (or a more severe one) will produce output.
//
// NOTE: For this file to work, the option "Treat warnings as errors" must be turned off!
/*********************************
* CURRENT LOG LEVEL ***
*********************************/
#define LOG_LEVEL LOG_LEVEL_DEBUG
/* Default Log Level */
#ifndef LOG_LEVEL
#ifdef DEBUG
#define LOG_LEVEL LOG_LEVEL_DEBUG
#else
#define LOG_LEVEL LOG_LEVEL_WARN
#endif
#endif
/* List of Log Levels */
#define LOG_LEVEL_OFF 0 // 0000 0000
#define LOG_LEVEL_FATAL 1 // 0000 0001
#define LOG_LEVEL_ERROR 3 // 0000 0011
#define LOG_LEVEL_WARN 7 // 0000 0111
#define LOG_LEVEL_INFO 15 // 0000 1111
#define LOG_LEVEL_DEBUG 31 // 0001 1111
#define LOG_LEVEL_TRACE 63 // 0011 1111
#define LOG_FLAG_FATAL 1 // 0000 0001
#define LOG_FLAG_ERROR 2 // 0000 0010
#define LOG_FLAG_WARN 4 // 0000 0100
#define LOG_FLAG_INFO 8 // 0000 1000
#define LOG_FLAG_DEBUG 16 // 0001 0000
#define LOG_FLAG_TRACE 32 // 0010 0000
#if (LOG_LEVEL & LOG_FLAG_FATAL) > 0
#define DDLogFatal(...) ALog(@"FATAL", __VA_ARGS__)
#else
#define DDLogFatal(...)
#endif
#if (LOG_LEVEL & LOG_FLAG_ERROR) > 0
#define DDLogError(...) ALog(@"ERROR", __VA_ARGS__)
#else
#define DDLogError(...)
#endif
#if (LOG_LEVEL & LOG_FLAG_WARN) > 0
#define DDLogWarn(...) ALog(@"WARNING", __VA_ARGS__)
#else
#define DDLogWarn(...)
#endif
#if (LOG_LEVEL & LOG_FLAG_INFO) > 0
#define DDLogInfo(...) ALog(@"INFO", __VA_ARGS__)
#else
#define DDLogInfo(...)
#endif
#if (LOG_LEVEL & LOG_FLAG_DEBUG) > 0
#define DDLogDebug(...) ALog(@"DEBUG", __VA_ARGS__)
#else
#define DDLogDebug(...)
#endif
#if (LOG_LEVEL & LOG_FLAG_TRACE) > 0
#define DDLogTrace(...) ALog(@"TRACE", __VA_ARGS__)
#define DDLogEntry() ALog(@"TRACE", @"->")
#else
#define DDLogTrace(...)
#define DDLogEntry()
#endif
#define ALog(logLevel, fmt, ...) NSLog((@"%s [Line %d] %@: " fmt), __PRETTY_FUNCTION__, __LINE__, logLevel, ##__VA_ARGS__)
Include this file wherever Logging is needed. Hope this helps someone!
So I'm not sure if this is the same issue you were running into, but I had a similar symptom, i.e. my log levels being ignored. What was happening for me is that the CocoaLumberjack folks made it easier in v2 for new users to get started by not having to specify a log level at all to get the framework to work.
As per the Lumberjack docs, to actually use ddLogLevel I needed to #define it before importing the CocoaLumberjack.h file:
Using ddLogLevel to start using the library is now optional. If you define it, add #define LOG_LEVEL_DEF ddLogLevel before the #import and change its type to DDLogLevel.
In my case, I'm doing that in the .pch file, so it looks like:
// ProjectX.pch
#define LOG_LEVEL_DEF ddLogLevel // this is the crucial bit!
#import "CocoaLumberjack/CocoaLumberjack.h"
// Then the normal definitions...
#ifdef DEBUG
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wunused-variable"
static DDLogLevel ddLogLevel = DDLogLevelWarning;
#pragma clang diagnostic pop
#else
static const DDLogLevel ddLogLevel = DDLogLevelWarning;
#endif
#define LOG_LEVEL_DEF ddLogLevel
CocoaLumberjack has 4 log levels
Error
Warning
Info
Verbose
The "ddLogLevel" determines which logs are to be executed and which to be ignored.
If you do not want DDLogVerbose to be executed change to lower levels like Info.
Change your DEBUG macro as follows
#ifdef DEBUG
static const int ddLogLevel = LOG_LEVEL_INFO;
#else
static const int ddLogLevel = LOG_LEVEL_ERROR;
#endif
Hope this solves your issue.

TLS v1.2 handshake fails after client's Change cipher spec and Encrypted Handshake message

I have a PSK server and client example using OpenSSL that work very well with one another. However, what I need to do is make my client, which uses PolarSSL/mbedTLS, talk to the OpenSSL server. I am experiencing a handshake failure once the client sends ChangeCipherSpec and the Encrypted Handshake Message. Any ideas what could be wrong?
I have used https://bitbucket.org/tiebingzhang/tls-psk-server-client-example/overview as reference.
Sample mBedTLS/PolarSSL code is as below:
static const unsigned char *psk_identity = "Client_identity";
static const unsigned char *psk_key = "1A1A1A1A1A1A1A1A";

ssl_set_endpoint(&context, SSL_IS_CLIENT);
ssl_set_authmode(&context, SSL_VERIFY_NONE);
ssl_set_rng(&context, random_vector_generate, NULL);
ssl_set_ciphersuites(&context, default_ciphers);
ssl_set_bio(&context, transport_read, NULL,
                      transport_write, NULL);
ssl_set_psk(&context, psk_key, strlen((char *)psk_key),
            psk_identity, strlen((char *)psk_identity));
ssl_handshake(&context);
Note: the only change to the server code is that I have changed the pre-shared key size from 32 to 16.
Also the configuration used for PolarSSL is below:
#define POLARSSL_AES_C
#define POLARSSL_CIPHER_C
#define POLARSSL_CTR_DRBG_C
#define POLARSSL_MD_C
#define POLARSSL_MD5_C
#define POLARSSL_SHA1_C
#define POLARSSL_SSL_CLI_C
#define POLARSSL_SSL_TLS_C
#define POLARSSL_PLATFORM_C
#define POLARSSL_PLATFORM_MEMORY
#define POLARSSL_CIPHER_MODE_CBC
#define POLARSSL_DEBUG_C
#define POLARSSL_BIGNUM_C
#define POLARSSL_AES_ROM_TABLES
#define POLARSSL_PSK_MAX_LEN 32

EXC_BAD_ACCESS, standard c library "open" on iphone?

Pretty straightforward question: no matter what I do, the app crashes when attempting to call open(). Below is the relevant part of the code. filename is not a garbage value and contains an absolute path to the file. This fails on both the device and the simulator.
printf of filename returns:
/Users/programmingstation7/Library/Application Support/iPhone
Simulator/4.3/Applications/E2BD16DB-FFBA-45D2-B425-96C981380B85/Documents/issue2.zip
relevant backtrace:
#0 0x002132dc in open ()
#1 0x000ddcec in -[ExternalZipInstaller
unzipTheFile] (self=0x68a8d60, _cmd=0x1483f3) at
ExternalZipInstaller.mm:261
code:
#include <stdio.h> /* Standard input/output definitions */
#include <string.h> /* String function definitions */
#include <unistd.h> /* UNIX standard function definitions */
#include <fcntl.h> /* File control definitions */
#include <errno.h> /* Error number definitions */
#include <termios.h> /* POSIX terminal control definitions */
#ifndef O_BINARY
#define O_BINARY 0
#endif
- (void)unzipTheFile
{
    BOOL success = YES;
    const char *filename = [self.zipName UTF8String];
    open(filename, O_RDONLY | O_BINARY);
The documentation for the UTF8String method of NSString has the following note:
The returned C string is automatically freed just as a returned object would be released; you should copy the C string if it needs to be stored outside of the autorelease context in which the C string is created.
I think you need to copy the resulting string into your own buffer instead of just pointing to it. The ObjC garbage collector could be deleting your string from under you. Try this instead:
char filename[PATH_MAX];   // PATH_MAX comes from <limits.h>
strlcpy(filename, [self.zipName UTF8String], sizeof(filename));
open(filename, O_RDONLY | O_BINARY);
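
Whatever the root cause turns out to be, it is also worth checking what open() actually returns rather than calling it blindly; on failure it returns -1 and sets errno, which narrows the problem down. A rough sketch (the headers it needs, <fcntl.h>, <errno.h>, <stdio.h>, <string.h> and <unistd.h>, are already included in the snippet above):

int fd = open(filename, O_RDONLY | O_BINARY);
if (fd == -1) {
    // strerror(errno) describes why the open failed (bad path, permissions, ...).
    printf("open failed: %s\n", strerror(errno));
} else {
    // ... read from fd ...
    close(fd);
}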