I am using JWT for API authentication.
I am just curious to know how expensive it is to decrypt the JWT each time a request arrives.
It depends on the algorithm(s) used.
(Note that JWT supports signing as well as encryption - signed JWTs are the more common use case; my answer is general.)
The symmetric key algorithms (AES, HMAC) are the least expensive (very fast).
For public key algorithms, RSA-based algorithms are the most expensive, and elliptic curve algorithms (ECDH for key encryption, ECDSA for signing) are less computationally expensive but still more expensive than symmetric algorithms.
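If you want actual numbers for your own hardware, a quick micro-benchmark is easy to throw together. Here is a rough sketch (Python, assuming the `cryptography` package is available; the key sizes, curve, and loop count are arbitrary choices of mine, not tied to any particular JWT library) that times the verification step for HMAC-SHA256, RSA-2048, and ECDSA P-256 over the same signing input, since verification is what your API performs on every request:

```python
# Rough micro-benchmark of per-request verification cost (not a rigorous benchmark).
# Assumes the "cryptography" package; key sizes and loop count are arbitrary.
import os
import time

from cryptography.hazmat.primitives import hashes, hmac
from cryptography.hazmat.primitives.asymmetric import ec, padding, rsa

signing_input = b"header.payload"  # stand-in for the JWT signing input
N = 1000

# HS256-style: HMAC-SHA256 with a shared secret
secret = os.urandom(32)
h = hmac.HMAC(secret, hashes.SHA256())
h.update(signing_input)
tag = h.finalize()

# RS256-style: RSA-2048 with SHA-256 (PKCS#1 v1.5)
rsa_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
rsa_sig = rsa_key.sign(signing_input, padding.PKCS1v15(), hashes.SHA256())
rsa_pub = rsa_key.public_key()

# ES256-style: ECDSA over P-256 with SHA-256
ec_key = ec.generate_private_key(ec.SECP256R1())
ec_sig = ec_key.sign(signing_input, ec.ECDSA(hashes.SHA256()))
ec_pub = ec_key.public_key()

def bench(label, verify_once):
    start = time.perf_counter()
    for _ in range(N):
        verify_once()
    elapsed = time.perf_counter() - start
    print(f"{label}: {elapsed / N * 1e6:.1f} us per verification")

def verify_hmac():
    v = hmac.HMAC(secret, hashes.SHA256())
    v.update(signing_input)
    v.verify(tag)  # raises InvalidSignature on mismatch

bench("HMAC-SHA256 ", verify_hmac)
bench("RSA-2048    ", lambda: rsa_pub.verify(rsa_sig, signing_input,
                                             padding.PKCS1v15(), hashes.SHA256()))
bench("ECDSA P-256 ", lambda: ec_pub.verify(ec_sig, signing_input,
                                            ec.ECDSA(hashes.SHA256())))
```

On typical hardware the HMAC check lands in the low microseconds and the public-key verifications cost noticeably more, but in absolute terms all of them are usually negligible next to the rest of the request handling; measuring on the machine that actually serves your API settles it.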
I need to hash identifiers before storing in a database. There will be up to 1 million values overall. I need to pseudonymise these values to comply with GDPR.
I am using .NET Core and I want to stay with the core hashing functionality; I don't want to risk using external hashing implementations. The intention is to add a salt phrase to each value before hashing. These values have already been hashed by the supplier, but I will be hashing them again before storing them in the database.
I was going to use SHA256 but I have read that PBKDF2 is more secure. However, I have also read that PBKDF2 is prone to collisions. It is of the utmost importance that the hashing implementation I use has a low collision chance. Does PBKDF2 have a higher collision rate than plain SHA256? Does using PBKDF2 with HMACSHA512, as opposed to HMACSHA1, reduce the possibility of collisions?
I would like recommendations for a secure, low-collision one-way hash for .NET Core.
There will be up to 1 million values overall. I need to pseudonymise these values to comply with GDPR
I was going to use SHA256 but I have read that PBKDF2 is more secure.
For this use case a proper cryptographic hash is imho the best option.
PBKDF2 is a key derivation function intended to derive higher-entropy keys from relatively weak passwords. It uses a hash under the hood, so whatever collision probability the underlying hash has, PBKDF2 has the same.
PBKDF2 is intentionally slow (it applies many iterations) to make brute-forcing the input password less feasible. You don't need that property; it may even be bad for your use case.
So you can boldly use SHA-256 to pseudonymise your data; imho it is the best option you have today. In principle you cannot prevent hash collisions, but the probability should be negligible.
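To make that concrete, here is a minimal sketch (in Python rather than .NET Core, since the idea is the same in any framework; the salt phrase is a made-up placeholder) of salting each identifier with a fixed secret phrase and hashing once with SHA-256:

```python
# Minimal sketch: pseudonymise identifiers with a secret salt phrase + SHA-256.
# The salt phrase here is a placeholder; in practice keep it secret and out of the
# database, otherwise the hashes can be recomputed from guessed identifiers.
import hashlib

SALT_PHRASE = "replace-with-a-long-secret-phrase"  # hypothetical value

def pseudonymise(identifier: str) -> str:
    data = (SALT_PHRASE + identifier).encode("utf-8")
    return hashlib.sha256(data).hexdigest()

# One million distinct inputs produce one million distinct 256-bit outputs with
# overwhelming probability (the birthday bound only becomes relevant around 2**128 inputs).
print(pseudonymise("supplier-id-000001"))
```

If you prefer, HMAC-SHA256 with the phrase as the key expresses the same "secret salt plus hash" idea a bit more cleanly and costs the same; either way, no PBKDF2-style iteration count is needed here.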
Validation of an HMAC-generated hash requires the key to be known to the validator, so it is symmetric. What is the similar asymmetric solution, other than SSL? I want the signature to be small, like an MD5 hash, and the generation and validation procedures to be lightweight. I was going through Rabin's signature algorithm but couldn't find any implementation or pseudocode to follow.
The smallest asymmetric signatures come from elliptic curve cryptosystems like ECDSA. ECDSA signature schemes require signatures approximately four times the length of a symmetric cipher key of equivalent security. So a scheme comparable in security to 128-bit AES would have 512-bit signatures. That's the state of the art right now -- schemes with smaller signatures but equal or greater security are not known.
If you don't need security quite that high, you could use a 192-bit curve which would result in 384-bit signatures. You can go down to 320-bit signatures (160-bit curves) and still have security comparable to 80-bit symmetric ciphers. If you really don't particularly care about security, 112-bit curves can be used, providing 224-bit signatures that are about as difficult to break as DES.
The following curves are what I would recommend for each security level:
SecP112R1: 224-bit signatures, 56-bit security level
SecP128R1: 256-bit signatures, 64-bit security level
SecP160K1: 320-bit signatures, 80-bit security level
SecP192K1: 384-bit signatures, 96-bit security level
SecP224K1: 448-bit signatures, 112-bit security level
SecP256K1: 512-bit signatures, 128-bit security level
For each curve, the private key is the same size as the curve. Public keys (in compressed form) are one bit larger than the curve size. Signatures are twice the curve size. So with SecP256K1, private keys are 256-bits, public keys are 257-bits, and signatures are 512-bits. These are the minimum sizes for the raw binary values.
Caution: I would consider 160-bit curves the minimum for any purpose where security is a factor. Smaller curves might be suitable if keys are generated, used, and then thrown away in a small time frame. For long-term security, 256-bit curves should be used. The system as a whole should be evaluated by competent experts before it is relied upon.
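If you want to sanity-check those sizes, they are easy to confirm with a few lines of code. Here is a minimal sketch (Python with the `cryptography` package; the hash and message are arbitrary choices of mine) that generates a secp256k1 key, signs, and decodes the DER-encoded signature back into its raw (r, s) pair:

```python
# Confirm the key/signature sizes for secp256k1 (the "SecP256K1" row above).
import os

from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec
from cryptography.hazmat.primitives.asymmetric.utils import decode_dss_signature

private_key = ec.generate_private_key(ec.SECP256K1())
message = os.urandom(64)

# The library outputs a DER-encoded signature; decode it to the raw (r, s) integers.
der_signature = private_key.sign(message, ec.ECDSA(hashes.SHA256()))
r, s = decode_dss_signature(der_signature)

compressed_point = private_key.public_key().public_bytes(
    serialization.Encoding.X962, serialization.PublicFormat.CompressedPoint
)

print("private key      :", private_key.key_size, "bits")            # 256
print("compressed pubkey:", len(compressed_point), "bytes on the wire")  # 33 (x plus one parity byte)
print("raw signature    :", 2 * 256, "bits (r and s, 256 bits each)")
print("DER signature    :", len(der_signature), "bytes (encoding adds a few bytes)")
```

The 512-bit figure in the table is the raw r||s form; most libraries emit DER by default, which carries a small encoding overhead on top of that.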
In order to integrity protect a byte stream one can conceptually either use symmetric cryptography (e.g. an HMAC with SHA-1) or asymmetric cryptography (e.g. digital signature with RSA).
It is common sense that asymmetric cryptography is much more expensive than using symmetric cryptography. However, I would like to have hard numbers and would like to know whether there exist benchmark suites for existing crypto libraries (e.g. openssl) in order to gain some measurement results for symmetric and asymmetric cryptography algorithms.
The numbers I get from the built-in "openssl speed" app can, unfortunately, not be compared to each other.
Perhaps somebody already implemented a small benchmarking suite for this purpose?
Thanks,
Martin
I don't think a benchmark is useful here, because the two things you're comparing are built for different use-cases. An HMAC is designed for situations in which you have a shared secret you can use to authenticate the message, whilst signatures are designed for situations in which you don't have a shared secret, but rather want anyone with your public key be able to verify your signature. There are very few situations in which either primitive would be equally appropriate, and when there is, there's likely to be a clear favorite on security, rather than performance grounds.
It's fairly trivial to demonstrate that an HMAC is going to be faster, however: signing a message requires first hashing it, then computing the signature over the hash, whilst computing an HMAC requires first hashing it, then computing the HMAC (which is merely two additional one-block hash computations). For the same reason, though, for any reasonable assumption as to message length and speed of your cryptographic primitives, the speed difference is going to be negligible, since the largest part of the cost is shared between both operations.
In short, you shouldn't choose the structure of your cryptosystem based on insignificant differences in performance.
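If you still want a number, a throwaway script makes the point. The sketch below (Python, assuming the `cryptography` package; the 8 MiB message size and repeat count are arbitrary choices of mine) times HMAC-SHA256 against RSA-2048 signing over the same long message:

```python
# Throwaway comparison of HMAC vs RSA signing cost on a long message.
# Assumes the "cryptography" package; sizes and repeat counts are arbitrary.
import os
import time

from cryptography.hazmat.primitives import hashes, hmac
from cryptography.hazmat.primitives.asymmetric import padding, rsa

message = os.urandom(8 * 1024 * 1024)  # 8 MiB of data to authenticate
REPEAT = 20

secret = os.urandom(32)
rsa_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

def time_it(label, fn):
    start = time.perf_counter()
    for _ in range(REPEAT):
        fn()
    print(f"{label}: {(time.perf_counter() - start) / REPEAT * 1000:.2f} ms per operation")

def hmac_tag():
    h = hmac.HMAC(secret, hashes.SHA256())
    h.update(message)
    return h.finalize()

def rsa_sign():
    return rsa_key.sign(message, padding.PKCS1v15(), hashes.SHA256())

time_it("HMAC-SHA256  ", hmac_tag)
time_it("RSA-2048 sign", rsa_sign)
```

Both operations have to hash the full 8 MiB; the RSA private-key step adds a roughly constant, millisecond-scale overhead on a typical PC, which is exactly the shared-cost argument above.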
All digital signature algorithms (RSA, DSA, ECDSA...) begin by hashing the source stream with a hash function; only the hash output is used afterwards. So the asymptotic cost of signing a long stream of data is the same as the asymptotic cost of hashing the same stream. HMAC is similar in that respect: first you input in the hash function a small fixed-size header, then the data stream; and you have an extra hash operation at the end which operates on a small fixed-size input. So the asymptotic cost of HMACing a long stream of data is the same as the asymptotic cost of hashing the same stream.
To sum up, for a suitably long data stream, a digital signature and HMAC will have the same CPU cost. Speed difference will not be noticeable (the complex part at the end of a digital signature is more expensive than what HMAC does, but a simple PC will still be able to do it in less than a millisecond).
The hash function itself can make a difference, though, at least if you can obtain the data with high bandwidth. On a typical PC, you can hope to hash data at up to about 300 MB/s with SHA-1, but "only" 150 MB/s with SHA-256. On the other hand, a good mechanical hard disk or gigabit Ethernet will hardly go beyond 100 MB/s read speed, so SHA-256 would not be the bottleneck here.
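Those throughput figures reflect older hardware; CPUs with SHA extensions can be several times faster, so it is worth measuring locally. A minimal measurement using only the standard library (the buffer size is an arbitrary choice):

```python
# Measure local SHA-1 and SHA-256 throughput (rough, single-threaded numbers).
import hashlib
import os
import time

buffer = os.urandom(64 * 1024 * 1024)  # 64 MiB of random data

for name in ("sha1", "sha256"):
    start = time.perf_counter()
    hashlib.new(name, buffer).digest()
    elapsed = time.perf_counter() - start
    print(f"{name}: {len(buffer) / elapsed / 1e6:.0f} MB/s")
```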
I have been searching for a few days now, but I cannot find a big-O notation algorithm for encrypting, decrypting, or attempting to break an encrypted file (brute force) making use of public key encryption. I am attempting to determine the big-O notation of an idea I have developed that makes heavy use of public key encryption.
What are the Big-O complexities of the following, as related to public-key encryption:
A) Encrypt a file made up of N characters with an L length key
B) Decrypt that same file
C) A typical brute force algorithm to break an encrypted file with N characters and with a maximum key length of L
Any included Big-O notations for more efficient algorithms for breaking the encryption would be appreciated. Also, reference to wherever this material can be found.
Sorry to ask a question that I really should be able to find on my own, but I haven't managed to come across what I am looking for.
Standard public/private key algorithms are almost never used on large inputs, as the security properties of these algorithms are generally not suitable for bulk encryption. The most common configuration is to use a public/private key algorithm to encrypt a small (constant-size, usually 128 - 256 bit) key, then use that key for a symmetric encryption algorithm.
That being said, I'll use RSA as a test case for the rest of the questions:
A/B) Setting aside key generation, RSA encrypts and decrypts in O(n) for the size of the message. (Note that all messages must be the size of the key, so smaller messages are padded and larger messages must be broken up.) The exact speed of encryption/decryption depends on the algorithms used by your RSA implementation, but it's polynomial in key size:
http://www.javamex.com/tutorials/cryptography/rsa_key_length.shtml
C) Given a public key, RSA can be cracked by factoring the modulus, which is currently best accomplished using the GNFS, whose running time is roughly exp((7.1 b)^(1/3) (log b)^(2/3)) for a b-bit modulus. I don't believe there's much work on cracking RSA based on the encrypted data alone, as the public key is a much more useful target.
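To make the first point concrete, here is a minimal hybrid-encryption sketch (Python with the `cryptography` package; RSA-OAEP wrapping an AES-256-GCM key, with key sizes and payload chosen arbitrarily by me). Note that only the fixed-size session key ever goes through RSA, so the asymmetric cost stays constant no matter how large the payload is:

```python
# Minimal hybrid-encryption sketch: RSA-OAEP wraps a random AES-256-GCM key,
# and the symmetric key encrypts the (arbitrarily large) payload.
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
plaintext = b"an arbitrarily long message " * 1000
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Sender: symmetric bulk encryption, asymmetric key wrap.
session_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, plaintext, None)
wrapped_key = recipient_key.public_key().encrypt(session_key, oaep)

# Receiver: unwrap the key with RSA, then decrypt the bulk data with AES-GCM.
recovered_key = recipient_key.decrypt(wrapped_key, oaep)
assert AESGCM(recovered_key).decrypt(nonce, ciphertext, None) == plaintext
```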
I am reading http://www.definityhealth.com/marketing/how_ssl_works.html
It looks like SSL uses an asymmetric algorithm to exchange the symmetric key, and after that it uses a symmetric algorithm to encrypt the data.
One question: can I use an asymmetric algorithm only? For example, Alice and Bob both have certificates, and they each use the peer's public key to encrypt the data.
No, you can't use only asymmetric encryption.
TLS (SSL) does not support encryption of application data with public key algorithms because it would make no sense: it would be much less efficient yet provide no improvement to security.
Public key encryption is not harder to break than symmetric algorithms. In fact, for all we know, there may be a trick that makes breaking some asymmetric algorithms trivial, just waiting to be discovered.
Public key algorithms solve the key exchange problem, and that's how TLS and every other security protocol use them. Symmetric algorithms are used to keep data private and protect its integrity.
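For a feel of "much less efficient", here is a rough sketch (Python with the `cryptography` package; 2048-bit RSA with OAEP and an arbitrary 100 KiB payload, all my own choices) that encrypts the same data once with AES-GCM and once by chopping it into RSA-sized blocks:

```python
# Rough illustration of why bulk data is not encrypted with RSA directly.
# Assumes the "cryptography" package; payload size and key size are arbitrary.
import os
import time

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

data = os.urandom(100 * 1024)  # 100 KiB payload
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Symmetric: one AES-256-GCM call for the whole payload.
aes_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
start = time.perf_counter()
aes_ct = AESGCM(aes_key).encrypt(nonce, data, None)
print(f"AES-GCM : {(time.perf_counter() - start) * 1000:.1f} ms, "
      f"{len(aes_ct)} bytes of ciphertext")

# Asymmetric: RSA-2048 with OAEP-SHA256 takes at most 190 bytes per block,
# and every 190-byte block becomes a 256-byte ciphertext.
rsa_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
pub = rsa_key.public_key()
start = time.perf_counter()
rsa_ct = [pub.encrypt(data[i:i + 190], oaep) for i in range(0, len(data), 190)]
enc_time = time.perf_counter() - start

start = time.perf_counter()
recovered = b"".join(rsa_key.decrypt(block, oaep) for block in rsa_ct)
dec_time = time.perf_counter() - start
assert recovered == data

print(f"RSA-OAEP: encrypt {enc_time * 1000:.0f} ms, decrypt {dec_time * 1000:.0f} ms, "
      f"{sum(len(b) for b in rsa_ct)} bytes of ciphertext")
```

On a typical machine the RSA path is orders of magnitude slower (the private-key decryptions dominate) and the ciphertext grows by roughly a third, which is why TLS only ever pushes a small key, not the application data, through the public-key algorithm.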
As a general rule, one can say that asymmetric algorithms are much more computing-intensive than symmetric algorithms. Thus it is very common to use an asymmetric algorithm to exchange a symmetric key that will then be used to protect the data. This is also considered sufficiently safe, security-wise.
Can you use asymmetric algorithms for everything? Surely you can.
Can you do it within SSL? Not as the protocol stands. Yes, if you provide your own implementation for SSL, as this is not the original SSL design. (BTW, use TLS - it is very similar but more secure.)
A symmetric key uses the same key to encrypt and decrypt the data. The biggest issue with it is getting that key to the receiver securely, which is why asymmetric keys, with their private/public key pairs, are encouraged.
Symmetric keys are generally used to encrypt large amounts of data, which is faster; the symmetric key itself is then sent to the receiver protected by an asymmetric algorithm.