How to obfuscate an iPhone app before publishing it to iTunes - objective-c

I am using cryptography in my application. Some encryption and decryption routines and salt values are hard coded, and the IVs are generated using those routines. Before I publish the app I need to obfuscate my code. I am using Xcode 6.2 with an up-to-date OS. Please share techniques for how to do this.

Neither "decryption techniques " nor iv need to be kept secret, only the encryption key. Good security requires using proven methods and cryptographic primitives not secret methods. Developer created cryptographic methods are usually insecure and lack peer review for flaws, the standard methods have been well researched and vetted for security flaws.
Shipping with the key embedded in the code is a problem, the key should be randomly generated (or in some similar process) at first-run and stored in the Keychain.
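As a rough sketch of that approach (not a drop-in implementation): on first run, generate random key bytes with the system CSPRNG and store them as a generic-password item in the Keychain, then read them back on later runs. The service and account strings below are placeholders and error handling is abbreviated.

    // Sketch: create-or-fetch a random 256-bit key in the iOS Keychain.
    // "com.example.myapp" and "encryption-key" are placeholder identifiers.
    #include <Security/Security.h>
    #include <CoreFoundation/CoreFoundation.h>
    #include <stdint.h>

    static CFDataRef CopyOrCreateAppKey(void) {
        CFStringRef service = CFSTR("com.example.myapp");
        CFStringRef account = CFSTR("encryption-key");

        // Look for a key stored on a previous run.
        const void *qKeys[]   = { kSecClass, kSecAttrService, kSecAttrAccount, kSecReturnData };
        const void *qValues[] = { kSecClassGenericPassword, service, account, kCFBooleanTrue };
        CFDictionaryRef query = CFDictionaryCreate(NULL, qKeys, qValues, 4,
                                                   &kCFTypeDictionaryKeyCallBacks,
                                                   &kCFTypeDictionaryValueCallBacks);
        CFTypeRef existing = NULL;
        if (SecItemCopyMatching(query, &existing) == errSecSuccess) {
            CFRelease(query);
            return (CFDataRef)existing;
        }

        // First run: generate 32 random bytes with the system CSPRNG.
        uint8_t bytes[32];
        if (SecRandomCopyBytes(kSecRandomDefault, sizeof(bytes), bytes) != errSecSuccess) {
            CFRelease(query);
            return NULL;
        }
        CFDataRef key = CFDataCreate(NULL, bytes, sizeof(bytes));

        // Persist it in the Keychain for subsequent runs.
        const void *aKeys[]   = { kSecClass, kSecAttrService, kSecAttrAccount, kSecValueData };
        const void *aValues[] = { kSecClassGenericPassword, service, account, key };
        CFDictionaryRef add = CFDictionaryCreate(NULL, aKeys, aValues, 4,
                                                 &kCFTypeDictionaryKeyCallBacks,
                                                 &kCFTypeDictionaryValueCallBacks);
        SecItemAdd(add, NULL);

        CFRelease(add);
        CFRelease(query);
        return key;
    }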

Related

Insufficient Transport Layer Protection iOS

We are developing an application in react-native. Our security team has raised a vulnerability in iOS binary. The description of the vulnerability is given below.
The application has references to potentially risky symbols that
modify the default SSL certificate validation.
The behavior of iOS's TLS/SSL libraries' certificate validation is
intended to be secure by default (CFNetwork, Foundation, etc). These
libraries validate a number of security items such as certificate
expiration dates and known root certificates.
This application was found to reference symbols that can be used to
modify the default validation behavior for TLS/SSL certificates. When
the default validation is modified the potential exists to
inadvertently weaken the security of the TLS/SSL protocol and thereby
making the traffic more susceptible to interception or modification by
attackers. The binary contains references to methods that may result
in vulnerable SSL connections:
_kCFStreamSSLValidatesCertificateChain
We have not implemented SSL pinning in the app. We are not sure how to fix this issue, and the security team has not been able to provide any additional information. They use an automated tool that identified the issue and provides only the following recommendation, which we did not find very helpful in remediating the vulnerability.
Review the application's and third party library's code for the use of
these symbols to ensure they are not being used to weaken the TLS/SSL
security. Some of the symbols detected may be deprecated or involve
non-public APIs so a review of the code is encouraged for compliance
with Apple's App Store policies.
Please note that the presence of these symbols does not indicate
insecure usage as they may be used to check for or explicitly set
default values. These symbols may also be found in code designed to
perform non-standard certificate validation, such as custom SSL
pinning implementations.
Any help provided by the community in identifying and fixing this issue would be really appreciated.
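For reference, the kind of code such a scanner is worried about looks roughly like the sketch below: CFStream-based networking that sets kCFStreamSSLValidatesCertificateChain to false, which disables certificate-chain validation. If neither your own code nor any third-party library does something like this, the finding is likely triggered by the mere presence of the symbol in a linked framework (the tool's own note acknowledges that possibility), and you can explain it to the security team as such.

    // Illustration only - this is the risky pattern the scanner flags, not
    // something to add. Setting kCFStreamSSLValidatesCertificateChain to
    // false turns off chain validation for a CFStream-based connection.
    #include <CFNetwork/CFNetwork.h>

    static void DisableChainValidation(CFReadStreamRef readStream) {
        const void *keys[]   = { kCFStreamSSLValidatesCertificateChain };
        const void *values[] = { kCFBooleanFalse };
        CFDictionaryRef sslSettings = CFDictionaryCreate(NULL, keys, values, 1,
                                                         &kCFTypeDictionaryKeyCallBacks,
                                                         &kCFTypeDictionaryValueCallBacks);
        CFReadStreamSetProperty(readStream, kCFStreamPropertySSLSettings, sslSettings);
        CFRelease(sslSettings);
    }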

Signing pdf signature field with hashed cert in itextsharp

I have a hash of a certificate, obtained using the example at http://techblog.bozho.net/?p=37, and wish to use it to fill in a signature field in a PDF form with iTextSharp. Does anyone have any examples or know how? This is a web app, and this is the only method I could find for capturing a signature from a smart card local to the client.
Ken
You can't use a hash of the certificate for signing. Signing is performed using the private key (and not its hash, either).
For a web application you have two options: either transfer the whole document to the client, or use some distributed signing mechanism that involves a client-side module (a browser plugin / Java applet or a standalone application).
Our company developed a distributed cryptography add-on to SecureBlackbox, which is described in detail in this SO answer and which does what you need. The scheme in that answer explains how distributed signing would work.

Adding OpenSSL into existing app

I'm adding SSL support (currently pushing forward with OpenSSL) to an existing application. I've never done anything with cryptology before and after reading numerous articles and watching videos I'm still a little confused as to how I can implement it for my specific situation.
Our software is client/server, and the end-user purchases both and will install it on their premises.
My first bit of confusion is regarding certificates and private keys, and how to manage them. Should I have one certificate that gets installed along with the app? Should each end-user have their own certificate generated? What about private keys? Do I bake the private key into the server binary? Or should there be a file with the private key?
I'm sure this is a solved problem, but I'm not quite sure where to look, or what to search for.
Thanks for any help and advice.
Adding OpenSSL into existing app
If all you need is an example of an SSL/TLS client, have a look at the OpenSSL wiki and its TLS Client example.
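Condensed, that wiki example boils down to something like the sketch below (assuming an OpenSSL 1.0.x-era build; the host name and CA file are placeholders, and hostname checking is omitted for brevity):

    /* Minimal TLS client sketch following the OpenSSL wiki's TLS Client example. */
    #include <stdio.h>
    #include <openssl/ssl.h>
    #include <openssl/err.h>
    #include <openssl/bio.h>

    int main(void) {
        SSL_library_init();
        SSL_load_error_strings();

        SSL_CTX *ctx = SSL_CTX_new(SSLv23_client_method());
        SSL_CTX_set_options(ctx, SSL_OP_NO_SSLv2 | SSL_OP_NO_SSLv3);   /* TLS only */
        SSL_CTX_set_verify(ctx, SSL_VERIFY_PEER, NULL);                /* verify the server's chain */
        SSL_CTX_load_verify_locations(ctx, "root-ca.pem", NULL);       /* the root you distribute */

        BIO *bio = BIO_new_ssl_connect(ctx);
        SSL *ssl = NULL;
        BIO_get_ssl(bio, &ssl);
        SSL_set_mode(ssl, SSL_MODE_AUTO_RETRY);

        BIO_set_conn_hostname(bio, "example.com:443");                 /* placeholder host */
        if (BIO_do_connect(bio) <= 0 || BIO_do_handshake(bio) <= 0) {
            ERR_print_errors_fp(stderr);
            return 1;
        }

        /* ... exchange application data with BIO_write() / BIO_read() ... */

        BIO_free_all(bio);
        SSL_CTX_free(ctx);
        return 0;
    }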
My first bit of confusion is regarding certificates and private keys, and how to manage these.
Yes, key management and distribution is the hardest problem in crypto.
Public CAs have legally binding documents covering these practices, called Certification Practice Statements (CPS). You can have a lot of fun with them, because in them the company's lawyers tell you what you don't want to hear (or what the marketing department refuses to tell you).
For example, here's an excerpt from Apple Inc. Certification Authority Certification Practice Statement:
2.4.2. CA disclaimers of warranties
To the extent permitted by applicable law, Subscriber agreements,
if applicable, disclaim warranties from Apple, including any
warranty of merchantability or fitness for a particular purpose.
2.4.3. CA limitations of liability
To the extent permitted by applicable law, Subscriber agreements,
if applicable, shall limit liability on the part of Apple and shall
exclude liability for indirect, special, incidental, and
consequential damages.
So, Apple is selling you a product with no warranty and accepting no liability!!! And they want you to trust them and give them money... what a racket! And it's not just Apple - other CAs have equally obscene CPSes.
Should I have one certificate that gets installed along with the app?
It depends. If you are running your own PKI (i.e., you are the CA and control the root certificate), then distribute your root X509 certificate with your application and nothing else. There's no need to trust any other CAs, like Verisign or Startcom.
If you are using someone else's PKI (or the Internet's PKI specified in RFC 5280), then distribute only the root X509 certificate needed to validate the chain. In this case, you will distribute one CA's root X509 certificate for validation. Be aware, however, that you could end up trusting just about any certificate signed by that particular CA (and that's likely to be in the tens of thousands if you are not careful).
If you don't know in advance, then you have to do as the browsers do and pick a bunch of CAs to trust, carrying their root certificates around with your application. You can grab a list of them from Mozilla, for example. Be aware, however, that you could end up trusting just about any certificate signed by any of those CAs (and that's likely to be in the tens of millions if you are not careful).
There's a lot more to using public CAs like browsers, and you should read through Peter Gutmann's Engineering Security. Pay particular attention to the Security Diversification strategies.
When the client connects to your server, your server should send its X509 certificate (the leaf certificate) and any intermediate certificates required to build a valid chain back to the root certificate you distribute.
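On the server side with OpenSSL, that usually amounts to loading a chain file (leaf certificate first, followed by the intermediates) plus the matching private key. A minimal sketch, with placeholder file names:

    #include <openssl/ssl.h>

    /* server.pem: leaf certificate followed by any intermediate certificates. */
    SSL_CTX *make_server_ctx(void) {
        SSL_CTX *ctx = SSL_CTX_new(SSLv23_server_method());
        SSL_CTX_use_certificate_chain_file(ctx, "server.pem");
        SSL_CTX_use_PrivateKey_file(ctx, "server.key", SSL_FILETYPE_PEM);
        if (!SSL_CTX_check_private_key(ctx)) {     /* key must match the leaf certificate */
            SSL_CTX_free(ctx);
            return NULL;
        }
        return ctx;
    }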
Finally, you can get free SSL/TLS certificates trusted by most major browsers (including mobile) from Eddy Nigg at Startcom. He charges for the revocation (if needed) because that's where the cost lies. Other CAs charge you up front and pocket the proceeds if not needed.
Should each end-user have their own certificate generated?
That is possible, too. That's called client certificates, or client authentication. Ideally, you would be running your own PKI because (1) you control everything (including the CA operations) and don't need to trust anyone outside the organization; and (2) it can get expensive to have a commercial CA sign every user's certificate.
If you don't want to use client-side certificates, look into PSK (Pre-Shared Keys) and SRP (Secure Remote Password). Both beat the snot out of classic X509 with RSA key transport, because they provide mutual authentication and channel binding. In these systems, both the client and server know the secret or password and the channel is set up; if one (or both) does not know it, channel setup fails. The plaintext username and password are never put on the wire as they are in RSA key transport and basic_auth schemes. (I prefer SRP because it's based on Diffie-Hellman, and I have implemented it in a few systems.)
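As an idea of what SRP looks like in practice, here is a rough OpenSSL (1.0.1 or later) client-side sketch; the username and password are placeholders, and the server must hold a matching SRP verifier file rather than the password itself:

    #include <openssl/ssl.h>

    /* Configure a client context for TLS-SRP; no certificates involved. */
    SSL_CTX *make_srp_client_ctx(char *user, char *pass) {
        SSL_CTX *ctx = SSL_CTX_new(SSLv23_client_method());
        SSL_CTX_set_srp_username(ctx, user);
        SSL_CTX_set_srp_password(ctx, pass);
        SSL_CTX_set_cipher_list(ctx, "SRP");   /* negotiate only SRP ciphersuites */
        return ctx;
    }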
What about private keys?
Yes, you need to manage the private keys associated with the certificates. You can (1) store them in the filesystem with permissions or ACLs; (2) store them in a keystore or keychain, as on Android, Mac OS X, iOS, or Windows; (3) store them in a Hardware Security Module (HSM); or (4) store them remotely and keep them online using the Key Management Interoperability Protocol (KMIP).
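For option (1), the filesystem route, the essential point is owner-only permissions on the key file. A POSIX sketch (the path is a placeholder):

    #include <fcntl.h>
    #include <string.h>
    #include <unistd.h>

    /* Write a PEM-encoded private key to disk, readable/writable by the owner only. */
    int save_private_key(const char *pem, const char *path) {
        int fd = open(path, O_WRONLY | O_CREAT | O_TRUNC, 0600);
        if (fd < 0)
            return -1;
        ssize_t n = write(fd, pem, strlen(pem));
        close(fd);
        return (n == (ssize_t)strlen(pem)) ? 0 : -1;
    }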
Note: unattended key storage on a server is a problem without a solution. See, for example, Peter Gutmann's Engineering Security, page 368 under "Wicked Hard Problems" and "Problems without Solutions".
Do I bake the private key into the server binary?
No. You generate them when needed and then store them with the best protection you can provide.
Or should there be a file with the private key?
Yes, something like that. See above.
I'm sure this is a solved problem, but I'm not quite sure where to look, or what to search for.
I'm not sure I would really call it solved because of the key distribution problem.
And some implementations are just really bad, so you would likely wonder how the code passed for production.
The first thing you probably want (since you're focusing on key management) is a treatment of "key management" and "key hierarchies".
You might also want some reference material. From the security engineering point of view, read Gutmann's Engineering Security and Ross Anderson's Security Engineering. From an implementation standpoint, grab a copy of Network Security with OpenSSL and SSL and TLS: Designing and Building Secure Systems.

Does using SSL mean you have to say your app uses Cryptography?

I am almost ready to submit a Windows 8 Store app to the store. As part of this process you must answer the question:
Does your app call, support, contain, or use cryptography or encryption?
It goes on to mention these possibilities:
Any use of a digital signature, such as authentication or integrity checking
Encryption of any data or files that your app uses or accesses
Key management, certificate management, or anything that interacts with a public key infrastructure
Using a secure communication channel such as NTLM, Kerberos, Secure Sockets Layer (SSL), or Transport Layer Security (TLS)
Encrypting passwords or other forms of information security
Copy protection or digital rights management (DRM)
Antivirus protection
(emphasis mine.) There are some exemptions:
Password encryption
Copy protection
Authentication
Digital rights management
Using digital signatures
My app was originally a Windows Phone app with limited ability to store or export data locally, so we have functionality to backup to or restore from SkyDrive. (For the purposes of this question the fact that SkyDrive may soon change its name is not relevant.) We put this same capability into the Windows Store app. The connection to SkyDrive is https - in other words we are using SSL.
Does this mean I need an Export Commodity Classification Number (ECCN)? Really?
From this page, Understanding export restrictions on cryptography, it looks like the answer is yes, SSL counts unless you are not transporting content over the wire. But I'm not a lawyer.
Does your app call, support, contain, or use cryptography or encryption?
This question helps you determine if your app uses a type of cryptography that is governed by the Export Administration Regulations. The question includes the examples shown in the list here; but remember that this list doesn't include every possible application of cryptography.
Important When you answer this question, consider not only the code you wrote for your app, but also all the software libraries, utilities and operating system components that your app includes or links to.
Any use of a digital signature, such as authentication or integrity checking
Encryption of any data or files that your app uses or accesses
Key management, certificate management, or anything that interacts with a public key infrastructure
Using a secure communication channel such as NTLM, Kerberos, Secure Sockets Layer (SSL), or Transport Layer Security (TLS)
Encrypting passwords or other forms of information security
Copy protection or digital rights management (DRM)
Antivirus protection
For the complete and current list of cryptographic applications, see EAR Controls for Items That Use Encryption.
Is the cryptography or encryption limited to one or more of the tasks listed here?
If you answered yes to the first question, then the second question lists some of the applications of cryptography that are not restricted. Here are the unrestricted tasks:
Password encryption
Copy protection
Authentication
Digital rights management
Using digital signatures
If your app calls, supports, contains, or uses cryptography or encryption for any task that is not in this list then your answer to this question is No.

How to protect API Key in Flex/AIR from decompiling?

No obfuscation please, and the simpler the better.
A similar post is Shared secret with API in an Ajax Adobe AIR app, but I was not convinced that those approaches protect against decompiling. If they do, please explain (for example, what's stopping someone from decompiling the app and using the URLLoader themselves?).
If the key is in your code, there is nothing that can ever stop anyone from decompiling your app and getting it.
Also, if the key is sent unencrypted from the AIR app to the server, it is a piece of cake to monitor the network traffic and retrieve the key from there. So even if you protect the key by storing it encrypted, you're pretty much screwed.
If you want to protect it, you have to send your calls through a proxy server that you control and keep the key there.