AES encryption and decryption resulting in file different than original - wcf

I've decided to implement encryption for file transfers in my service. File transfers prior to this were not encrypted, and they were sent and received flawlessly with the exact same number of bytes.
Now I've introduced asymmetric and symmetric encryption into the mix to encrypt the data as it passes over TCP. I use the asymmetric keys for an initial handshake, passing the symmetric key to the other party encrypted with the asymmetric public key. From then on, the receiver of the file calls the sender periodically; the sender generates a new initialization vector, encrypts the data with the symmetric key, and sends it over to be decrypted by the receiver using that IV and the same symmetric key.
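To make the flow concrete, the handshake amounts to something like this sketch (not my actual service code; receiverPublicKeyParameters is a placeholder for the RSA public key received from the other party):
// Sketch of the handshake: a fresh AES key is generated and sent to the
// other party encrypted with their RSA public key.
byte[] sessionKey;
using (var aes = new AesManaged())
{
    aes.GenerateKey();
    sessionKey = aes.Key;
}
byte[] encryptedSessionKey;
using (var rsa = new RSACryptoServiceProvider())
{
    rsa.ImportParameters(receiverPublicKeyParameters); // hypothetical RSAParameters from the other party
    encryptedSessionKey = rsa.Encrypt(sessionKey, true); // OAEP padding
}
// encryptedSessionKey goes over the wire; the receiver decrypts it with its private key
// and both sides use sessionKey as the shared symmetric key from then on.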
The chunk size I'm using is 2 MB, so every generated chunk except the last (which varies) is 2,097,152 bytes. When AES encrypts such a chunk with PaddingMode.PKCS7 and CipherMode.CBC, the resulting byte size is 2,097,168: exactly 16 bytes are gained during the encryption process.
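That growth is just the PKCS7 padding: the ciphertext is always rounded up to the next full 16-byte block, and because 2,097,152 is already an exact multiple of 16, a whole extra padding block is appended. A quick sanity check of the arithmetic (just a sketch, not part of the transfer code):
// Expected AES/CBC ciphertext length with PKCS7 padding for a 16-byte block size.
static long ExpectedCipherTextLength(long plainTextLength, int blockSize = 16)
{
    return plainTextLength + (blockSize - plainTextLength % blockSize);
}
// ExpectedCipherTextLength(2097152) == 2097168, i.e. exactly one full padding block is added.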
Now initially I thought this is where my problem was, but when I decrypt the data on the receiving end, it goes back to the 2097152 byte length and I write it to the file. I've proven to myself that it does indeed encrypt and decrypt the data.
On a small enough file, the received file seems to be exactly the same size as the original. However, as I step up to larger file sizes, a discrepancy appears. For a video file (Wildlife.wmv from the Windows 7 install) of 26,246,026 bytes, the finished transfer instead comes out to 26,246,218 bytes.
Why is there this size difference? What am I doing wrong here?
Here's some of my code.
For my encryption I am using the following class to encrypt or decrypt, returning a result in the form of a byte array.
public class AesCryptor
{
    // Encrypts the given data with AES in CBC mode using the supplied key and IV.
    public byte[] Encrypt(byte[] data, byte[] key, byte[] iv)
    {
        using (SymmetricAlgorithm aes = new AesManaged())
        {
            aes.Key = key;
            aes.IV = iv;
            aes.Padding = PaddingMode.PKCS7;
            aes.Mode = CipherMode.CBC;
            using (ICryptoTransform encryptor = aes.CreateEncryptor(key, iv))
            {
                return Crypt(data, key, iv, encryptor);
            }
        }
    }

    // Decrypts data that was produced by Encrypt with the same key and IV.
    public byte[] Decrypt(byte[] data, byte[] key, byte[] iv)
    {
        using (SymmetricAlgorithm aes = new AesManaged())
        {
            aes.Key = key;
            aes.IV = iv;
            aes.Padding = PaddingMode.PKCS7;
            aes.Mode = CipherMode.CBC;
            using (ICryptoTransform decryptor = aes.CreateDecryptor(key, iv))
            {
                return Crypt(data, key, iv, decryptor);
            }
        }
    }

    // Runs the data through the supplied transform and returns the result.
    private byte[] Crypt(byte[] data, byte[] key, byte[] iv, ICryptoTransform transform)
    {
        using (MemoryStream memoryStream = new MemoryStream())
        {
            using (CryptoStream cryptoStream = new CryptoStream(memoryStream, transform, CryptoStreamMode.Write))
            {
                cryptoStream.Write(data, 0, data.Length);
                cryptoStream.FlushFinalBlock();
            }
            return memoryStream.ToArray();
        }
    }
}
The sender of the file encrypts the data (after the handshake that exchanges the private symmetric key) with this code (plus a lot more that doesn't pertain to the actual encryption process). Note the chunkedFile.NextChunk() call: it invokes a method on the class doing the file chunking for me, returning 2 MB chunks unless the remaining data is smaller.
byte[] buffer;
byte[] iv = new byte[symmetricEncryptionBitSize / 8];
using (var rngCrypto = new RNGCryptoServiceProvider())
    rngCrypto.GetBytes(iv);
AesCryptor cryptor = new AesCryptor();
buffer = cryptor.Encrypt(chunkedFile.NextChunk(), symmetricPrivateKey, iv);
The code below is what the receiver of the file uses (not all of it, just the part that pertains to decrypting the data). The data is written to a file stream (writer).
FileMessage message = hostChannel.ReceiveFile();
moreChunks = message.FileMetaData.MoreChunks;
UpdateTotalBytesTransferred(message);
writer.BaseStream.Position = filePosition;
byte[] decryptedStream;
// Copy the message stream out to a memory stream so we can work on it afterwards.
using (var memoryStream = new MemoryStream())
{
    message.ChunkData.CopyTo(memoryStream);
    decryptedStream = cryptor.Decrypt(memoryStream.ToArray(), symmetricPrivateKey, message.FileMetaData.InitializationVector);
}
writer.Write(decryptedStream);
By the way, in case it is needed, NextChunk is a very simple method.
public byte[] NextChunk()
{
    if (MoreChunks) // If there are more chunks, proceed with the next chunking operation; otherwise throw an exception.
    {
        byte[] buffer;
        using (BinaryReader reader = new BinaryReader(File.OpenRead(FilePath)))
        {
            reader.BaseStream.Position = CurrentPosition;
            buffer = reader.ReadBytes((int)MaximumChunkSize);
        }
        CurrentPosition += buffer.LongLength; // Sets the stream position to be used for the next call.
        return buffer;
    }
    else
        throw new InvalidOperationException("The last chunk of the file has already been returned.");
}
EDIT: It seems that for every chunk transferred, and thus every encryption, I am gaining 16 bytes in file size. This does not happen with very small files.

Well I solved the issue.
It turns out I was sending, in the message data, the chunk length of the encrypted chunk data. So for every chunk I sent, even though I decrypted and wrote the correct file data, I was advancing the stream position by the length of the encrypted data. That means every decryption, when transferring more than one chunk (which is why small single-chunk files didn't have the problem), added 16 bytes to the file size.
The people helping me probably couldn't have figured this out, because I didn't include enough of the client-side or server-side code to show it. But thankfully I managed to answer it myself.
On the sender side, I was creating my FileMessage like this.
FileMessage message = new FileMessage();
message.FileMetaData = new FileMetaData(chunkedFile.MoreChunks, chunkedFile.ChunkLength, chunkedFile.CurrentPosition, iv);
message.ChunkData = new MemoryStream(buffer);
If you look at the second parameter of the FileMetaData constructor, I'm passing in chunkedFile.ChunkLength, which is supposed to be the length of the chunk. I was taking this from the encrypted chunk data, which resulted in sending the incorrect chunk length.
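In other words, the sender needs to report the length of the plaintext chunk, along these lines (a sketch of the fix; plainChunk is just a local name I'm using here):
byte[] plainChunk = chunkedFile.NextChunk();
buffer = cryptor.Encrypt(plainChunk, symmetricPrivateKey, iv);
FileMessage message = new FileMessage();
// Report the plaintext length, not the padded ciphertext length.
message.FileMetaData = new FileMetaData(chunkedFile.MoreChunks, plainChunk.Length, chunkedFile.CurrentPosition, iv);
message.ChunkData = new MemoryStream(buffer);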
The client, on the other hand, was receiving this incorrect value. If you look near the end, you'll see the line filePosition += message.FileMetaData.ChunkLength;. I was using that erroneous chunk length to advance the file position. It turns out that setting the stream position wasn't even necessary.
using (BinaryWriter writer = new BinaryWriter(File.OpenWrite(fileWritePath)))
{
    writer.BaseStream.SetLength(0);
    while (moreChunks)
    {
        FileMessage message = hostChannel.ReceiveFile();
        moreChunks = message.FileMetaData.MoreChunks;
        UpdateTotalBytesTransferred(message);
        writer.BaseStream.Position = filePosition;
        byte[] decryptedStream;
        // Copy the message stream out to a memory stream so we can work on it afterwards.
        using (var memoryStream = new MemoryStream())
        {
            message.ChunkData.CopyTo(memoryStream);
            Debug.WriteLine("Received Encrypted buffer Length: " + memoryStream.Length);
            decryptedStream = cryptor.Decrypt(memoryStream.ToArray(), symmetricPrivateKey, message.FileMetaData.InitializationVector);
            Debug.WriteLine("Received Decrypted buffer Length: " + decryptedStream.Length);
        }
        writer.Write(decryptedStream);
        TotalBytesTransferred = message.FileMetaData.FilePosition;
        filePosition += message.FileMetaData.ChunkLength;
    }
    OnTransferComplete(this, EventArgs.Empty);
    StopSession();
}
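Alternatively, on the receiving side the position bookkeeping can simply follow the decrypted data (or be dropped entirely, since it turned out to be unnecessary); a minimal sketch of that variant:
writer.Write(decryptedStream);
TotalBytesTransferred = message.FileMetaData.FilePosition;
filePosition += decryptedStream.Length; // advance by the plaintext length, not the encrypted chunk length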
Such a simple bug, but one that wasn't leaping out at me quickly at all.

Related

itextsharp signing pdf with signed hash

I'm trying to sign a pdf through a signing service. This service requires to send a hex encoded SHA256 digest and in return I receive a hex encoded signatureValue. Besides that I also receive a signing certificate, intermediate certificate, OCSP response, and TimeStampToken. However, I already get stuck trying to sign the pdf with the signatureValue.
I have read Bruno's white paper, browsed the internet excessively, and tried many different ways, but the signature keeps coming up as invalid.
My latest attempt:
First, prepare pdf
PdfReader reader = new PdfReader(src);
FileStream os = new FileStream(dest, FileMode.Create);
PdfStamper stamper = PdfStamper.CreateSignature(reader, os, '\0');
PdfSignatureAppearance appearance = stamper.SignatureAppearance;
appearance.Certificate = signingCertificate;
IExternalSignatureContainer external = new ExternalBlankSignatureContainer(PdfName.ADOBE_PPKLITE, PdfName.ADBE_PKCS7_DETACHED);
MakeSignature.SignExternalContainer(appearance, external, 8192);
string hashAlgorithm = "SHA-256";
PdfPKCS7 sgn = new PdfPKCS7(null, chain, hashAlgorithm, false);
PdfSignatureAppearance appearance2 = stamper.SignatureAppearance;
Stream stream = appearance2.GetRangeStream();
byte[] hash = DigestAlgorithms.Digest(stream, hashAlgorithm);
byte[] sh = sgn.getAuthenticatedAttributeBytes(hash, null, null, CryptoStandard.CMS);
Hash byte[] sh and convert to string as follows
private static String sha256_hash(Byte[] value)
{
    using (SHA256 hash = SHA256.Create())
    {
        return String.Concat(hash.ComputeHash(value).Select(item => item.ToString("x2"))).ToUpper();
    }
}
and send it to the signing service. The received hex encoded signatureValue I then convert to bytes:
private static byte[] StringToByteArray(string hex)
{
    return Enumerable.Range(0, hex.Length).Where(x => x % 2 == 0).Select(x => Convert.ToByte(hex.Substring(x, 2), 16)).ToArray();
}
Finally, create signature
private void CreateSignature(string src, string dest, byte[] sig)
{
    PdfReader reader = new PdfReader(src); // src is now the prepared pdf
    FileStream os = new FileStream(dest, FileMode.Create);
    IExternalSignatureContainer external = new MyExternalSignatureContainer(sig);
    MakeSignature.SignDeferred(reader, "Signature1", os, external);
    reader.Close();
    os.Close();
}

private class MyExternalSignatureContainer : IExternalSignatureContainer
{
    protected byte[] sig;

    public MyExternalSignatureContainer(byte[] sig)
    {
        this.sig = sig;
    }

    public byte[] Sign(Stream s)
    {
        return sig;
    }

    public void ModifySigningDictionary(PdfDictionary signDic) { }
}
What am I doing wrong? Help is very much appreciated. Thanks!
Edit: Current state
Thanks to help from mkl and following Bruno's deferred signing example I've gotten past the invalid signature message. Apparently I don't receive a full chain from the signing service, but just an intermediate certificate, which caused the invalid message. Unfortunately, the signature still has flaws.
I build the chain like this:
List<X509Certificate> certificateChain = new List<X509Certificate>
{
    signingCertificate,
    intermediateCertificate
};
In the sign method of MyExternalSignatureContainer I now construct and return the signature container:
public byte[] Sign(Stream s)
{
    string hashAlgorithm = "SHA-256";
    PdfPKCS7 sgn = new PdfPKCS7(null, chain, hashAlgorithm, false);
    byte[] ocspResponse = Convert.FromBase64String("Base64 encoded DER representation of the OCSP response received from signing service");
    byte[] hash = DigestAlgorithms.Digest(s, hashAlgorithm);
    byte[] sh = sgn.getAuthenticatedAttributeBytes(hash, ocspResponse, null, CryptoStandard.CMS);
    string messageDigest = Sha256_hash(sh);
    // messageDigest sent to signing service
    byte[] signatureAsByte = StringToByteArray("Hex encoded SignatureValue received from signing service");
    sgn.SetExternalDigest(signatureAsByte, null, "RSA");
    ITSAClient tsaClient = new MyITSAClient();
    return sgn.GetEncodedPKCS7(hash, tsaClient, ocspResponse, null, CryptoStandard.CMS);
}

public class MyITSAClient : ITSAClient
{
    public int GetTokenSizeEstimate()
    {
        return 0;
    }

    public IDigest GetMessageDigest()
    {
        return new Sha256Digest();
    }

    public byte[] GetTimeStampToken(byte[] imprint)
    {
        string hashedImprint = HexEncode(imprint);
        // Hex encoded Imprint sent to signing service
        return Convert.FromBase64String("Base64 encoded DER representation of TimeStampToken received from signing service");
    }
}
I still get these messages:
"The signer's identity is unknown because it has not been included in the list of trusted identities and none of its parent certificates are trusted identities"
"The signature is timestamped, but the timestamp could not be verified"
Further help is very much appreciated again!
"What am I doing wrong?"
The problem is that on one hand you start constructing a CMS signature container using a PdfPKCS7 instance
PdfPKCS7 sgn = new PdfPKCS7(null, chain, hashAlgorithm, false);
and for the calculated document digest hash retrieve the signed attributes to be
byte[] sh = sgn.getAuthenticatedAttributeBytes(hash, null, null, CryptoStandard.CMS);
to send them for signing.
So far so good.
But then you ignore the CMS container you started constructing and instead inject the naked signature bytes you got from your service into the PDF.
This cannot work as your signature bytes don't sign the document directly but instead they sign these signed attributes (and, therefore, indirectly the document as the document hash is one of the signed attributes). Thus, by ignoring the CMS container under construction you dropped the actually signed data...
Furthermore, the subfilter ADBE_PKCS7_DETACHED you use promises that the embedded signature is a full CMS signature container, not a few naked signature bytes, so the format also is wrong.
How to do it instead?
Instead of injecting the naked signature bytes you got from your service into the PDF as is, you have to set them as external digest in the PdfPKCS7 instance in which you originally started constructing the signature container:
sgn.SetExternalDigest(sig, null, ENCRYPTION_ALGO);
(ENCRYPTION_ALGO must be the encryption part of the signature algorithm, I assume in your case "RSA".)
and then you can retrieve the generated CMS signature container:
byte[] encodedSig = sgn.GetEncodedPKCS7(hash, null, null, null, CryptoStandard.CMS);
Now this is the signature container to inject into the document using MyExternalSignatureContainer:
IExternalSignatureContainer external = new MyExternalSignatureContainer(encodedSig);
MakeSignature.SignDeferred(reader, "Signature1", os, external);
Remaining issues
Having corrected your code Adobe Reader still warns about your signatures:
"The signer's identity is unknown because it has not been included in the list of trusted identities and none or its parent certificates are trusted identities"
This warning is to be expected and correct!
The signer's identity is unknown because your signature service uses merely a demo certificate, not a certificate for production use:
As the certificate details show, the certificate is issued by "GlobalSign Non-Public HVCA Demo", and non-public demo issuers for obvious reasons must not be trusted (unless you manually add them to your trust store for testing purposes).
"The signature is timestamped, but the timestamp could not be verified"
There are two reasons why Adobe does not approve of your timestamp:
On one hand, just like above, the timestamp certificate is a non-public, demo certificate ("DSS Non-Public Demo TSA Responder"). Thus, there is no reason for the verifier to trust your timestamp.
On the other hand, though, there is an actual error in your timestamping code: you apply the hashing algorithm twice! In your MyITSAClient class you have
public byte[] GetTimeStampToken(byte[] imprint)
{
    string hashedImprint = Sha256_hash(imprint);
    // hashedImprint sent to signing service
    return Convert.FromBase64String("Base64 encoded DER representation of TimeStampToken received from signing service");
}
The imprint parameter of your GetTimeStampToken implementation is already hashed, so you only have to hex encode these bytes and send them for timestamping. But you apply your method Sha256_hash, which hashes them again and then hex encodes that new hash.
Thus, instead of applying Sha256_hash, merely hex encode the imprint!
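In other words, the implementation should look roughly like this (a sketch matching the corrected code shown earlier in the question; HexEncode stands for whatever hex-encoding helper is used, and the service call itself is only indicated by the comment):
public byte[] GetTimeStampToken(byte[] imprint)
{
    // imprint is already the digest to be timestamped; just hex encode it, do not hash it again.
    string hexImprint = HexEncode(imprint);
    // hexImprint is sent to the signing service, which returns the DER encoded timestamp token.
    return Convert.FromBase64String("Base64 encoded DER representation of TimeStampToken received from signing service");
}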

In Itext 7, how to sign a pdf with 2 steps?

Following the answers given in this previous question: In Itext 7, how to get the range stream to sign a pdf?, I've tried to reimplement the two-step signing method that worked in iText 5, but I run into an issue when trying to reopen the document produced by the first step (with PdfReader or a PDF viewer): the document is invalid.
Here is the presigning part for a document that already contains an empty signature field named certification. Why is the result of this step invalid?
PdfReader reader = new PdfReader(fis);
Path signfile = Files.createTempFile("sign", ".pdf");
FileOutputStream os = new FileOutputStream(signfile.toFile());
PdfSigner signer = new PdfSigner(reader, os, false);
signer.setFieldName("certification"); // this field already exists
signer.setCertificationLevel(PdfSigner.CERTIFIED_FORM_FILLING);
PdfSignatureAppearance sap = signer.getSignatureAppearance();
sap.setReason("Certification of the document");
sap.setLocation("On server");
sap.setCertificate(maincertificate);
BouncyCastleDigest digest = new BouncyCastleDigest();
PdfPKCS7 sgn = new PdfPKCS7(null, chain, hashAlgorithm, null, digest,false);
//IExternalSignatureContainer like BlankContainer
PreSignatureContainer external = new PreSignatureContainer(PdfName.Adobe_PPKLite,PdfName.Adbe_pkcs7_detached);
signer.signExternalContainer(external, 8192);
byte[] hash=external.getHash();
byte[] sh = sgn.getAuthenticatedAttributeBytes(hash, null, null,PdfSigner.CryptoStandard.CMS);// sh will be sent for signature
And here is the PreSignatureContainer class :
public class PreSignatureContainer implements IExternalSignatureContainer {
    private PdfDictionary sigDic;
    private byte hash[];

    public PreSignatureContainer(PdfName filter, PdfName subFilter) {
        sigDic = new PdfDictionary();
        sigDic.put(PdfName.Filter, filter);
        sigDic.put(PdfName.SubFilter, subFilter);
    }

    @Override
    public byte[] sign(InputStream data) throws GeneralSecurityException {
        String hashAlgorithm = "SHA256";
        BouncyCastleDigest digest = new BouncyCastleDigest();
        try {
            this.hash = DigestAlgorithms.digest(data, digest.getMessageDigest(hashAlgorithm));
        } catch (IOException e) {
            throw new GeneralSecurityException("PreSignatureContainer signing exception", e);
        }
        return new byte[0];
    }

    @Override
    public void modifySigningDictionary(PdfDictionary signDic) {
        signDic.putAll(sigDic);
    }

    public byte[] getHash() {
        return hash;
    }

    public void setHash(byte hash[]) {
        this.hash = hash;
    }
}
why is the result of this step invalid
Because you essentially discovered a bug... ;)
Your sample input file has one feature which triggers the bug: It is compressed using object streams.
When iText manipulates such a file, it also tries to put as many objects as possible into object streams. Unfortunately it also does so with the signature dictionary. This is unfortunate because after writing the whole file it tries to enter some information (which is not available earlier) into this dictionary, which damages the compressed object stream.
What you can do...
You can either
wait for iText development to fix this issue - I assume this won't take too long but probably you don't have the time to wait; or
convert the file to sign into a form which does not use object streams - this can be done using iText itself but probably you cannot accept the file growth this means, or probably the files already are signed which forbids any such transformation; or
patch iText 7 to force the signature dictionary not to be added to an object stream - it is a trivial patch but you probably don't want to use patched libraries.
The patch mentioned above indeed is trivial, the method PdfSigner.preClose(Map<PdfName, Integer>) contains this code:
if (certificationLevel > 0) {
    // add DocMDP entry to root
    PdfDictionary docmdp = new PdfDictionary();
    docmdp.put(PdfName.DocMDP, cryptoDictionary.getPdfObject());
    document.getCatalog().put(PdfName.Perms, docmdp); // TODO: setModified?
}
document.close();
document.close();
The cryptoDictionary.getPdfObject() is the signature dictionary I mentioned above. During document.close() it is added to an object stream unless it has been written to the output before. Thus, you simply have to add a call to flush that object right before that close call, and make clear by parameter that it shall not be added to an object stream:
cryptoDictionary.getPdfObject().flush(false);
With that patch in place, the PDFs your code returns are not damaged as above anymore.
As an aside, iText 5 does contain a similar line in the corresponding PdfSignatureAppearance.preClose(HashMap<PdfName, Integer>) right above the if block corresponding to the if block above. It seems to have been lost during refactoring to iText 7.

Stripes Framework, Filebean and File object

What would be the best way to access the File object that is contained in the FileBean in Stripes? I am trying to store the file in Amazon's S3 and it requires a byte array. Seems simple enough if I can get to the File object.
FileBean has a getInputStream() method which lets you read every byte from the FileBean. If you really want to store everything in memory in a byte array (which is a bad idea, especially if files can be large), then read everything from the stream and write it to a ByteArrayOutputStream:
byte[] buffer = new byte[1024];
InputStream in = fileBean.getInputStream();
ByteArrayOutputStream out = new ByteArrayOutputStream();
int read;
while ((read = in.read(buffer)) >= 0) {
    out.write(buffer, 0, read);
}
byte[] contentAsByteArray = out.toByteArray();

Windows 8 Metro RSA Encryption: AsymmetricKeyAlgorithmProvider ImportPublicKey Fails

I am attempting to pass some encrypted data between a Win 8 Metro app and a RESTful WCF service. Initially the Metro app requests a public key and the WCF service returns it as a raw Stream as to avoid any pesky formatting issues. The Base 64 encoded public key is decoded in the metro app into a byte array. Here is where the problem occurs. When I attempted to call AsymmetricKeyAlgorithmProvider.ImportPublicKey I get the error "ASN1 bad tag value met".
I am using RSA PKCS1 for the encryption. Here is the relevant code:
WCF Service
string keyName = "This is passed in via a parameter";
var key = !CngKey.Exists(keyName) ? CngKey.Create(CngAlgorithm2.Rsa, keyName) : CngKey.Open(keyName);
// Create the RSA container to get keys and then dispose
using (var rsaCng = new RSACng(key) { EncryptionPaddingMode = AsymmetricPaddingMode.Pkcs1, KeySize = 2048 })
{
    byte[] publicBlob = rsaCng.Key.Export(CngKeyBlobFormat.GenericPublicBlob);
    publicKey = Convert.ToBase64String(publicBlob);
}
Metro App
public static string Encrypt(IBuffer dataBuffer, string publicKeyString)
{
    var asymmAlg = AsymmetricKeyAlgorithmProvider.OpenAlgorithm(AsymmetricAlgorithmNames.RsaPkcs1);
    // The next line fails with ASN1 bad tag value met
    var publicKey = asymmAlg.ImportPublicKey(CryptographicBuffer.DecodeFromBase64String(publicKeyString), CryptographicPublicKeyBlobType.Pkcs1RsaPublicKey);
    var encryptedData = CryptographicEngine.Encrypt(publicKey, dataBuffer, null);
    return CryptographicBuffer.EncodeToBase64String(encryptedData);
}
EDIT 1: More information below
Exporting the public key from a 2048-bit key pair on the WCF service yields a 283-byte key blob, while exporting the same type of public key from the Metro app yields only 270 bytes. When I import the Metro-generated public key it succeeds. Any idea why the WCF service's blob has 13 extra bytes? I think those extra 13 bytes are causing the failure.
Here is the Metro code that yields the shorter public key blob:
var provider = AsymmetricKeyAlgorithmProvider.OpenAlgorithm(AsymmetricAlgorithmNames.RsaPkcs1);
CryptographicKey standardKeyPair = provider.CreateKeyPair(2048);
byte[] standardKey = standardKeyPair.ExportPublicKey(CryptographicPublicKeyBlobType.Pkcs1RsaPublicKey).ToArray();
Quite late, but maybe it will help you or save someone else's time...
Change the blob type during import. It's really weird, but I had success with it after experimenting.
Your code in WCF may stay as it is.
Change just the Metro code:
public static string Encrypt(IBuffer dataBuffer, string publicKeyString)
{
    var asymmAlg = AsymmetricKeyAlgorithmProvider.OpenAlgorithm(AsymmetricAlgorithmNames.RsaPkcs1);
    // Import as a BCrypt public key blob; importing as Pkcs1RsaPublicKey failed with "ASN1 bad tag value met"
    var publicKey = asymmAlg.ImportPublicKey(CryptographicBuffer.DecodeFromBase64String(publicKeyString), CryptographicPublicKeyBlobType.BCryptPublicKey);
    var encryptedData = CryptographicEngine.Encrypt(publicKey, dataBuffer, null);
    return CryptographicBuffer.EncodeToBase64String(encryptedData);
}
So the only change here is using BCryptPublicKey during the import. Then it works. But do not ask me why :-).
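(A plausible explanation, not from the original answer: CngKey.Export(CngKeyBlobFormat.GenericPublicBlob) returns a CNG/BCrypt RSA key blob, i.e. a 24-byte BCRYPT_RSAKEY_BLOB header plus a 3-byte exponent and a 256-byte modulus, which matches the 283 bytes observed above, rather than a 270-byte PKCS#1 DER structure. That would be why the Pkcs1RsaPublicKey import fails ASN.1 parsing while BCryptPublicKey works. A small diagnostic sketch to check which format a decoded blob actually has:)
byte[] blob = Convert.FromBase64String(publicKeyString);
if (blob.Length >= 4 && blob[0] == (byte)'R' && blob[1] == (byte)'S' && blob[2] == (byte)'A' && blob[3] == (byte)'1')
{
    // BCRYPT_RSAKEY_BLOB magic "RSA1": a CNG/BCrypt public key blob -> import as BCryptPublicKey.
}
else if (blob.Length > 0 && blob[0] == 0x30)
{
    // ASN.1 SEQUENCE tag: DER encoded data (e.g. PKCS#1) -> import as Pkcs1RsaPublicKey.
}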

Questions using Base64 and remote querys into remote Database.. (high waiting time with bigger images???)

I have an image of 5 KB; when I transform it into a Base64 string and upload it to my remote database, the remote INSERT query only takes a few seconds.
But with an image of 100 KB, when I transform it into a Base64 string and upload it to my remote database, the remote INSERT query takes many seconds to execute.
Why?
Is it because the Base64 string needs about as much space (100 KB) as the non-encoded image?
Is there a way to reduce these waiting times?
MORE INFO: I'm using PHP + JSON to connect to the remote MySQL database.
Oded suggested not using Base64 and storing the image as a BLOB rather than LONGTEXT. But how do I use a BLOB with JSON + PHP? As far as I know, JSON + PHP can only send and receive strings, and a BLOB is not a string.
Thanks
EDIT 2:
This is the code where it spends a lot of time waiting (it blocks on the line while ((line = reader.readLine()) != null) {, i.e. on reader.readLine()).
This code fetches one user from the remote database, and it takes a very long time before the user shows up in my app.
public Friend RetrieveOneUser(String email)
{
    Friend friend = null;
    String result = "";
    // the parameter data to send
    ArrayList<NameValuePair> nameValuePairs = new ArrayList<NameValuePair>();
    nameValuePairs.add(new BasicNameValuePair("email", email));
    // http post
    InputStream is = null;
    try {
        HttpClient httpclient = new DefaultHttpClient();
        HttpPost httppost = new HttpPost(this.BaseURL + this.GetOneUser_URL);
        httppost.setEntity(new UrlEncodedFormEntity(nameValuePairs));
        HttpResponse response = httpclient.execute(httppost);
        HttpEntity entity = response.getEntity();
        is = entity.getContent();
    } catch (Exception e) {
        Log.e("log_tag", "Error in http connection " + e.toString());
    }
    // convert response to string
    try {
        BufferedReader reader = new BufferedReader(new InputStreamReader(is, "iso-8859-1"), 8);
        StringBuilder sb = new StringBuilder();
        String line = null;
        while ((line = reader.readLine()) != null) {
            sb.append(line + "\n");
        }
        is.close();
        result = sb.toString();
    } catch (Exception e) {
        Log.e("log_tag", "Error converting result " + e.toString());
    }
    // parse json data
    try {
        JSONArray jArray = new JSONArray(result);
        for (int i = 0; i < jArray.length(); i++)
        {
            JSONObject json_data = jArray.getJSONObject(i);
            friend = new Friend(json_data.getString("email"), json_data.getString("password"), json_data.getString("fullName"), json_data.getString("mobilePhone"), json_data.getString("mobileOperatingSystem"), "", json_data.getString("photo"));
        }
    } catch (JSONException e) {
        Log.e("log_tag", "Error parsing data " + e.toString());
    }
    return friend;
}
Why not store the image directly as a BLOB?
All the conversion accomplishes is delays and extra CPU time.
Update:
Now that we know why base64 is required (since JSON can't transfer binary data), I amend my answer.
You need to check why this is taking a long time. Is it network transfer? Is it the database? Once you know the answer, we can start looking at a solution.
Base64 is a 6-bit encoding: it requires 4 characters (4 bytes) to transmit 3 bytes of an image, so storing a 100 KB image in Base64 takes up about 133 KB of space.
You haven't said which database you're using, but not all databases perform well if you store more than 8kb per row.