How to run multiple tasks at the same time safely? - objective-c

I'm developing an iPhone app and I'm relatively new to Objective-C, so I hope someone can give me a clue.
What I'm doing is reading a file in chunks and encoding the chunks into base64, and everything is working fine. The problem is that this line, NSString *str = [data base64EncodedString];, takes a little bit of time because I'm encoding chunks of 256KB. With one file there is no problem, but I'm encoding image files, so imagine that I encode 10 images: that is a lot of chunks per image, and the process can be slow.
This is the process:
* Get the file.
* Read a chunk of 256KB from the file.
* Encode the chunk to base64.
* Save the encoded chunk and repeat until there are no more bytes to read from the file.
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library assetForURL:referenceURL resultBlock:^(ALAsset *asset) {
    NSUInteger chunkSize = 262144;
    uint8_t *buffer = calloc(chunkSize, sizeof(*buffer));
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    NSUInteger length = [rep size];
    self.requestsToServer = [[NSMutableArray alloc] init];
    NSUInteger offset = 0;
    do {
        NSUInteger bytesCopied = [rep getBytes:buffer fromOffset:offset length:chunkSize error:nil];
        offset += bytesCopied;
        NSData *data = [[NSData alloc] initWithBytes:buffer length:bytesCopied];
        NSString *str = [data base64EncodedString];
        // After this I add the str in an NSMutableURLRequest and I store the request
        // in an NSMutableArray for later use.
    } while (offset < length);
    free(buffer);
    buffer = NULL;
}
failureBlock:^(NSError *error) {
}];
I want to start another thread so I can encode the chunks in parallel and know when the process finishes; that way, while encoding one chunk I could be encoding another 3 or 4 chunks at the same time.
How can I implement this safely, and is this a good idea?
Thanks for your time.

Look at NSOperation and NSOperationQueue.
http://developer.apple.com/library/ios/DOCUMENTATION/Cocoa/Reference/NSOperationQueue_class/Reference/Reference.html
Simply create one NSOperation per chunk, pass each operation the chunk it needs to encode, and queue them all up.
You can tell the queue how many operations can run simultaneously.
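A minimal sketch of that setup, assuming the chunks have already been read into an NSArray of NSData objects called chunks, and using the system base64EncodedStringWithOptions: (iOS 7+) in place of the category method:
NSOperationQueue *queue = [[NSOperationQueue alloc] init];
queue.maxConcurrentOperationCount = 4; // encode up to 4 chunks at once

for (NSData *chunk in chunks) {
    [queue addOperation:[NSBlockOperation blockOperationWithBlock:^{
        NSString *str = [chunk base64EncodedStringWithOptions:0];
        // Hand the result back to the main queue, e.g. to build the
        // NSMutableURLRequest and store it for later use.
        [[NSOperationQueue mainQueue] addOperationWithBlock:^{
            // store str / build the request here
        }];
    }]];
}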

There are lots of good options for doing chunks of work in parallel on iOS.
Take a look at Apple's Concurrency Programming Guide to get going.
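For example, here is a minimal GCD sketch (again assuming the chunks are already in an NSArray called chunks). dispatch_apply runs the iterations concurrently and returns only once every one has finished, which also covers knowing when the whole process is done:
NSMutableArray *encoded = [NSMutableArray arrayWithCapacity:chunks.count];
for (NSUInteger i = 0; i < chunks.count; i++) {
    [encoded addObject:[NSNull null]]; // placeholder so every slot exists
}
dispatch_queue_t queue = dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
dispatch_apply(chunks.count, queue, ^(size_t i) {
    NSString *str = [chunks[(NSUInteger)i] base64EncodedStringWithOptions:0];
    @synchronized (encoded) { // NSMutableArray is not thread-safe; serialize writes
        encoded[(NSUInteger)i] = str;
    }
});
// dispatch_apply has blocked until all iterations completed, so
// `encoded` is fully populated, and in the original order, here.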

Related

NSData to byte array conversion. Memory issue and crash on iPad physical device

I am using the following code to convert NSData to a byte array. It works fine in the simulator. On the device, it allocates memory like crazy, up to 600 MB (on the 'addObject' line inside the loop), and crashes. The file I am reading is 30 MB, and it is a "zip" file. The error I see in the output window is "memory issue".
NSData *data = [[NSData alloc] initWithContentsOfFile:file];
const unsigned char *bytes = [data bytes];
NSUInteger length = [data length];
NSMutableArray *byteArray = [NSMutableArray array];
for (NSUInteger i = 0; i < length; i++) {
    @autoreleasepool {
        [byteArray addObject:[NSNumber numberWithUnsignedChar:bytes[i]]];
    }
}
I am using this byte array inside an NSDictionary and call "dataWithJSONObject" on the dictionary to post the JSON to a REST web service.
For a more memory-efficient way to convert NSData to a byte array, see How to convert NSData to byte array in iPhone?
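The idea there is to keep the bytes in a plain C buffer instead of boxing every single byte in an NSNumber; a minimal sketch:
NSData *data = [NSData dataWithContentsOfFile:file];
NSUInteger length = [data length];
uint8_t *bytes = malloc(length);
[data getBytes:bytes length:length]; // copy the raw bytes, no per-byte objects
// ... work with the raw bytes ...
free(bytes);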
If your goal is ultimately to post the binary data to a web service in JSON, try the following:
Base64-encode the data into a string, as described here. This can be passed in the JSON body.
Alternatively, if you have flexibility to change the server side of this transfer, avoid base64-encoded data in JSON and instead post the binary content directly using an HTTP POST or PUT. This will be more efficient.
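For the first approach, a minimal sketch (the "file" key is only an illustrative name):
NSString *encoded = [data base64EncodedStringWithOptions:0];
NSDictionary *payload = @{@"file": encoded}; // hypothetical JSON shape
NSData *json = [NSJSONSerialization dataWithJSONObject:payload options:0 error:NULL];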
For the second approach, I can think of two ways to do this:
Here is an example for the scenario of sending an image, but any file type will work as long as you can load it into an NSData object.
Or, something like this should work:
NSData *data = [NSData dataWithContentsOfFile:file];
NSURL *url = [NSURL URLWithString:@"your_url_here"];
NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:url];
[request setHTTPMethod:@"PUT"];
NSDictionary *headers = @{@"Content-Type": @"application/octet-stream"};
[request setAllHTTPHeaderFields:headers];
[request setHTTPBody:data];
// Use an NSURLSession task to execute the request
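For instance, a sketch of executing it with NSURLSession (error handling kept minimal):
NSURLSession *session = [NSURLSession sharedSession];
NSURLSessionDataTask *task =
    [session dataTaskWithRequest:request
               completionHandler:^(NSData *body, NSURLResponse *response, NSError *error) {
                   if (error) {
                       NSLog(@"Upload failed: %@", error);
                   }
               }];
[task resume];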
Lastly, if the data can be compressed on the client before encoding and sending, that would be even better!
I hope you find this helpful.

Cocoa App using much more Memory than it should

I'm developing an app that downloads and parses around 40k JSON files from an API. Due to the structure of my program, I don't parse them immediately but save them in an NSMutableArray. Each JSON is around 1KB, most of them even less. If my calculation is correct, that should produce around 40MB (+ some overhead) of allocated memory. But when I run the app, the memory usage climbs to over 4GB.
Am I leaking something here? Since I use garbage collection, I should not need to dealloc anything, right? Or is my calculation simply wrong?
Here is my code:
- (void)loadItemsForIds:(NSArray *)idList {
    for (NSNumber *n in idList) {
        NSURL *url = [NSURL URLWithString:[NSString stringWithFormat:@"https://someapi.com/somejson.json?id=%@", n]];
        NSData *response = [NSData dataWithContentsOfURL:url];
        NSDictionary *loadedData = [NSJSONSerialization JSONObjectWithData:response options:0 error:nil];
        @synchronized(self.updateData) {
            [self.updateData addObject:loadedData];
        }
        [self performSelectorOnMainThread:@selector(progressLoadingInterface:) withObject:[loadedData valueForKey:@"name"] waitUntilDone:NO];
    }
}
Edit:
I found out that the problem exists even if I don't save the data into the array. After some research I stumbled upon this question, which deals with the same problem: iPhone - Memory Leak - NSData dataWithContentsOfUrl & UIWebView
My question might need to be deleted then.
[NSJSONSerialization JSONObjectWithData:response options:0 error:nil] parses the JSON data into objects. This will take up more memory than the 1K of JSON data.
In addition, NSString is likely storing all your string data as unichar, which is a 16-bit representation: double the size of the UTF-8 encoding assumed in the JSON data.
Depending on the contents of the JSON, I can believe that 1K of JSON data could become 10K of objects.
But this is all guess work. Use Instruments (Product > Profile) to check memory usage. That will give you a better idea what objects are taking up all the memory.
You never release memory between the 40k iterations, so you never give ARC/GC the chance to do so either.
Wrap the loop body in an @autoreleasepool {...} so the pool is drained each iteration:
for (NSNumber *n in idList) {
    @autoreleasepool {
        NSURL *url = [NSURL URLWithString:[NSString stringWithFormat:@"https://someapi.com/somejson.json?id=%@", n]];
        NSData *response = [NSData dataWithContentsOfURL:url];
        NSDictionary *loadedData = [NSJSONSerialization JSONObjectWithData:response options:0 error:nil];
        @synchronized(self.updateData) {
            [self.updateData addObject:loadedData];
        }
        [self performSelectorOnMainThread:@selector(progressLoadingInterface:) withObject:[loadedData valueForKey:@"name"] waitUntilDone:NO];
    }
}

Appending random bytes to an image doesn't change the outcome?

So, to simplify the situation: I want to append a couple of random bytes to an image to change its MD5 hash every time.
I have the code set up to look up the image and then create an NSImage. After that it converts the NSImage to NSMutableData, which gives me the opportunity to append random bytes. I then finish by exporting the altered NSImage to the desktop.
That all works fine and dandy until I run my program twice and compare the MD5 hashes of the two outputs. They are exactly the same! It doesn't matter if I append 1 or 1000 random bytes; the two outputs are identical.
My code:
- (void)createNewImage:(NSString *)filePath {
    // NSImage from path
    NSImage *newImage = [[NSImage alloc] initWithContentsOfFile:filePath];
    // NSData to NSMutableData
    NSData *imgData = [newImage TIFFRepresentation];
    NSMutableData *mutableData = [imgData mutableCopy];
    // Get the random bytes
    NSData *randomData = [self createRandomBytes:10];
    // Append random data to new image
    [mutableData appendData:randomData];
    // (etc...)
    // Create file path for the new image
    NSString *fileName = @"/Users/Computer/Desktop/MD5/newImage.jpg";
    // Cache the new image
    NSBitmapImageRep *imageRep = [NSBitmapImageRep imageRepWithData:mutableData];
    NSDictionary *imageProps = [NSDictionary dictionaryWithObject:[NSNumber numberWithFloat:1.0] forKey:NSImageCompressionFactor];
    NSData *newData = [imageRep representationUsingType:NSJPEGFileType properties:imageProps];
    [newData writeToFile:fileName atomically:NO];
}

- (NSData *)createRandomBytes:(NSUInteger)amount {
    return [[NSFileHandle fileHandleForReadingAtPath:@"/dev/random"] readDataOfLength:amount];
}
UPDATE:
With the help of picciano, I found that exporting the edited NSData directly achieves my goal:
[mutableData writeToFile:fileName atomically:NO];
HOWEVER, the image is significantly larger: the source image is 182 KB while the new images are 503 KB. picciano's answer explains why this happens, but does anyone have a workaround for the inflation?
You are adding random data, but it is not being used when creating the image. When the image is converted back to a JPEG data representation, only the valid portion of the image data is used.
To verify this, check the length of your newData object.
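Something like this, using the variable names from the question:
NSLog(@"newData length: %lu", (unsigned long)[newData length]);
// Appending 1 or 1000 random bytes to mutableData leaves this length
// unchanged, because the JPEG encoder only reads the valid image data.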

appending erased file has too much disturbed noise

I am erasing an audio file using the following code:
NSMutableData *wave = [NSMutableData dataWithContentsOfURL:self.recordedFileURL options:NSDataReadingUncached error:nil];
NSUInteger length = [wave length];
Byte *byteData = (Byte *)malloc(length);
memcpy(byteData, [wave bytes], length);
NSMutableData *data = [NSMutableData dataWithBytes:byteData length:length];
free(byteData); // dataWithBytes: copies, so the malloc'd buffer can be released
[data replaceBytesInRange:NSMakeRange(length * rangeToCut, length - (length * rangeToCut)) withBytes:NULL length:0];
[data writeToFile:[self.recordedFileURL path] atomically:YES];
It erases correctly, but when I resume my audio and append the resumed part to the old part like the following:
NSMutableData *part2 = [NSMutableData dataWithContentsOfURL:self.soundFileURL options:NSDataReadingUncached error:nil];
NSFileHandle *file = [NSFileHandle fileHandleForWritingToURL:oldURL error:nil];
if (file) {
    [file seekToEndOfFile];
    [file writeData:part2];
}
then the audio files are appended successfully, but the resumed part of the audio has so much distortion that it is impossible to listen to that part.
Please help me figure out what is going wrong here.
Is your sample size 16 bits or more? If you cut the audio in the middle of a sample, the rest of the stream will be just noise. You need to make sure length * rangeToCut is a multiple of the sample size.
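A sketch of the fix, assuming 16-bit PCM (bytesPerFrame would be 2 for mono, 4 for stereo):
NSUInteger bytesPerFrame = 2; // assumption: 16-bit mono PCM
NSUInteger cutPoint = (NSUInteger)(length * rangeToCut);
cutPoint -= cutPoint % bytesPerFrame; // round down to a whole-frame boundary
[data replaceBytesInRange:NSMakeRange(cutPoint, length - cutPoint)
                withBytes:NULL
                   length:0];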

convert CIImage to writable format

I need to convert a CIImage to a format which can be written to disk.
Currently I am using the following code to convert it to JPG format.
NSBitmapImageRep* rep = [[[NSBitmapImageRep alloc] initWithCIImage:result] autorelease];
NSData *JPEGData = [rep representationUsingType:NSJPEGFileType properties:nil];
[JPEGData writeToFile:targetPath atomically:YES];
But real memory usage shoots up to above 100 MB. My application requires me to handle a large number of images, so I need to optimise my memory usage.
Can anyone please suggest anything?
If it's the cumulative memory that's an issue, and not just one image, you can try wrapping the last two lines in your own autorelease pool:
@autoreleasepool {
    NSData *JPEGData = [rep representationUsingType:NSJPEGFileType properties:nil];
    [JPEGData writeToFile:targetPath atomically:YES];
}