Implementing a Progress Bar While Uploading Images - objective-c

I am uploading an image and some string data to the server and it is working fine. Now I want to implement a progress bar: if I am sending 5 images, I want to show the progress bar along with the count of images sent successfully, like 2/5. Please can anyone help me do this?
The following method uploads the image, dictionary, and string:
-(void)uploadImage
{
    NSString *userCategory = self.UserCategory;
    NSDictionary *dict = [self.arrayWithImages objectAtIndex:self.currentIndex];
    NSString *notes = [dict objectForKey:@"string"];
    UIImage *sample = [dict objectForKey:@"image"];
    NSData *sampleData = UIImageJPEGRepresentation(sample, 1.0);
    NSMutableDictionary *FinalDict = [self.dictMetaData mutableCopy];
    [FinalDict setObject:userCategory forKey:@"user_category"];
    if (notes.length > 0) {
        [FinalDict setObject:notes forKey:@"note"];
    }
    for (int i = 0; i < self.arrayWithImages.count; i++) {
        [ServerUtility uploadImageWithAllDetails:FinalDict noteResource:sampleData andCompletion:^(NSError *error, id data) {
            if (!error) {
                NSString *strResType = [data objectForKey:@"res_type"];
                if ([strResType.lowercaseString isEqualToString:@"success"]) {
                    NSLog(@"Upload Successfully");
                    self.currentIndex++;
                }
                else if ([strResType.lowercaseString isEqualToString:@"error"]) {
                    NSString *strMsg = [data objectForKey:@"msg"];
                    [self.view makeToast:strMsg duration:1.0 position:CSToastPositionCenter];
                }
            }
            else {
                [self.view makeToast:error.localizedDescription duration:1.0 position:CSToastPositionCenter];
            }
        }];
    }
}
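One possible way to get the 2/5-style count is to upload the images one at a time and only start the next upload from the previous upload's completion block. The sketch below is not a definitive implementation: it reuses the ServerUtility call from the question, and self.progressView, self.countLabel and uploadImageAtIndex: are illustrative names you would adapt to your project.

// Sketch only: self.progressView (UIProgressView), self.countLabel (UILabel)
// and uploadImageAtIndex: are assumed names; ServerUtility is the same API as above.
- (void)uploadImageAtIndex:(NSUInteger)index
{
    NSUInteger total = self.arrayWithImages.count;
    if (index >= total) {
        return; // all images have been sent
    }
    NSDictionary *dict = [self.arrayWithImages objectAtIndex:index];
    NSData *imageData = UIImageJPEGRepresentation([dict objectForKey:@"image"], 1.0);
    NSMutableDictionary *details = [self.dictMetaData mutableCopy];
    [details setObject:self.UserCategory forKey:@"user_category"];
    [ServerUtility uploadImageWithAllDetails:details noteResource:imageData andCompletion:^(NSError *error, id data) {
        dispatch_async(dispatch_get_main_queue(), ^{
            if (!error) {
                NSUInteger done = index + 1;
                self.countLabel.text = [NSString stringWithFormat:@"%lu / %lu", (unsigned long)done, (unsigned long)total];
                self.progressView.progress = (float)done / (float)total;
                [self uploadImageAtIndex:done]; // start the next upload only after this one succeeds
            }
            else {
                [self.view makeToast:error.localizedDescription duration:1.0 position:CSToastPositionCenter];
            }
        });
    }];
}

Kicking it off with [self uploadImageAtIndex:0] replaces the for loop, so the label and the bar advance only after the server confirms each image.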

Related

NSProgressIndicator within the NSTableViewCell not update

There is an NSTableView which contains several cells. Each cell contains an NSTextField and an NSProgressIndicator, and each cell's progress indicator represents its own upload percentage.
Now I need to update the progress indicator according to the upload percentage so that we can see the progress of the upload.
- (void)startUploading:(NSArray *)filePathArray
{
    if (self.QNToken) {
        NSString *token = self.QNToken;
        NSUInteger count = [filePathArray count];
        // We may have multiple files to upload.
        for (int i = 0; i < count; i++) {
            NSString *filePath = filePathArray[i];
            NSData *fileData = [NSData dataWithContentsOfFile:filePath];
            NSString *key = [fileData md5String];
            // Upload option
            QNUploadOption *option = [[QNUploadOption alloc] initWithMime:nil
                                                          progressHandler:^(NSString *key, float percent) {
                                                              QNUploadDetailCell *cell = [self.tableView viewAtColumn:0 row:i makeIfNecessary:NO];
                                                              [cell.progress.animator setDoubleValue:percent];
                                                              [cell.progress displayIfNeeded];
                                                          }
                                                                   params:nil
                                                                 checkCrc:NO
                                                       cancellationSignal:nil];
            // Let's upload!
            [self.uploadManager putFile:filePath
                                    key:key
                                  token:token
                               complete:^(QNResponseInfo *info, NSString *key, NSDictionary *resp) {
                                   NSLog(@"%@", info);
                                   NSLog(@"%@", resp);
                               }
                                 option:option];
        }
    }
}
However, the progress indicator is never animated.
I also printed the percent in the progressHandler: the percentage keeps increasing from 0 to 1.0,
but the progress bar still does not move.
Any ideas?
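One thing worth checking, as an assumption since the question does not say which thread Qiniu invokes the handler on: AppKit views must only be touched on the main thread, so the handler body could be wrapped like this:

progressHandler:^(NSString *key, float percent) {
    // AppKit is not thread-safe: hop to the main queue before touching the cell.
    dispatch_async(dispatch_get_main_queue(), ^{
        QNUploadDetailCell *cell = [self.tableView viewAtColumn:0 row:i makeIfNecessary:NO];
        if (cell) {
            cell.progress.indeterminate = NO;   // an indeterminate indicator ignores setDoubleValue:
            cell.progress.minValue = 0.0;
            cell.progress.maxValue = 1.0;       // percent is reported in the 0..1 range
            [cell.progress setDoubleValue:percent];
        }
    });
}

An indeterminate indicator, or a max value that does not match the 0..1 range of the reported percent, would also explain a bar that never appears to move.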

How to edit metadata of image in iOS8?

ALAssetsLibrary *lib = [[ALAssetsLibrary alloc] init];
[lib assetForURL:nil resultBlock:^(ALAsset *asset) {
    ALAssetRepresentation *rep = [asset defaultRepresentation]; // rep was used below but never declared
    NSDictionary *metadata = rep.metadata;
    if (metadata) {
        NSDictionary *GPSDict = metadata[@"{GPS}"];
        NSDictionary *TIFFDict = metadata[@"{TIFF}"];
        if (GPSDict) {
            double longitude = [[GPSDict objectForKey:@"Longitude"] doubleValue];
            double latitude = [[GPSDict objectForKey:@"Latitude"] doubleValue];
            if ([[GPSDict objectForKey:@"LatitudeRef"] isEqualToString:@"S"]) {
                latitude = -latitude;
            }
            if ([[GPSDict objectForKey:@"LongitudeRef"] isEqualToString:@"W"]) {
                longitude = -longitude;
            }
            if (TIFFDict) {
                NSUserDefaults *pref = [NSUserDefaults standardUserDefaults];
                [pref setObject:[TIFFDict objectForKey:@"DateTime"] forKey:@"PHOTODATE"];
                [pref synchronize];
            }
            coordinate2D = CLLocationCoordinate2DMake(latitude, longitude);
        } else {
            latitude = locationManager.location.coordinate.latitude;
            longitude = locationManager.location.coordinate.longitude;
            [GPSDictionary setObject:[NSNumber numberWithFloat:fabs(latitude)]
                              forKey:(NSString *)kCGImagePropertyGPSLatitude];
            [GPSDictionary setObject:(latitude > 0 ? @"N" : @"S") forKey:(NSString *)kCGImagePropertyGPSLatitudeRef];
            [GPSDictionary setObject:[NSNumber numberWithFloat:fabs(longitude)]
                              forKey:(NSString *)kCGImagePropertyGPSLongitude];
            [GPSDictionary setObject:(longitude > 0 ? @"E" : @"W") forKey:(NSString *)kCGImagePropertyGPSLongitudeRef];
            if (metadata && GPSDictionary) {
                [metadata setValue:GPSDictionary forKey:(NSString *)kCGImagePropertyGPSDictionary];
            }
            coordinate2D = CLLocationCoordinate2DMake(latitude, longitude);
        }
    }
    else {
    }
} failureBlock:^(NSError *error) {
    // User denied access
    NSLog(@"Unable to access image: %@", error);
}];
I am using the above code to get the metadata of an image. Now I want to edit this metadata: I want to add a custom location to the image if no location information is present in the {GPS} dictionary.
From Apple's documentation: applications are only allowed to edit assets that they originally wrote. So only if your application wrote the image into the Photos library will you be able to edit its metadata.
You can check whether the metadata is editable using ALAsset's editable property.
I was able to update metadata using the setImageData:metadata:completionBlock: method.
Please refer to the following code.
I am passing the same image data with updated metadata. I have only tested changing the orientation, not the GPS data, but this code should help you get started:
ALAsset *asset = // your asset
if (asset.editable) {
    NSDictionary *metadata = asset.defaultRepresentation.metadata;
    NSDictionary *gpsInfo = metadata[@"{GPS}"];
    if (gpsInfo) {
        NSMutableDictionary *mutableGPSInfo = [gpsInfo mutableCopy];
        [mutableGPSInfo setObject:@"yourNewLatitude" forKey:@"Latitude"];
        [mutableGPSInfo setObject:@"yourNewLongitude" forKey:@"Longitude"];
        NSMutableDictionary *mutableMetadata = [metadata mutableCopy];
        [mutableMetadata setObject:[mutableGPSInfo copy] forKey:@"{GPS}"];
        ALAssetRepresentation *rep = [asset defaultRepresentation];
        Byte *buffer = (Byte *)malloc(rep.size);
        NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:rep.size error:nil];
        NSData *data = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];
        [asset setImageData:data metadata:[mutableMetadata copy] completionBlock:^(NSURL *assetURL, NSError *error) {
            if (error) {
                NSLog(@"Error : %@", error);
            } else {
                NSLog(@"Asset metadata was successfully edited");
            }
        }];
    }
} else {
    NSLog(@"Oops..! Asset cannot be edited");
}
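For the case the question actually asks about, where {GPS} is missing and the asset was not written by your app (so it cannot be edited in place), one option is to save a copy of the image with a freshly built GPS dictionary. This is only a sketch; latitude, longitude, metadata and asset are assumed to come from the surrounding code.

// Sketch: writes a new asset with GPS metadata added, using a latitude/longitude you supply.
NSMutableDictionary *gps = [NSMutableDictionary dictionary];
gps[(NSString *)kCGImagePropertyGPSLatitude] = @(fabs(latitude));
gps[(NSString *)kCGImagePropertyGPSLatitudeRef] = (latitude >= 0 ? @"N" : @"S");
gps[(NSString *)kCGImagePropertyGPSLongitude] = @(fabs(longitude));
gps[(NSString *)kCGImagePropertyGPSLongitudeRef] = (longitude >= 0 ? @"E" : @"W");

NSMutableDictionary *newMetadata = [metadata mutableCopy];
newMetadata[(NSString *)kCGImagePropertyGPSDictionary] = gps;

ALAssetRepresentation *rep = [asset defaultRepresentation];
Byte *buffer = (Byte *)malloc((size_t)rep.size);
NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:(NSUInteger)rep.size error:nil];
NSData *imageData = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES];

ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageDataToSavedPhotosAlbum:imageData
                                 metadata:newMetadata
                          completionBlock:^(NSURL *assetURL, NSError *error) {
    NSLog(@"Saved copy with GPS metadata: %@ (error: %@)", assetURL, error);
}];

Because the copy is written by your own app, later edits with setImageData:metadata:completionBlock: become possible, in line with the documentation quoted above.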

[AVFoundation]: Thumbnail generation hangs after some time

I have a server application which serves information about the videos on the server. One of the requests is URL:PORT/video/:id/:time ... which I parse to get the video file, prepare the time, and ask a method to generate the thumbnail. It works really fast for the first 5 minutes (generates an image in under 200 ms), then image generation suddenly starts taking as long as 10 seconds...
Do you have any idea why? Code used:
- (NSImage *)thumbnailAt:(CMTime)time
                withSize:(NSSize)size
                   error:(NSError **)error {
    @autoreleasepool {
        if (self.assetChanged) {
            self.generate = [[AVAssetImageGenerator alloc] initWithAsset:_asset];
            self.generate.appliesPreferredTrackTransform = YES;
            self.assetChanged = NO;
        }
        self.generate.maximumSize = NSSizeToCGSize(size);
        CGImageRef imageReference = [self.generate copyCGImageAtTime:time actualTime:NULL error:error];
        if (imageReference != NULL) {
            NSImage *ret = [[NSImage alloc] initWithCGImage:imageReference size:size];
            CGImageRelease(imageReference);
            return ret;
        }
        return nil;
    }
}
Any idea what I am doing wrong, or any suggestion on how to do it differently (e.g. using AVAssetReader)?
In the end I solved it with a different approach: I generated the images asynchronously.
- (NSImage *)createAsyncThumbnailAtTime:(CMTime)time withSize:(NSSize)size {
    if (self.updated) {
        [_generator cancelAllCGImageGeneration]; // Stop, we did not comply in time.
        _generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:_asset];
        _generator.maximumSize = NSSizeToCGSize(size);
    }
    NSMutableArray *times = [NSMutableArray array];
    [times addObject:[NSValue valueWithCMTime:time]];
    __block NSImage *image;
    __block BOOL finished = NO;
    [_generator generateCGImagesAsynchronouslyForTimes:times completionHandler:^(CMTime requestedTime, CGImageRef imageRef, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error) {
        image = nil;
        if (result == AVAssetImageGeneratorCancelled) {
            image = nil;
            NSLog(@"CANCELLED %@", error);
            finished = YES;
        }
        else if (result == AVAssetImageGeneratorFailed) {
            image = nil;
            NSLog(@"FAILED %@", error);
            finished = YES;
        }
        else { /* result == AVAssetImageGeneratorSucceeded */
            image = [[NSImage alloc] initWithCGImage:imageRef size:size];
            finished = YES;
        }
    }];
    while (!finished) {
        [[NSRunLoop currentRunLoop] runMode:NSDefaultRunLoopMode beforeDate:[NSDate distantFuture]];
    }
    return image;
}
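As a side note that is not part of the original answer: AVAssetImageGenerator also exposes time-tolerance properties, and requesting exact frames is much slower than accepting the nearest keyframe. If exact timestamps are not required, the defaults are already the fast path; the sketch below (with illustrative size values) just makes that explicit.

AVAssetImageGenerator *generator = [AVAssetImageGenerator assetImageGeneratorWithAsset:asset];
generator.appliesPreferredTrackTransform = YES;
generator.maximumSize = CGSizeMake(320.0, 180.0); // illustrative thumbnail size
// The defaults are kCMTimePositiveInfinity, which lets AVFoundation return the nearest
// keyframe quickly; setting kCMTimeZero instead forces exact (and slower) frame extraction.
generator.requestedTimeToleranceBefore = kCMTimePositiveInfinity;
generator.requestedTimeToleranceAfter = kCMTimePositiveInfinity;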

Parse : is it possible to follow progress of PFObject upload

I have a PFObject containing some text and a PFFile.
PFObject *post = [PFObject objectWithClassName:@"Posts"];
post[@"description"] = self.Photodescription.text;
NSData *picture = UIImageJPEGRepresentation(self.capturedPicture, 0.5f);
post[@"picture"] = [PFFile fileWithName:@"thumbnailPicture.png" data:picture];
I would like to get the progress of the upload in order to display a progress bar. The following method only works for a PFFile.
[post[@"picture"] saveInBackgroundWithBlock:^(BOOL succeeded, NSError *error) {
} progressBlock:^(int percentDone) {
    // Update your progress spinner here. percentDone will be between 0 and 100.
    NSLog(@"%i", percentDone);
}];
Is there a way to do the same for a PFObject?
Are you uploading a single object or an array of objects? Right now there is no progress callback for a single PFObject. For very large arrays of PFObjects I did create a category that uploads a series of PFObjects in the background with progress feedback. It works like the normal saveAllInBackground:, except you specify the chunkSize (how many PFObjects to save at a time until complete) and pass a progress block that is called each time a chunk is completed:
+ (void)saveAllInBackground:(NSArray *)array chunkSize:(int)chunkSize block:(PFBooleanResultBlock)block progressBlock:(PFProgressBlock)progressBlock
{
    // Round up so a partial final chunk is counted and we never divide by zero.
    unsigned long numberOfCyclesRequired = (array.count + chunkSize - 1) / chunkSize;
    __block unsigned long count = 0;
    [PFObject saveAllInBackground:array chunkSize:chunkSize block:block trigger:^(BOOL trig) {
        count++;
        progressBlock((int)(100.0 * count / numberOfCyclesRequired));
    }];
}

+ (void)saveAllInBackground:(NSArray *)array chunkSize:(int)chunkSize block:(PFBooleanResultBlock)block trigger:(void (^)(BOOL trig))trigger
{
    // Save the first chunk, then recurse on the remainder until the array is exhausted.
    NSRange range = NSMakeRange(0, array.count <= chunkSize ? array.count : chunkSize);
    NSArray *saveArray = [array subarrayWithRange:range];
    NSArray *nextArray = nil;
    if (range.length < array.count) {
        nextArray = [array subarrayWithRange:NSMakeRange(range.length, array.count - range.length)];
    }
    [PFObject saveAllInBackground:saveArray block:^(BOOL succeeded, NSError *error) {
        if (!error && succeeded && nextArray) {
            trigger(true);
            [PFObject saveAllInBackground:nextArray chunkSize:chunkSize block:block trigger:trigger];
        }
        else {
            trigger(true);
            block(succeeded, error);
        }
    }];
}
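A possible call site for that category, assuming a UIProgressView outlet named uploadProgressView (an illustrative name) and saving the post built earlier; in case the progress block is not delivered on the main thread, the UI update is dispatched back to it:

// Illustrative usage of the category above.
NSArray *posts = @[post]; // the PFObjects to save, e.g. the post built earlier
[PFObject saveAllInBackground:posts
                    chunkSize:10
                        block:^(BOOL succeeded, NSError *error) {
                            NSLog(@"All chunks saved: %d (error: %@)", succeeded, error);
                        }
                progressBlock:^(int percentDone) {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        self.uploadProgressView.progress = percentDone / 100.0f;
                    });
                }];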

Caching UIImage as NSData and getting it back

I'm trying to cache images I load from Flickr. If I load the same image again, I'm hoping to use the cached version instead. For some reason, when I download an image from the internet it works, but if I load it from my cache it displays a blank image. I checked cacheData, and it is the same size as the image I put in, so loading the file appears to be working.
Here is how I cache images:
+ (void)cachePhoto:(NSData *)photo withKey:(NSString *)key {
    if (photo) {
        NSArray *urlArray = [fileManager URLsForDirectory:NSCachesDirectory inDomains:NSUserDomainMask];
        NSURL *targetDirectory = (NSURL *)[urlArray objectAtIndex:0];
        targetDirectory = [targetDirectory URLByAppendingPathComponent:key];
        [photo writeToURL:targetDirectory atomically:YES];
        [cachedPhotos addObject:key];
        NSLog(@"target url %@", targetDirectory);
    }
}

+ (NSData *)photoInCache:(NSString *)key {
    if ([cachedPhotos containsObject:key]) {
        NSString *path = [[cacheDirectory URLByAppendingPathComponent:key] path];
        NSLog(@"path: %@", path);
        return [fileManager contentsAtPath:path];
    } else {
        return nil;
    }
}
And my code to get it back:
NSData *cacheData = [PhotoCache photoInCache:key];
if (cacheData) {
    self.imageView.image = [UIImage imageWithData:cacheData];
    [spinner stopAnimating];
    NSLog(@"used cached image");
} else {
    dispatch_queue_t downloadQueue = dispatch_queue_create("get photo from flickr", NULL);
    dispatch_async(downloadQueue, ^{
        NSData *imageData = [[NSData alloc] initWithContentsOfURL:url];
        dispatch_async(dispatch_get_main_queue(), ^{
            [spinner stopAnimating];
            self.imageView.image = [UIImage imageWithData:imageData];
            [PhotoCache cachePhoto:imageData withKey:key];
        });
    });
}
I figured it out: I was loading the image into the image view in prepareForSegue, where the image view was not yet loaded. I loaded the image in viewDidLoad instead and it worked. I also had to use UIImagePNGRepresentation, as suggested in the comments.
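For completeness, a small sketch of that fix, assuming the same PhotoCache class and outlets as above: re-encode the downloaded UIImage as PNG before caching, and do the image-view assignment in viewDidLoad rather than prepareForSegue.

// In viewDidLoad, once self.imageView exists:
UIImage *downloaded = [UIImage imageWithData:imageData];
self.imageView.image = downloaded;
// Re-encode as PNG so the cached bytes decode reliably with +imageWithData: later.
NSData *pngData = UIImagePNGRepresentation(downloaded);
[PhotoCache cachePhoto:pngData withKey:key];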