Handling UIActivityViewController - objective-c

First, I couldn't find anyone else having this problem. I'm working on a SpriteKit game with mainTitle.h/m and gamePlay.h/m files. Below is the code for the share button, which lets the player share their progress via text message, Facebook, Twitter, etc. The code lives in gamePlay.m inside a touch method. It works, but after the user chooses to send his/her score via text message, the new-message window slides up and then the game appears to restart and load the mainTitle.m scene. Any ideas as to why this happens?
-(void)share {
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 1.0);
[self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:YES];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSString *message = [NSString stringWithFormat:@"message"];
NSString *urlString = [NSString stringWithFormat:@"www..."];
NSURL *gmURL = [NSURL URLWithString:urlString];
UIActivityViewController *actVC = [[UIActivityViewController alloc]
initWithActivityItems:@[message, gmURL, image] applicationActivities:nil];
actVC.excludedActivityTypes = @[UIActivityTypePrint, UIActivityTypeAirDrop];
UIViewController *viewControl = self.view.window.rootViewController;
[viewControl presentViewController:actVC animated:YES completion:nil];
}
-(void)touchesBegan ... {
[self share];
}

It's probably not a good idea to call the share method from touchesBegan, since it can fire multiple times under some conditions. Use a UIButton (or similar control) instead.
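If you can't easily add a UIButton over the SpriteKit scene, a minimal alternative is to guard the touch handler so a second touch can't re-present the controller while it is already up. This is only a sketch; the `_isSharing` ivar is hypothetical:

```objc
// Guard against touchesBegan firing more than once while the
// activity sheet is being presented.
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
    if (_isSharing) return;   // already presenting — ignore the touch
    _isSharing = YES;
    [self share];
}
```

You would then reset `_isSharing` to NO in the activity view controller's completion handler so sharing can be triggered again later.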

Related

Objective C iOS WKWebView Background

Hopefully someone can help.
I've got an application that currently uses a UIWebView to play audio; it keeps playing even when the device is locked, because I enable mediaPlaybackAllowsAirPlay.
However, I've started moving the code to the new WKWebView and enabled allowsAirPlayForMediaPlayback, but when I press the lock button, the web view stops playing the audio.
Any ideas how I can get WKWebView to behave like the UIWebView?
Update, Here's an example of my code:
WKUserContentController *wkUController = [[WKUserContentController alloc] init];
WKWebViewConfiguration *theConfiguration = [[WKWebViewConfiguration alloc] init];
theConfiguration.userContentController = wkUController;
theConfiguration.allowsAirPlayForMediaPlayback = YES;
webViewActive = [[WKWebView alloc] initWithFrame:viewForWeb.frame configuration:theConfiguration];
webViewActive.navigationDelegate = self;
webViewActive.frame = viewForWeb.bounds;
[webViewActive setAutoresizingMask: UIViewAutoresizingFlexibleHeight | UIViewAutoresizingFlexibleWidth];
[viewForWeb addSubview:webViewActive];
NSURL *nsurl = [NSURL URLWithString:@"URL TO AUDIO"];
NSURLRequest *nsrequest=[NSURLRequest requestWithURL:nsurl];
[webViewActive loadRequest:nsrequest];
Thanks

MPMoviePlayerController not working properly

I am making an application in which I am trying to play a video. The video starts properly, but the screen turns black after about 4 seconds. I don't know what the problem is.
Also, setting player.moviePlayer.shouldAutoplay = NO has no effect; the video still starts automatically.
Here is Code:
NSString *urlString = [[NSBundle mainBundle] pathForResource:@"Movie" ofType:@"m4v"];
NSURL *urlObj = [NSURL fileURLWithPath:urlString];
UIGraphicsBeginImageContext(CGSizeMake(1,1));
MPMoviePlayerViewController *player = [[MPMoviePlayerViewController alloc] initWithContentURL:urlObj];
UIGraphicsEndImageContext();
[player.view setBounds:self.view.bounds];
// when playing from a server the source type should be MPMovieSourceTypeStreaming
[player.moviePlayer setMovieSourceType:MPMovieSourceTypeStreaming];
[player.moviePlayer setScalingMode:MPMovieScalingModeAspectFill];
player.moviePlayer.shouldAutoplay = NO;
[self.view addSubview:player.view];
[player.moviePlayer play];
Am I missing something here?
I also tried to get the total duration of the video (using the duration property of MPMoviePlayerController), but it returns 0.0. How do I get the duration of the video?
NSString *urlString = [[NSBundle mainBundle] pathForResource:@"Movie" ofType:@"m4v"];
NSURL *urlObj = [NSURL fileURLWithPath:urlString];
UIGraphicsBeginImageContext(CGSizeMake(1,1));
MPMoviePlayerViewController *player = [[MPMoviePlayerViewController alloc] initWithContentURL:urlObj];
UIGraphicsEndImageContext();
[player.view setBounds:self.view.bounds];
// when playing from a server the source type should be MPMovieSourceTypeStreaming
[player.moviePlayer setMovieSourceType:MPMovieSourceTypeStreaming]; // I was missing this line, therefore the video was not playing
[player.moviePlayer setScalingMode:MPMovieScalingModeAspectFill];
[self.view addSubview:player.view];
[player.moviePlayer play];
There are several issues here:
For this type of usage (integrating the player into your view), you should be using MPMoviePlayerController, not MPMoviePlayerViewController. Use MPMoviePlayerViewController when you want to have a self-contained view controller which can be presented using presentMoviePlayerViewControllerAnimated:.
Assuming you are using ARC, the main problem is that nothing is keeping a reference to your player object. As a consequence, the player is disappearing shortly after you create it. You should keep a reference to it by assigning it to a property or instance variable of your view controller.
For a full example of this, see Till's excellent answer to a similar question.
I'm not sure what your intended purpose of the UIGraphicsBeginImageContext and UIGraphicsEndImageContext calls are, but I can't see that they're needed here.
As for shouldAutoplay = NO, the video is still starting because you are calling play immediately afterwards.
The player's duration property only contains a useful value after a MPMovieDurationAvailableNotification has been received. You'll need to do something similar to the following to have access to the actual duration:
__weak MediaPlayerController *weakSelf = self;
[[NSNotificationCenter defaultCenter] addObserverForName:MPMovieDurationAvailableNotification object:self.player queue:[NSOperationQueue mainQueue] usingBlock:^(NSNotification *note) {
NSLog(@"Movie duration: %lf", weakSelf.player.duration);
}];
Use removeObserver:name:object: to remove the observer when you are done.
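Under ARC, the strong-reference fix from point 2 above might look like this (the property and method names here are illustrative, not from the original code):

```objc
// In the view controller's interface: a strong property keeps the
// player alive after the method that created it returns.
@property (nonatomic, strong) MPMoviePlayerController *moviePlayer;

// In the implementation:
- (void)playMovie {
    NSString *path = [[NSBundle mainBundle] pathForResource:@"Movie" ofType:@"m4v"];
    self.moviePlayer = [[MPMoviePlayerController alloc]
                        initWithContentURL:[NSURL fileURLWithPath:path]];
    self.moviePlayer.view.frame = self.view.bounds;
    [self.view addSubview:self.moviePlayer.view];
    [self.moviePlayer play];
}
```

Because `self.moviePlayer` holds a strong reference, the player is no longer deallocated moments after creation, which is what caused the screen to go black.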

ALAssetRepresentation fullResolutionImage never returns

I am writing a multithreaded application which needs to upload photos from the ALAssetsLibrary en masse in the background. So I have an NSOperation subclass which finds the appropriate ALAsset via the asset's URL and adds the image to an upload queue.
In the upload queue for the current ALAsset, I need to get the metadata from the image, but I've encountered a problem: both the -metadata and the -fullResolutionImage methods never return when they are called on the ALAssetRepresentation of the ALAsset. They simply hang there indefinitely. I tried printing the value of each of these methods in LLDB, but it hung the debugger up, and I ended up killing Xcode, signal 9 style. These methods are being called on a background queue.
I am testing these on an iPad 2. This is the method in which the ALAsset is added to the upload queue when it is found in the success block of -assetForURL:resultBlock:failureBlock:
- (void)addMediaToUploadQueue:(ALAsset *)media {
@autoreleasepool {
ALAssetRepresentation *defaultRepresentation = [media defaultRepresentation];
CGImageRef fullResolutionImage = [defaultRepresentation fullResolutionImage];
// Return if the user is trying to upload an image which has already been uploaded
CGFloat scale = [defaultRepresentation scale];
UIImageOrientation orientation = [defaultRepresentation orientation];
UIImage *i = [UIImage imageWithCGImage:fullResolutionImage scale:scale orientation:orientation];
if (![self isImageUnique:i]) return;
NSDictionary *imageDictionary = [self dictionaryForAsset:media withImage:i];
dispatch_async(self.background_queue, ^{
NSManagedObjectContext *ctx = [APPDELEGATE createManagedObjectContextForThread];
[ctx setUndoManager:nil];
[ctx performBlock:^{
ImageEntity *newImage = [NSEntityDescription insertNewObjectForEntityForName:@"ImageEntity"
inManagedObjectContext:ctx];
[newImage updateWithDictionary:imageDictionary
inManagedObjectContext:ctx];
[ctx save:nil];
[APPDELEGATE saveContext];
dispatch_async(dispatch_get_main_queue(), ^{
[self.fetchedResultsController performFetch:nil];
});
if (!currentlyUploading) {
currentlyUploading = YES;
[self uploadImage:newImage];
}
}];
});
}
}
I had a similar problem and was tearing my hair out trying to figure it out.
It turns out that, while I thought I had set up a singleton for ALAssetsLibrary, my code was not calling it properly, and some ALAssets were returning an empty fullResolutionImage.
In all of my NSLogs I must have missed the most important message from Xcode:
"invalid attempt to access ALAssetPrivate past the lifetime of its owning ALAssetsLibrary"
Follow this link
http://www.daveoncode.com/2011/10/15/solve-xcode-error-invalid-attempt-to-access-alassetprivate-past-the-lifetime-of-its-owning-alassetslibrary/
I hope that helps
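For reference, the usual fix is a long-lived ALAssetsLibrary that outlives every ALAsset read from it — for example, a shared instance (a sketch; the method name is illustrative):

```objc
// Any ALAsset is only valid while its owning ALAssetsLibrary is alive,
// so keep one library around for the lifetime of the app (or upload queue).
+ (ALAssetsLibrary *)sharedLibrary {
    static ALAssetsLibrary *library = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        library = [[ALAssetsLibrary alloc] init];
    });
    return library;
}
```

Resolving assets via this shared library (rather than a temporary one that gets deallocated) avoids the hang in -metadata and -fullResolutionImage.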

how load many photos from url in background (asynchronous)

I have this method that I use to load many images into a scroll view, but while the images load from the URL my scroll view is stuck, and I can't work out how to load them in the background so the user doesn't feel it.
This method is called several times (8) in a for loop.
- (void)loadPhotosToLeftscroll{
//getting image information from JSON
NSMutableDictionary *photoDict;
photoDict = [leftPhotoArray lastObject];
//getting the photo
NSString *photoPath = [photoDict objectForKey:@"photos_path"];
NSLog(@"photo Path:%@", photoPath);
NSData * imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:photoPath]];
UIImage *image = [UIImage imageWithData:imageData];
// use the image how you like, say, as your button background
//calculating the hight of next photo
UIImageView *leftImage = [leftBlockScroll.subviews lastObject];
//allocating photoView
UIImageView *photoView = [[UIImageView alloc]initWithFrame:CGRectMake(5 , leftImage.frame.origin.y + leftImage.frame.size.height+5, image.size.width/2, image.size.height/2 )];
photoView.userInteractionEnabled=YES;
[photoView.layer setMasksToBounds:YES];
[photoView.layer setCornerRadius:3];
//getting items list
NSDictionary *sh_items = [photoDict objectForKey:#"items"];
//adding image button
UIButton *imageOverButton = [UIButton buttonWithType:UIButtonTypeCustom];
imageOverButton.frame = CGRectMake(0, 0, photoView.frame.size.width, photoView.frame.size.height);
[imageOverButton addTarget:self action:@selector(LeftimagePressed:) forControlEvents:UIControlEventTouchUpInside];
[imageOverButton setTag:[leftPhotoArray count]-1];
[photoView addSubview:imageOverButton];
//adding sh button to imageView
[self addThe_sh_signs:sh_items To_ImageView:photoView];
//subViewing the image to the scrollView
[self insert_Image:image toImageView:photoView in_Scroll:leftBlockScroll];
//calclulating the position of next imageView in scroll.
nextLeftPhotoHight = photoView.frame.size.height + photoView.frame.origin.y + 5;
//calculating the hight of the highest scroll view in both.
leftBlockScroll.contentSize = CGSizeMake(160, [self theSizeOfScrollViewHight]);
rightBlocScroll.contentSize = CGSizeMake(160, [self theSizeOfScrollViewHight]);
isLoadindContant = NO;
[self.view reloadInputViews];
[leftBlockScroll reloadInputViews];
}
Please do not just send me a link explaining how to use asynchronous loading.
Try to explain in terms of the method you see here.
I'm here for any questions you need to ask in order to help me.
You will have to do it asynchronously in a proper way; I do not think there is any way around that. I subclassed a UIImageView object and placed many instances of it within the cells of a table (in your case, within the scroll view). The subclass objects are initialized with a URL and load their image asynchronously (with some caching, so that the image is not loaded every time).
This tutorial helped me much in the beginning:
http://www.markj.net/iphone-asynchronous-table-image/
You will just have to adapt that to your scroll view. The underlying principle remains the same.
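Applied to the method above, the smallest change is to move only the blocking download off the main thread and hop back onto the main queue for all the UIKit work. A sketch (the queue choice is illustrative; note that everything that depends on `image.size`, including creating `photoView`, must also move inside the main-queue block):

```objc
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    // Blocking download — safe here because we are off the main thread,
    // so the scroll view keeps responding while the data arrives.
    NSData *imageData = [[NSData alloc] initWithContentsOfURL:[NSURL URLWithString:photoPath]];
    UIImage *image = [UIImage imageWithData:imageData];

    dispatch_async(dispatch_get_main_queue(), ^{
        // All UIImageView / UIScrollView work stays on the main thread.
        [self insert_Image:image toImageView:photoView in_Scroll:leftBlockScroll];
    });
});
```

The tutorial linked above does essentially this, plus caching, inside a UIImageView subclass.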

Using in-memory UIWebView to generate PDF in PhoneGap

I'm trying to work out how to do this.
NOTE: I'm not an experienced objective-c developer (hence why I'm using PhoneGap in the first place)
The short of it: my UIWebView (no, not the PhoneGap one that renders the webapp; a second UIWebView created in memory and not visible) is not rendering into the PDF. I just get a blank PDF. I'll post some of my thinking and code, and hopefully someone will know what I'm doing wrong.
My starting place is that there is already a print plugin for PhoneGap here:
https://github.com/phonegap/phonegap-plugins/tree/master/iPhone/PrintPlugin
This plugin creates a UIWebView on-the-fly, you pass some HTML to it via JavaScript, and then it calls some print controller to do the printing.
So I borrowed some ideas from that. Then I noticed this awesome blog post on generating PDFs:
http://www.ioslearner.com/convert-html-uiwebview-pdf-iphone-ipad/
So I'm trying to combine the two into my own PhoneGap plugin for taking some HTML (from my webapp) and generating a PDF on-the-fly.
HEADER:
#import <Foundation/Foundation.h>
#import <QuartzCore/QuartzCore.h>
#ifdef PHONEGAP_FRAMEWORK
#import <PhoneGap/PGPlugin.h>
#else
#import "PGPlugin.h"
#endif
@interface ExportPlugin : PGPlugin <UIWebViewDelegate> {
NSString* exportHTML;
}
@property (nonatomic, copy) NSString* exportHTML;
//This gets called from my HTML5 app (JavaScript):
- (void) exportPdf:(NSMutableArray*)arguments withDict:(NSMutableDictionary*)options;
@end
MAIN:
#import "ExportPlugin.h"
@interface ExportPlugin (Private)
-(void) doExport;
-(void) drawPdf;
@end
@implementation ExportPlugin
@synthesize exportHTML;
- (void) exportPdf:(NSMutableArray*)arguments withDict:(NSMutableDictionary*)options{
NSUInteger argc = [arguments count];
if (argc < 1) {
return;
}
self.exportHTML = [arguments objectAtIndex:0];
[self doExport];
}
int imageName = 0;
double webViewHeight = 0.0;
- (void) doExport{
//Set the base URL to be the www directory.
NSString *dbFilePath = [[NSBundle mainBundle] pathForResource:@"www" ofType:nil];
NSURL *baseURL = [NSURL fileURLWithPath:dbFilePath];
//Load custom html into a webview
UIWebView *webViewExport = [[UIWebView alloc] init];
webViewExport.delegate = self;
//[webViewExport loadHTMLString:exportHTML baseURL:baseURL];
[webViewExport loadHTMLString:@"<html><body><h1>testing</h1></body></html>" baseURL:baseURL];
}
- (BOOL)webView:(UIWebView *)theWebView shouldStartLoadWithRequest:(NSURLRequest *)request navigationType:(UIWebViewNavigationType)navigationType
{
return YES;
}
- (void)webViewDidFinishLoad:(UIWebView *)webViewExport
{
webViewHeight = [[webViewExport stringByEvaluatingJavaScriptFromString:@"document.body.scrollHeight;"] integerValue];
CGRect screenRect = webViewExport.frame;
//WHY DO I HAVE TO SET THE SIZE? OTHERWISE IT IS 0
screenRect.size.width = 768;
screenRect.size.height = 1024;
double currentWebViewHeight = webViewHeight;
while (currentWebViewHeight > 0)
{
imageName ++;
UIGraphicsBeginImageContext(screenRect.size);
CGContextRef ctx = UIGraphicsGetCurrentContext();
//[[UIColor blackColor] set];
//CGContextFillRect(ctx, screenRect);
[webViewExport.layer renderInContext:ctx];
UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *pngPath = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"%d.png", imageName]];
if(currentWebViewHeight < 960)
{
CGRect lastImageRect = CGRectMake(0, 960 - currentWebViewHeight, webViewExport.frame.size.width, currentWebViewHeight);
CGImageRef imageRef = CGImageCreateWithImageInRect([newImage CGImage], lastImageRect);
newImage = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);
}
[UIImagePNGRepresentation(newImage) writeToFile:pngPath atomically:YES];
[webViewExport stringByEvaluatingJavaScriptFromString:@"window.scrollBy(0,960);"];
currentWebViewHeight -= 960;
}
[self drawPdf];
}
- (void) drawPdf
{
CGSize pageSize = CGSizeMake(612, webViewHeight);
NSString *fileName = @"Demo.pdf";
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *pdfFileName = [documentsDirectory stringByAppendingPathComponent:fileName];
UIGraphicsBeginPDFContextToFile(pdfFileName, CGRectZero, nil);
// Mark the beginning of a new page.
UIGraphicsBeginPDFPageWithInfo(CGRectMake(0, 0, pageSize.width, pageSize.height), nil);
double currentHeight = 0.0;
for (int index = 1; index <= imageName ; index++)
{
NSString *pngPath = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"%d.png", index]];
UIImage *pngImage = [UIImage imageWithContentsOfFile:pngPath];
[pngImage drawInRect:CGRectMake(0, currentHeight, pageSize.width, pngImage.size.height)];
currentHeight += pngImage.size.height;
}
UIGraphicsEndPDFContext();
}
@end
The first indication something is not right, is above I have to set the UIWebView.frame size:
screenRect.size.width = 768;
screenRect.size.height = 1024;
But why? The PhoneGap PrintPlugin doesn't have to do this. If I don't set it, the size is 0, and then I get lots of context errors.
And then the next problem is that the UIWebView is not rendering anything. A symptom of the first problem perhaps?
How do I go about debugging this and working out what the problem is?
UPDATE
I'm pretty sure that it may be impossible to render the UIWebView layer into the image context, unless that UIWebView is actually visible.
I'm not sure how the PhoneGap PrintPlugin works, then. It seems to render its UIWebView quite fine without it being visible.
I'm currently experimenting with rendering the actual PhoneGap UIWebView into the PDF (as opposed to my own UIWebView). But this is not ideal.
It means I have to hide all toolbars and whatnot, and then pan the UIWebView around so I capture everything outside the viewport. This is not ideal, because the user will see it happening!
Point 1 above doesn't seem to work anyway, because the iPad is too slow to update the screen when you fiddle with the layout dynamically. If you do visual things very quickly (like panning the screen around), the iPad just won't show them; you only see the end state. So when I take the screenshots, the screen visually hasn't panned, even though the DOM says it has. (Hope that makes sense.)
Agh, frustrating.
I've got a working solution now, but it's not ideal.
What I do is render the phonegap UIWebView into the PDF.
To do this is quite tricky. I have a couple of objective-c functions
- (void) takeScreenshot;
- (void) renderPdf;
that I call from Javascript.
Then I have to write a recursive JS algorithm that pans the screen in every direction whilst calling takeScreenshot.
In between calls to takeScreenshot I use setTimeout which gives a 20 millisecond break in the JS processing - enough time for the iPad to update the screen so the next screenshot can be taken.
It was a royal pain in the arse. Bounty is still open in case someone knows of a better way of dealing with this - I would be very curious to know!
If you want to render a UIWebView into a PDF, I think you could go for this :
1/ use the convertRect:fromView: method implemented by your UIWebView object to get the CGRect
2/ see the UIPrintPageRenderer Class Reference to make something like a print preview
3/ use UIGraphicsGetCurrentContext to get the CGContextRef out of it
4/ create the PDF from the CGRect and CGContextRef (you can use the help provided in the Apple sample code ZoomingPDFViewer for building PDFs using CGPDF)
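Steps 2–4 can be sketched with UIPrintPageRenderer's viewPrintFormatter, which paginates a web view into a PDF without any screenshot loop or visible panning. A sketch, assuming a loaded UIWebView (the page size here is US Letter; `"paperRect"`/`"printableRect"` are the renderer's KVC keys):

```objc
// Render a UIWebView's content into paginated PDF data in memory.
- (NSData *)pdfDataFromWebView:(UIWebView *)webView {
    UIPrintPageRenderer *renderer = [[UIPrintPageRenderer alloc] init];
    [renderer addPrintFormatter:webView.viewPrintFormatter startingAtPageAtIndex:0];

    CGRect page = CGRectMake(0, 0, 612, 792); // 8.5" x 11" at 72 dpi
    [renderer setValue:[NSValue valueWithCGRect:page] forKey:@"paperRect"];
    [renderer setValue:[NSValue valueWithCGRect:page] forKey:@"printableRect"];

    NSMutableData *pdfData = [NSMutableData data];
    UIGraphicsBeginPDFContextToData(pdfData, CGRectZero, nil);
    for (NSInteger i = 0; i < renderer.numberOfPages; i++) {
        UIGraphicsBeginPDFPage();
        [renderer drawPageAtIndex:i inRect:UIGraphicsGetPDFContextBounds()];
    }
    UIGraphicsEndPDFContext();
    return pdfData;
}
```

The resulting NSData can then be written to the Documents directory, much like the Demo.pdf example above.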
Here is how to render PDF in a UIWebView (webPage being your UIWebView), your delegate (here "self") could implement the UIWebViewDelegate protocol :
- (void)loadingPDFwithURL:(NSURL *)anURL {
CGRect appFrame = [[UIScreen mainScreen] applicationFrame];
appFrame.origin.y = 0;
self.webPage = [[UIWebView alloc] initWithFrame:appFrame];
[self.webPage setScalesPageToFit:YES];
[self.webPage setDelegate:self];
NSURLRequest *requestObj = [NSURLRequest requestWithURL:anURL];
[self.webPage loadRequest:requestObj];
[self.view addSubview:self.webPage];
}
So you're giving the URL of the PDF (if it's in memory, just write the file in your application as the URL can be a filepath).
Another possibility is to dive into CGPDF, but that gets harder.