Objective C iOS WKWebView Background - objective-c

Hopefully someone can help.
I've got an application that currently uses UIWebView to play audio, which keeps working even when the device is locked, as I enable mediaPlaybackAllowsAirPlay.
However, I've started moving the code over to the new WKWebView and enabled allowsAirPlayForMediaPlayback, yet when I press the lock button, the WKWebView stops playing the audio.
Any ideas how I can get WKWebView to behave like UIWebView?
Update: here's an example of my code:
WKUserContentController *wkUController = [[WKUserContentController alloc] init];
WKWebViewConfiguration *theConfiguration = [[WKWebViewConfiguration alloc] init];
theConfiguration.userContentController = wkUController;
theConfiguration.allowsAirPlayForMediaPlayback = YES;
webViewActive = [[WKWebView alloc] initWithFrame:viewForWeb.frame configuration:theConfiguration];
webViewActive.navigationDelegate = self;
webViewActive.frame = viewForWeb.bounds;
[webViewActive setAutoresizingMask: UIViewAutoresizingFlexibleHeight | UIViewAutoresizingFlexibleWidth];
[viewForWeb addSubview:webViewActive];
NSURL *nsurl = [NSURL URLWithString:@"URL TO AUDIO"];
NSURLRequest *nsrequest = [NSURLRequest requestWithURL:nsurl];
[webViewActive loadRequest:nsrequest];
Thanks
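For what it's worth, audio continuing while the device is locked also depends on app-level configuration, not just the web view: the app generally needs the "audio" entry in its Info.plist UIBackgroundModes and an active playback audio session. A minimal sketch of the audio session setup, assuming that background mode is enabled:
// Requires AVFoundation (#import <AVFoundation/AVFoundation.h>).
// Activate a playback session so audio can continue when the screen locks.
NSError *sessionError = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&sessionError];
[[AVAudioSession sharedInstance] setActive:YES error:&sessionError];
if (sessionError) {
NSLog(@"Audio session setup failed: %@", sessionError);
}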

Related

Handling UIActivityViewController

First, I couldn't find anyone else having this problem. I'm working on a game in SpriteKit; there are mainTitle.h/.m and gamePlay.h/.m files. Below is the code for the share button to share your progress via text, Facebook, Twitter, etc. The code is located in gamePlay.m inside a touch method. The code works; however, after the user chooses to send his/her score via text message, the new message window slides up, and then the game appears to restart and load the mainTitle.m scene. Any ideas as to why this happens?
-(void)share {
UIGraphicsBeginImageContextWithOptions(self.view.bounds.size, NO, 1.0);
[self.view drawViewHierarchyInRect:self.view.bounds afterScreenUpdates:YES];
UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
NSString *message = [NSString stringWithFormat:@"message"];
NSString *urlString = [NSString stringWithFormat:@"www..."];
NSURL *gmURL = [NSURL URLWithString:urlString];
UIActivityViewController *actVC = [[UIActivityViewController alloc]
initWithActivityItems:@[message, gmURL, image] applicationActivities:nil];
actVC.excludedActivityTypes = @[UIActivityTypePrint, UIActivityTypeAirDrop];
UIViewController *viewControl = self.view.window.rootViewController;
[viewControl presentViewController:actVC animated:YES completion:nil];
}
-(void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {
[self share];
}
It's probably not a good idea to call the share method from touchesBegan:, since it can fire multiple times under some conditions. Use a UIButton wired to the share method instead.
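A minimal sketch of that wiring (button geometry and title are placeholders; in a SpriteKit scene, self.view is the SKView presenting it):
// Create a button that calls the existing share method exactly once per tap.
UIButton *shareButton = [UIButton buttonWithType:UIButtonTypeSystem];
shareButton.frame = CGRectMake(20.0, 20.0, 80.0, 44.0);
[shareButton setTitle:@"Share" forState:UIControlStateNormal];
[shareButton addTarget:self action:@selector(share) forControlEvents:UIControlEventTouchUpInside];
[self.view addSubview:shareButton];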

How to switch camera mode to video mode using GPUImage in iOS

I am making an app like Instagram, using the GPUImage framework. I have to take photos and videos and share them. I am able to capture photos using this framework, and now I have to capture video, but I am struggling with how to change the camera mode from photo to video. Any help or tutorial would be very good for me. I used this code for photo mode:
if([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera])
{
self.imagepicker.sourceType = UIImagePickerControllerSourceTypeCamera;
[[NSBundle mainBundle] loadNibNamed:@"OverlayView" owner:self options:nil];
self.overlayView.frame = self.imagepicker.cameraOverlayView.frame;
self.imagepicker.cameraOverlayView = self.overlayView;
self.overlayView = nil;
CGSize result = [[UIScreen mainScreen] bounds].size;
self.imagepicker.showsCameraControls = NO;
self.imagepicker.allowsEditing = NO;
self.imagepicker.wantsFullScreenLayout = NO;
// self.imagepicker.mediaTypes = [[NSArray alloc] initWithObjects: (NSString *) kUTTypeMovie, nil];
}
else{
self.imagepicker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
}
In my case, I'm using GPUImage to do both (pictures and videos). Therefore I've created two objects: one of type GPUImageStillCamera (pictures) and another of type GPUImageVideoCamera (videos).
So whenever you need to switch between cameras, you basically stop the GPUImageStillCamera capture and initialize a video camera (note that you have to adapt this snippet to your project):
func initializeVideoCamera() {
// Stop the capture of GPUImageStillCamera
stillCamera.stopCameraCapture()
videoCamera = GPUImageVideoCamera.init(sessionPreset: AVCaptureSessionPreset1920x1080, cameraPosition: .Back)
videoCamera?.outputImageOrientation = .Portrait
videoCamera?.addTarget(filter)
// If a file already exists, AVAssetWriter won't let you record new frames, so delete the old movie
unlink(pathToMovieFile)
initializeWriteWithPath(pathToMovieFile)
videoCamera?.startCameraCapture()
}
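A rough Objective-C sketch of the reverse switch (video back to stills) for comparison; stillCamera, videoCamera, and filter mirror the names in the snippet above and are assumptions:
// Stop video capture and detach its targets.
[videoCamera stopCameraCapture];
[videoCamera removeAllTargets];
// Re-create the still camera and route it through the same filter.
stillCamera = [[GPUImageStillCamera alloc] initWithSessionPreset:AVCaptureSessionPresetPhoto cameraPosition:AVCaptureDevicePositionBack];
stillCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
[stillCamera addTarget:filter];
[stillCamera startCameraCapture];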

MPMoviePlayerController not working properly

I am making an application in which I am trying to play a video. The video starts properly, but the screen turns black after 4 seconds. I don't know what the problem is.
Also, when I set player.moviePlayer.shouldAutoplay = NO, the line has no effect; the video still starts automatically.
Here is Code:
NSString *urlString = [[NSBundle mainBundle] pathForResource:@"Movie" ofType:@"m4v"];
NSURL *urlObj = [NSURL fileURLWithPath:urlString];
UIGraphicsBeginImageContext(CGSizeMake(1,1));
MPMoviePlayerViewController *player = [[MPMoviePlayerViewController alloc] initWithContentURL:urlObj];
UIGraphicsEndImageContext();
[player.view setBounds:self.view.bounds];
// when playing from a server, the source type should be MPMovieSourceTypeStreaming
[player.moviePlayer setMovieSourceType:MPMovieSourceTypeStreaming];
[player.moviePlayer setScalingMode:MPMovieScalingModeAspectFill];
player.moviePlayer.shouldAutoplay = NO;
[self.view addSubview:player.view];
[player.moviePlayer play];
Am I missing something here?
I also tried to get the total duration of the video (using the duration property of MPMoviePlayerController), but it shows 0.0. How do I get the duration of the video? Here is my updated code:
NSString *urlString = [[NSBundle mainBundle] pathForResource:@"Movie" ofType:@"m4v"];
NSURL *urlObj = [NSURL fileURLWithPath:urlString];
UIGraphicsBeginImageContext(CGSizeMake(1,1));
MPMoviePlayerViewController *player = [[MPMoviePlayerViewController alloc] initWithContentURL:urlObj];
UIGraphicsEndImageContext();
[player.view setBounds:self.view.bounds];
// when playing from a server, the source type should be MPMovieSourceTypeStreaming
[player.moviePlayer setMovieSourceType:MPMovieSourceTypeStreaming]; // I was missing this line, so the video was not playing
[player.moviePlayer setScalingMode:MPMovieScalingModeAspectFill];
[self.view addSubview:player.view];
[player.moviePlayer play];
There are several issues here:
For this type of usage (integrating the player into your view), you should be using MPMoviePlayerController, not MPMoviePlayerViewController. Use MPMoviePlayerViewController when you want to have a self-contained view controller which can be presented using presentMoviePlayerViewControllerAnimated:.
Assuming you are using ARC, the main problem is that nothing is keeping a reference to your player object. As a consequence, the player is disappearing shortly after you create it. You should keep a reference to it by assigning it to a property or instance variable of your view controller.
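For instance, a minimal sketch (the property name is an assumption):
// In the view controller's interface: a strong reference keeps the player alive.
@property (nonatomic, strong) MPMoviePlayerController *moviePlayer;
// Later, e.g. in viewDidLoad:
self.moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:urlObj];
self.moviePlayer.view.frame = self.view.bounds;
[self.view addSubview:self.moviePlayer.view];
[self.moviePlayer play];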
For a full example of this, see Till's excellent answer to a similar question.
I'm not sure what the intended purpose of your UIGraphicsBeginImageContext and UIGraphicsEndImageContext calls is, but I can't see that they're needed here.
As for shouldAutoplay = NO, the video is still starting because you are calling play immediately afterwards.
The player's duration property only contains a useful value after a MPMovieDurationAvailableNotification has been received. You'll need to do something similar to the following to have access to the actual duration:
__weak MediaPlayerController *weakSelf = self;
[[NSNotificationCenter defaultCenter] addObserverForName:MPMovieDurationAvailableNotification object:self.player queue:[NSOperationQueue mainQueue] usingBlock:^(NSNotification *note) {
NSLog(@"Movie duration: %lf", weakSelf.player.duration);
}];
Note that the block-based addObserverForName:object:queue:usingBlock: returns an opaque observer token; keep a reference to it and pass it to removeObserver: when you are done, for example:
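(A sketch, assuming a durationObserver property such as @property (nonatomic, strong) id durationObserver; to hold the token.)
// Keep the token returned when registering:
self.durationObserver = [[NSNotificationCenter defaultCenter] addObserverForName:MPMovieDurationAvailableNotification object:self.player queue:[NSOperationQueue mainQueue] usingBlock:^(NSNotification *note) {
NSLog(@"Movie duration: %lf", weakSelf.player.duration);
}];
// Remove it once you no longer need the callback:
[[NSNotificationCenter defaultCenter] removeObserver:self.durationObserver];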

Game Center Matchmaking Doesn't Work

Here is my Menu.m's onEnter method:
GKMatchRequest *request = [[[GKMatchRequest alloc] init] autorelease];
request.minPlayers = 2;
request.maxPlayers = 2;
GKMatchmakerViewController *mmvc = [[[GKMatchmakerViewController alloc] initWithMatchRequest:request] autorelease];
mmvc.matchmakerDelegate = self;
tempVC = [[UIViewController alloc] init];
[[[CCDirector sharedDirector] view] addSubview:tempVC.view];
[tempVC presentModalViewController: mmvc animated: NO];
mmvc.view.frame = CGRectMake(150, 150, 510, 420);
I tried with a device and the simulator, but they couldn't match.
I also tried to do it by following Ray Wenderlich's tutorial, but then even the GKMatchmakerViewController didn't show up. I don't know what I'm doing wrong.
Thanks in advance.
Are you using two different Game Center IDs -- one for your simulator and one for your device?
Is your device logged into the Game Center sandbox?
The simulator will use the Game Center sandbox, and your device must also use the sandbox for the two to connect.
You can log into the Game Center sandbox by logging out of Game Center and then starting your app. Your app should ask you to log into Game Center. Performing the login through your non-released app will put you in the Game Center sandbox.
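Also make sure the local player is authenticated before presenting the GKMatchmakerViewController; a minimal sketch (how you present the login controller depends on your setup, so the rootViewController route here is an assumption):
[GKLocalPlayer localPlayer].authenticateHandler = ^(UIViewController *loginVC, NSError *error) {
if (loginVC != nil) {
// Game Center wants the user to log in (the sandbox, during development).
UIViewController *rootVC = [UIApplication sharedApplication].keyWindow.rootViewController;
[rootVC presentViewController:loginVC animated:YES completion:nil];
} else if ([GKLocalPlayer localPlayer].isAuthenticated) {
// Safe to create the GKMatchRequest and GKMatchmakerViewController now.
} else {
NSLog(@"Game Center unavailable: %@", error);
}
};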

Image filtering effects using BradLarson/GPUImage: issue

I want to add different effects to a UIImage. Currently I am trying to use the GPUImage library: https://github.com/BradLarson/GPUImage
But I am unable to run most of the examples provided. The only working example for me is FilterShowcase, which works great.
The problem is that it works only with the camera. What I want is to load a static image into a UIImageView and apply those filters.
I tried, but was unable to do it.
This is how I tried it:
- (void)setupFilter;
{
// videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
//// videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionFront];
// videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
UIImageView *imgView = [[UIImageView alloc] initWithFrame:CGRectMake(0, 0, 320, 350)];
imgView.image = [UIImage imageNamed:@"WID-small.jpg"];
[self.view addSubview:imgView];
BOOL needsSecondImage = NO;
switch (filterType)
{
case GPUIMAGE_SEPIA:
{
self.title = @"Sepia Tone";
self.filterSettingsSlider.hidden = NO;
[self.filterSettingsSlider setValue:1.0];
[self.filterSettingsSlider setMinimumValue:0.0];
[self.filterSettingsSlider setMaximumValue:1.0];
filter = [[GPUImageSepiaFilter alloc] init];
sourcePicture = [[GPUImagePicture alloc] initWithImage:imgView.image smoothlyScaleOutput:YES];
[sourcePicture processImage];
[sourcePicture addTarget:filter];
}; break;
Actually, what I tried to do is load a still image instead of using the videoCamera, but it doesn't work.
If anyone can help me, it's highly appreciated.
First, there are a couple of reasons why the other sample applications might not be building for you. As is stated in this answer, make sure that you have the scheme for the application selected in the upper left of your Xcode window, not the GPUImage framework project. If changing that doesn't help, exit out of Xcode, delete the relevant project directories from your DerivedData directory, and restart Xcode. That sometimes seems to be needed due to a bug in recent Xcode versions.
The filtering of an image is described in the documentation for the project, which is in the Readme.md file and on the page you link above. In particular, see the section titled "Processing a still image":
There are a couple of ways to process a still image and create a result. The first way you can do this is by creating a still image source object and manually creating a filter chain:
UIImage *inputImage = [UIImage imageNamed:@"Lambeau.jpg"];
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:inputImage];
GPUImageSepiaFilter *stillImageFilter = [[GPUImageSepiaFilter alloc] init];
[stillImageSource addTarget:stillImageFilter];
[stillImageSource processImage];
UIImage *currentFilteredVideoFrame = [stillImageFilter imageFromCurrentlyProcessedOutput];
For single filters that you wish to apply to an image, you can simply do the following:
GPUImageSepiaFilter *stillImageFilter2 = [[GPUImageSepiaFilter alloc] init];
UIImage *quickFilteredImage = [stillImageFilter2 imageByFilteringImage:inputImage];
The SimpleImageFilter example application shows how to do this in more detail.
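Applied to the code in your question, that second approach might look like this (a sketch reusing the imgView and image name from your snippet):
UIImage *inputImage = [UIImage imageNamed:@"WID-small.jpg"];
GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
// Filter the still image and display the result in the existing UIImageView.
imgView.image = [sepiaFilter imageByFilteringImage:inputImage];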