iPad video loading issue - Objective-C

I am developing a swipe-based application. On each swipe, a page appears containing a video and some other assets. I am using the following code to do this on each swipe:
NSArray *file = [videoFile componentsSeparatedByCharactersInSet:[NSCharacterSet characterSetWithCharactersInString:@"."]];
NSString *moviePath = [[NSBundle mainBundle] pathForResource:[file objectAtIndex:0] ofType:[file objectAtIndex:1]];
if (moviePath != nil)
{
    //self.videoTimer = [NSTimer scheduledTimerWithTimeInterval:delayTime target:self selector:@selector(playVideo:) userInfo:nil repeats:NO];
    self.theMovie = [[MPMoviePlayerViewController alloc] initWithContentURL:[NSURL fileURLWithPath:moviePath]];
    self.theMovie.view.frame = self.bounds;
    self.theMovie.moviePlayer.scalingMode = MPMovieScalingModeAspectFit;
    self.theMovie.moviePlayer.controlStyle = MPMovieControlStyleNone;
    self.theMovie.moviePlayer.movieSourceType = MPMovieSourceTypeFile;
    [self addSubview:self.theMovie.view];
    [self.theMovie release];
}
else
{
    [AssetValidator alertMissingFileInfo:videoFile];
}
This works fine.
My problem is that on each swipe the video starts with a delay and a black screen.
Please guide me on how to solve this problem.
Regards.

How big are the movies? If they are small, there is a solution, though it's quite irrational: load all the movies into memory during app startup and keep them there at runtime. If the movies are quite big, the delay will happen anyway.

You can show a background image while the video loads to avoid the black screen.
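Another option that may reduce the delay itself is to pre-buffer the player before the page becomes visible. A minimal sketch, assuming the player can be created one swipe ahead of time; prepareToPlay and the load-state notification are standard MPMoviePlayerController API, but whether they remove the delay for your file sizes is untested:
// When building the page, create the player early and let it buffer:
self.theMovie = [[MPMoviePlayerViewController alloc] initWithContentURL:[NSURL fileURLWithPath:moviePath]];
[self.theMovie.moviePlayer prepareToPlay];
[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(loadStateDidChange:)
                                             name:MPMoviePlayerLoadStateDidChangeNotification
                                           object:self.theMovie.moviePlayer];

// Only attach and play once the movie reports it is playable:
- (void)loadStateDidChange:(NSNotification *)notification
{
    if (self.theMovie.moviePlayer.loadState & MPMovieLoadStatePlayable)
    {
        [self addSubview:self.theMovie.view]; // view is ready to render, no black frame
        [self.theMovie.moviePlayer play];
    }
}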

Related

MPNowPlayingInfoCenter doesn't play on lock screen

I'm using MPMoviePlayerController to play audio/video and would like playback to continue when the app backgrounds and the lock screen comes on. I'm setting MPNowPlayingInfoCenter's nowPlayingInfo in viewDidLoad and again on the UIApplicationWillResignActiveNotification notification with the following:
- (void)appWillResignActive {
    NSMutableDictionary *nowPlayingInfo = [[NSMutableDictionary alloc] init];
    [nowPlayingInfo setObject:_session[@"session"][@"title"] forKey:MPMediaItemPropertyTitle];
    [nowPlayingInfo setObject:_session[@"session"][@"author"] forKey:MPMediaItemPropertyArtist];
    [nowPlayingInfo setObject:_session[@"session"][@"album"] forKey:MPMediaItemPropertyAlbumTitle];
    [nowPlayingInfo setObject:[NSNumber numberWithDouble:_videoController.playableDuration] forKey:MPMediaItemPropertyPlaybackDuration];
    [nowPlayingInfo setObject:@(1.0f) forKey:MPNowPlayingInfoPropertyPlaybackRate];
    [nowPlayingInfo setObject:[NSNumber numberWithDouble:_videoController.currentPlaybackTime] forKey:MPNowPlayingInfoPropertyElapsedPlaybackTime];
    if (_session[@"session"][@"album_art"] != nil) {
        UIImage *albumArtImage = [UIImage imageNamed:_session[@"session"][@"album_art"]];
        MPMediaItemArtwork *albumArt = [[MPMediaItemArtwork alloc] initWithImage:albumArtImage];
        [nowPlayingInfo setObject:albumArt forKey:MPMediaItemPropertyArtwork];
    }
    [[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:nowPlayingInfo];
}
The lock screen shows the current info, but it isn't playing at all. The elapsed time seems about right, but the total playable time is incorrect. Pressing play doesn't start it either. I'm not sure what's wrong. Any thoughts?
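Setting nowPlayingInfo alone does not keep audio running under the lock screen. A minimal sketch of the setup that background playback usually also requires (an audio background mode, a playback audio session, and remote-control events); whether this is the cause here is an assumption:
// 1. In Info.plist, UIBackgroundModes must include "audio".
// 2. Activate a playback audio session so audio survives backgrounding:
NSError *sessionError = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&sessionError];
[[AVAudioSession sharedInstance] setActive:YES error:&sessionError];

// 3. Opt in to the lock-screen transport controls:
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
[self becomeFirstResponder]; // the responder must also return YES from canBecomeFirstResponder
Note also that MPMediaItemPropertyPlaybackDuration is being filled from playableDuration (how much has buffered so far) rather than the player's duration property, which would explain the incorrect total time.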

iPad capturing 16:9 photos

I am building a prototype app on iOS, and I’m cannibalizing some Apple sample code to do it (thin ice, I know—this code uses goto statements :\ ). I am using the AVCam project from Session 520 - What's New in Camera Capture. I don’t need video capture capability, just still photos.
The device inputs and outputs are set up thusly:
// Init the device inputs
AVCaptureDeviceInput *newVideoInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self backFacingCamera] error:nil];
AVCaptureDeviceInput *newAudioInput = [[AVCaptureDeviceInput alloc] initWithDevice:[self audioDevice] error:nil];

// Setup the still image file output
AVCaptureStillImageOutput *newStillImageOutput = [[AVCaptureStillImageOutput alloc] init];
NSDictionary *outputSettings = @{AVVideoCodecKey: AVVideoCodecJPEG};
[newStillImageOutput setOutputSettings:outputSettings];

// Create session (use default AVCaptureSessionPresetHigh)
AVCaptureSession *newCaptureSession = [[AVCaptureSession alloc] init];

// Add inputs and output to the capture session
if ([newCaptureSession canAddInput:newVideoInput]) {
    [newCaptureSession addInput:newVideoInput];
}
if ([newCaptureSession canAddInput:newAudioInput]) {
    [newCaptureSession addInput:newAudioInput];
}
if ([newCaptureSession canAddOutput:newStillImageOutput]) {
    [newCaptureSession addOutput:newStillImageOutput];
}
[self setStillImageOutput:newStillImageOutput];
[self setVideoInput:newVideoInput];
[self setAudioInput:newAudioInput];
[self setSession:newCaptureSession];
And here is the method that’s called when I tap the shutter button:
- (void)captureStillImage
{
    AVCaptureConnection *stillImageConnection = [[self stillImageOutput] connectionWithMediaType:AVMediaTypeVideo];
    if ([stillImageConnection isVideoOrientationSupported])
        [stillImageConnection setVideoOrientation:orientation];

    [[self stillImageOutput]
        captureStillImageAsynchronouslyFromConnection:stillImageConnection
        completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            ALAssetsLibraryWriteImageCompletionBlock completionBlock = ^(NSURL *assetURL, NSError *error) {
                if (error)
                {
                    if ([[self delegate] respondsToSelector:@selector(captureManager:didFailWithError:)])
                    {
                        [[self delegate] captureManager:self didFailWithError:error];
                    }
                }
            };
            if (imageDataSampleBuffer != NULL)
            {
                NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
                UIImage *image = [[UIImage alloc] initWithData:imageData];
                if ([self.delegate respondsToSelector:@selector(captureManagerCapturedImage:)])
                {
                    dispatch_async(dispatch_get_main_queue(), ^{
                        [self.delegate captureManagerCapturedImage:image];
                    });
                }
                [library writeImageToSavedPhotosAlbum:[image CGImage]
                                          orientation:(ALAssetOrientation)[image imageOrientation]
                                      completionBlock:completionBlock];
            }
            else
            {
                completionBlock(nil, error);
            }
            if ([[self delegate] respondsToSelector:@selector(captureManagerStillImageCaptured:)])
            {
                [[self delegate] captureManagerStillImageCaptured:self];
            }
        }];
}
This code successfully captures an image and saves it to the library. However, at some point while I was working on it, it changed from capturing 5-megapixel 4:3 images to capturing 1920x1080 16:9 images. I can’t find anywhere that the aspect ratio is specified, and I didn’t change any of the code relating to the configuration of the camera, capture sessions, or capture connection. Why did my camera start taking 16:9 photos?
Update: I just re-ran Apple’s original sample code, and it appears that it is also saving 16:9 images captured directly from the video. It is quite possible that I was insane before, or I took a test shot with Camera.app and was looking at that. So my real question is: how do I show a live feed from the camera on the screen while I’m shooting, and take a full-resolution photo?
Update 2: I was able to solve this by throwing out the AVCapture code I was using. It turns out that UIImagePickerController does what I needed. I didn’t realize you could overlay custom controls; I thought it took over the whole screen until you were done taking a picture.
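For reference, a minimal sketch of that overlay approach; the overlay view, button geometry, and presentation code are illustrative, not from the original project:
UIImagePickerController *picker = [[UIImagePickerController alloc] init];
picker.sourceType = UIImagePickerControllerSourceTypeCamera;
picker.showsCameraControls = NO; // hide the stock shutter UI
picker.delegate = self;

// Any view can sit on top of the live camera feed.
UIView *overlay = [[UIView alloc] initWithFrame:picker.view.bounds];
UIButton *shutter = [UIButton buttonWithType:UIButtonTypeRoundedRect];
[shutter setTitle:@"Shoot" forState:UIControlStateNormal];
shutter.frame = CGRectMake(130.0, 420.0, 60.0, 40.0);
[shutter addTarget:picker action:@selector(takePicture) forControlEvents:UIControlEventTouchUpInside];
[overlay addSubview:shutter];
picker.cameraOverlayView = overlay;

[self presentModalViewController:picker animated:YES];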
If you're capturing frames from a video source, you'll end up with the video's 16:9 aspect ratio. Capturing frames from a video source and taking photos are different things.
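If you do stay with AVFoundation, the usual way to get full-resolution 4:3 stills while keeping a live preview is to switch the session to the photo preset; a minimal sketch against the session variable from the question:
// AVCaptureSessionPresetHigh configures the session for video (16:9 frames).
// AVCaptureSessionPresetPhoto configures it for full-resolution still capture.
if ([newCaptureSession canSetSessionPreset:AVCaptureSessionPresetPhoto]) {
    newCaptureSession.sessionPreset = AVCaptureSessionPresetPhoto;
}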

iOS AVAudioPlayer double sound playing issue with Xcode 4.2 and ARC

I've been beating my head against the wall trying to figure out a weird sound issue in my program; I will post the code below. As a brief description: I have a function that gets passed a variable, and from that it decides which category of sound to play. Within each category I have multiple sounds, so I use arc4random and run the result through a switch statement.
What keeps happening is that it plays two sounds from the switch instead of just one. Most times, if I call it twice with my button push, the second time it plays the sound from the first call plus a new sound; other times it plays the same sound twice over the top of itself with a slight delay. I put a breakpoint in the switch, and when it double-plays it only goes through the switch once, which is really confusing. One thing to note: right before this I play another sound, but it uses a separate AVAudioPlayer and path variable, so that shouldn't be an issue; the doubled sound is never that other sound. I'm only calling the function once when I press the button, so I'm not sure why it does this.
I have tried putting the *path and *AVAudioPlayer variables inside the function, but then it won't play at all; searching here, it seems ARC deallocates the player before it gets a chance to play. I ended up putting them at the top of my .m file as global variables and just setting the actual path and playing the sound within the switch. The sound plays, but it plays twice. I tried making the AVAudioPlayer a property as well, and it does the same thing. Hopefully someone can help me out. Here is my code snippet, and thanks in advance...
in the .m file
// Just below my synthesize statements
NSString *path;
AVAudioPlayer *theSound;
// My code that I call when the button is pressed
[self playVoice:#"buyVoice"];
// The playVoice function
- (void)playVoice:(NSString *)voiceNum
{
    if ([voiceNum isEqualToString:@"buyVoice"]) // Bought an item, time for a voice
    {
        // play coin sound
        [self coinSound];
        // play random buy phrase and coin sound
        int phraseNumber = 0;
        phraseNumber = arc4random() % 3; // yields 0-2, so case 3 below is never reached
        switch (phraseNumber)
        {
            case 0:
            {
                path = [[NSBundle mainBundle] pathForResource:@"sndBuy1" ofType:@"m4a"];
                theSound = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL];
                [theSound setNumberOfLoops:0];
                [theSound play];
                break;
            }
            case 1:
            {
                path = [[NSBundle mainBundle] pathForResource:@"sndBuy2" ofType:@"m4a"];
                theSound = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL];
                [theSound setNumberOfLoops:0];
                [theSound play];
                break;
            }
            case 2:
            {
                path = [[NSBundle mainBundle] pathForResource:@"sndBuy3" ofType:@"m4a"];
                theSound = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL];
                [theSound setNumberOfLoops:0];
                [theSound play];
                break;
            }
            case 3:
            {
                path = [[NSBundle mainBundle] pathForResource:@"sndBuy4" ofType:@"m4a"];
                theSound = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL];
                [theSound setNumberOfLoops:0];
                [theSound play];
                break;
            }
        }
    }
}
Try resetting the player to the beginning before you play the sound:
[theSound setCurrentTime:0.0];
[theSound play];
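Another pattern worth trying under ARC is to keep the player in a strong property and factor the repeated switch bodies into one helper, so each press creates exactly one player and nothing is deallocated mid-play. A minimal sketch; playSoundNamed: is an illustrative helper, not from the original code:
@property (nonatomic, strong) AVAudioPlayer *theSound; // strong reference keeps the player alive under ARC

- (void)playSoundNamed:(NSString *)name
{
    NSURL *url = [[NSBundle mainBundle] URLForResource:name withExtension:@"m4a"];
    if (url == nil) return;
    self.theSound = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:NULL]; // replaces (and releases) any previous player
    [self.theSound play];
}

// In playVoice:, each case then collapses to one line, e.g.:
[self playSoundNamed:[NSString stringWithFormat:@"sndBuy%d", phraseNumber + 1]];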

iOS load an image based on user input

I'm fairly new to Objective-C and iOS development (coming from PHP) and I have a relatively simple question that I can't seem to find an answer to:
I am following along with an example of the split view design where a web page is loaded into the detail view when the user taps an item in the master view. I got all this working, but I would like to substitute an image for the web view, so I've amended the app to load a UIImage instead of a web view. What I'm looking for is the equivalent of this code:
NSString *urlString = [pagesAddress objectAtIndex:indexPath.row];
NSURL *url = [NSURL URLWithString:urlString];
// these 2 lines are where I get lost with the images.
NSURLRequest *request = [NSURLRequest requestWithURL:url];
[detailViewController.webView loadRequest:request];
I came up with this:
NSString *imageName = [pagesAddress objectAtIndex:indexPath.row];
UIImage *myImage = [UIImage imageNamed:imageName];
// missing the last 2 calls: one to tell Xcode that it's an image "request" I want, and one to load the actual image (based on its name, which is already in an array) into the UIImageView.
Thanks.
PS
I tried this:
NSString *imageName = [pagesAddress objectAtIndex:indexPath.row];
[detailViewController.imageView setImage:[UIImage imageNamed:imageName]];
And it shows just the first image, then crashes when I try to show the last one.
In the end, the solution was these two lines, once I amended the code:
NSString *imageName = [pagesAddress objectAtIndex:indexPath.row];
[detailViewController.imageView setImage:[UIImage imageNamed:imageName]];
Notice that I had to wrap the NSString in [UIImage imageNamed:] for setImage:, or Xcode would complain. It turns out it was crashing because, in the array of image names, I had put three images into one entry (basically, I forgot the commas!), so it was out of range.
Tim:
This line you gave me
UIImageView *imageView = [[UIImageView alloc] initWithFrame:self.view.bounds];
is unnecessary because I already have a view created; it would create another view which I never used. Also, building a CGRect seems like overkill if I already have a UIImageView placeholder, no?
In any case, it works now and I'm very grateful for all the help. iPad development with Objective-C is a very thorny road, and I expect I'll be bugging you guys some more.
Cheers.
Try this:
UIImage *myImage = [[UIImage alloc] initWithData:[NSData dataWithContentsOfURL:[NSURL URLWithString:urlString]]];
// don't know if you already got the following?
UIImageView *imageView = [[UIImageView alloc] initWithFrame:self.view.bounds];
[imageView setImage:myImage];
[self.view addSubview:imageView];
The first line is synchronous (= blocking), so in production you should load the data asynchronously instead, e.g. with NSURLConnection (but that's a bit more complicated).
Or use this for your local images:
UIImage *myImage = [UIImage imageNamed:imageName];
// Now, follow the same steps as in the first code-example, just skip the first line.
Try this (on iOS 4.0 and later):
// Execute a block of code on a background thread.
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^(void)
{
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    UIImage *image = [UIImage imageWithData:[NSData dataWithContentsOfURL:url]];
    // When IO is done and the image is created, set it on the main thread.
    dispatch_async(dispatch_get_main_queue(), ^(void)
    {
        imageView.image = image;
    });
    [pool release];
});

Help on Objective C? My app keeps crashing?

I am creating an app for the iPhone/iPod touch, but I keep running into a couple of major leaks that crash the app. In my game, right when I press play (from the home screen), it goes to another page that has the game on it. But right after it appears, I get "Warning: Memory level=1" in the console. What could be happening? Here is my viewDidLoad method:
- (void)viewDidLoad {
    [super viewDidLoad];
    array = [[NSMutableArray alloc] initWithCapacity:100];
    bulletArray = [[NSMutableArray alloc] initWithCapacity:100];
    pos = CGPointMake(0.0, -5.0);
    NSString *path = [[NSBundle mainBundle] pathForResource:@"GloriousMorning" ofType:@"mp3"];
    AVAudioPlayer *theAudio = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL];
    //float effects_Volume = [[NSUserDefaults standardUserDefaults] floatForKey:@"effectsVolume"];
    //theAudio.volume = effects_Volume;
    [theAudio play];
}
Also, a second question: since my game is a shooting game, the user presses a button titled "Fire". But every time I test my app on a device, it crashes when I press the Fire button. Here is my code for the Fire button:
- (IBAction)Fire {
    NSString *path = [[NSBundle mainBundle] pathForResource:@"gunShot" ofType:@"mp3"];
    AVAudioPlayer *theAudio = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL];
    //float effects_Volume = [[NSUserDefaults standardUserDefaults] floatForKey:@"effectsVolume"];
    //theAudio.volume = effects_Volume;
    [theAudio play];
    //IBOutlet UIImageView *newBullet = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"Bullet.png"]];
    UIImageView *newBullet = [[UIImageView alloc] initWithImage:[UIImage imageNamed:@"Bullet.png"]];
    newBullet.frame = CGRectMake(0, 0, 10.0, 10.0);
    newBullet.center = CGPointMake(239.0, 236.0);
    [bulletArray addObject:newBullet];
    [self.view addSubview:newBullet];
}
First, I create a sound. Then I place a bullet right where the gun is currently located and add it to an array, so that every 0.01 seconds, in another bit of code, I can run through the array and check every bullet for collisions.
Please tell me what I am doing wrong. Thanks!
The error that makes the app crash when I click the Fire button is this:
GDB: Data Formatters temporarily unavailable, will retry after a 'continue'(unknown error loading shared library "
Also, I think I am making a huge leak when I try to play the audio; at least, that's what someone told me. (If that is the case, please tell me how to fix it.)
You alloc'd theAudio, so you need to release it when you are done. It might be better to make it an ivar so you don't have to set up and tear down the audio every time they press the Fire button.
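A minimal sketch of one way to do that release under manual reference counting, using the AVAudioPlayerDelegate callback; this assumes the class adopts the protocol:
- (IBAction)Fire {
    NSString *path = [[NSBundle mainBundle] pathForResource:@"gunShot" ofType:@"mp3"];
    AVAudioPlayer *theAudio = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL];
    theAudio.delegate = self; // so we hear about the end of playback
    [theAudio play];
    // ... bullet setup as before ...
}

// AVAudioPlayerDelegate: balance the alloc above once the sound has finished.
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
    [player release];
}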