Is it possible to play multiple clips using presentMoviePlayerViewControllerAnimated? - objective-c

I have a situation where I'd like to play 2 video clips back to back using an MPMoviePlayerViewController displayed using presentMoviePlayerViewControllerAnimated.
The problem is that the modal view automatically closes itself as soon as the first movie is complete.
Has anyone found a way to do this?

Three options:
You may use MPMoviePlayerController and start playback of the 2nd (Nth) item once the previous one completes. This will, however, introduce a small gap between the videos, caused by identification and pre-buffering of the content.
You may use AVQueuePlayer; AVQueuePlayer is a subclass of AVPlayer used to play a number of items in sequence (see the sketch after this list). See its class reference for more.
You may use AVComposition to compose, at runtime, one video out of the two (or N) you need to play back. Note that this works only on locally stored videos, not on remote content (streaming or progressive download). Then use AVPlayer for the playback.
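For the AVQueuePlayer option, a minimal sketch (assuming firstURL and secondURL are NSURLs for the two clips, and that you display the playback yourself, since AVQueuePlayer has no built-in view controller):

    #import <AVFoundation/AVFoundation.h>

    NSArray *items = @[[AVPlayerItem playerItemWithURL:firstURL],
                       [AVPlayerItem playerItemWithURL:secondURL]];
    AVQueuePlayer *queuePlayer = [AVQueuePlayer queuePlayerWithItems:items];

    // AVQueuePlayer renders through an AVPlayerLayer rather than a view controller.
    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:queuePlayer];
    playerLayer.frame = self.view.bounds;
    [self.view.layer addSublayer:playerLayer];

    [queuePlayer play]; // items play back to back, advancing automatically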

It's not possible. If the video assets are in the local file system, consider AVComposition.
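A rough sketch of the AVComposition route for two local files (firstFileURL and secondFileURL are assumed names; error handling is elided):

    AVURLAsset *firstAsset  = [AVURLAsset URLAssetWithURL:firstFileURL options:nil];
    AVURLAsset *secondAsset = [AVURLAsset URLAssetWithURL:secondFileURL options:nil];

    AVMutableComposition *composition = [AVMutableComposition composition];
    NSError *error = nil;

    // Append each asset's full time range to the end of the composition.
    [composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
                         ofAsset:firstAsset
                          atTime:kCMTimeZero
                           error:&error];
    [composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration)
                         ofAsset:secondAsset
                          atTime:composition.duration
                           error:&error];

    // Play the stitched result as a single, seamless item.
    AVPlayer *player = [AVPlayer playerWithPlayerItem:
                           [AVPlayerItem playerItemWithAsset:composition]];
    [player play];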

Related

MPMoviePlayerController audio/video out of sync

In my iPad App, I create an MPMoviePlayerController that plays an MP4 off of a website. In the MP4 there are people talking. In the App, I have noticed that the audio and video are out of sync by half a second or a quarter-second, perhaps. (I can tell this because I view the video in a web browser and there is no lag.)
The only clue I have is that when the MPMoviePlayerController first loads, the audio starts playing but the video doesn't; then the video starts and seems to skip a couple of frames to "catch up" to the audio... but it never quite syncs.
Seeing as how this class is a "black box" per the Apple Documentation, and none of the existing methods or properties come across as helpful to this problem, I'm a bit stumped. I may have to leave it how it is with the slight lag, rather than face weird workarounds. I wanted to see if anyone has experienced this before or could suggest a solution.
I'm running on Mountain Lion, the latest Xcode, and an iPad 2 with iOS 6.
The code I use to generate the controller is:
    - (void)startVideoPlaying {
        if (!self.theMoviePlayer) {
            self.theMoviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:movieURL];
            [self.bgView addSubview:self.theMoviePlayer.view];
            [self.theMoviePlayer.view setFrame:movieContainer.frame]; // player's frame must match parent's
        } else {
            [self.theMoviePlayer setContentURL:movieURL];
        }
        self.theMoviePlayer.shouldAutoplay = YES;
        [self.theMoviePlayer play];
    }
Thanks for any help.
After a variety of tests, I must conclude that there is something wrong with the encoding of the MP4s I have been playing. I am not a video codec guru, but I ran the following tests, which tell me this:
1) Downloading the MP4, placing it into the app and loading it into the MPMoviePlayerController via a file URL. Audio still out of sync, so not a connectivity issue.
2) Finding another MP4 on the web (something off Vimeo) and streaming it into the Player. Audio synced properly, potentially something wrong with the MP4s I was attempting to play.
3) Downloading the MP4 and using Handbrake to convert it into M4V with a variety of different settings (including the iPad preset). The audio synced fine.
Based on this, it seems that there's something wrong with the file I was attempting to play rather than with the player (or at least something the player can't handle). Unfortunately, the files I am attempting to play cannot simply be converted; they are part of a large website system, and many hundreds of files would have to change. So, while I have answered my own question, I haven't solved the problem.

iOS: Segmenting a video

I want to segment a video into 9 sections, jumble the sections about, and then play them together. I am currently trying to use AVPlayer to do this. I can get the sections to load on the simulator, but only the first 4 will load on the actual phone. I guess the resources are topping out.
My question is: is AVPlayer the best framework to use for this? It seems wasteful to create a player item, player, and player layer for each segment (which I think is why the resources are topping out). Or is there a lower-level framework I can use to load one video and display certain segments of it on certain areas of the screen?
Thanks
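For reference, a sketch of the per-segment setup described above (asset, segmentStart, segmentDuration, and tileFrame are illustrative names; this is the approach the question reports running out of resources with on the device):

    // One tile: play only [segmentStart, segmentStart + segmentDuration) of the asset.
    AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:asset];
    item.forwardPlaybackEndTime = CMTimeAdd(segmentStart, segmentDuration);

    AVPlayer *segmentPlayer = [AVPlayer playerWithPlayerItem:item];
    [segmentPlayer seekToTime:segmentStart];

    // Each tile gets its own layer positioned over one area of the screen.
    AVPlayerLayer *tileLayer = [AVPlayerLayer playerLayerWithPlayer:segmentPlayer];
    tileLayer.frame = tileFrame;
    [self.view.layer addSublayer:tileLayer];

    [segmentPlayer play]; // repeated for each of the 9 segments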

AVPlayer seekToTime: Poor performance while looping short videos

This is my first question, but I've read Stack Overflow for years.
Well the thing is that I must concatenate a set of videos (MP4) in a particular order, and the last one must loop continuously. Okay, let's try AVFoundation.
I've defined all the elements such as AVComposition, AVURLAsset and AVPlayerItem. Then I build the AVPlayer and the AVPlayerLayer. Everything is okay and the videos are played in the correct order but... (and here comes the problem)
I notice a tiny flicker when the AVPlayer passes from one video to the next. I can live with that one, but when the AVPlayer reaches the AVPlayerItem's end and the selector registered for AVPlayerItemDidPlayToEndTimeNotification is called, I use seekToTime to move the playhead back to the correct position and then play again. That works, but a much bigger flicker appears every time I seekToTime and play again. I cannot allow that.
Does anybody know how to repeat the last asset's AVPlayerItem continuously and without flicker?
Thank you all.
Absolutely stupid:
Friends, when making video manipulation apps, be sure that the videos don't have annoying black frames. That was the root of all the problems. [AVPlayer seekToTime:] works perfectly.
Thanks for your time and patience
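For reference, a minimal sketch of the notification-and-seek loop described in the question (self.player and lastItem are assumed to be the AVPlayer and the final AVPlayerItem):

    // In the player setup:
    self.player.actionAtItemEnd = AVPlayerActionAtItemEndNone; // don't pause at the end
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(itemDidPlayToEnd:)
                                                 name:AVPlayerItemDidPlayToEndTimeNotification
                                               object:lastItem];

    // Notification handler: rewind the finished item and keep playing.
    - (void)itemDidPlayToEnd:(NSNotification *)notification {
        AVPlayerItem *item = [notification object];
        [item seekToTime:kCMTimeZero];
        [self.player play];
    }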

Playing Audio and Video of a mp4 file separately using AVFoundation Framework

I have developed a media player for iOS using AVFoundation. I am using AVPlayer to play audio-video files (e.g. an mp4 file). It seems quite simple to play a file using AVPlayer, by directly calling the play and pause APIs.
Now I want to separate the audio and video and play them as individual entities simultaneously. I want to do this because, I may do some editing to the audio or video track, and then play the file.
I can separate the two using AVAssetTracks, but I don't know how to play the tracks. I would also like to play the two tracks simultaneously, so that no A/V sync problem occurs.
Please guide me on how to achieve this, i.e. audio and video rendering with no A/V sync problem.
Thanks.
The easiest way to achieve this would be to have multiple player items. I would create a player item with all the tracks in their original form (i.e. the mp4 file). Then create another asset using AVMutableComposition (a subclass of AVAsset). This allows you to put only certain tracks into the composition (i.e. only the audio track). When you want to play the audio alone, play the player item ([AVPlayer replaceCurrentItemWithPlayerItem:]) built from the mutable composition with only the audio track. When you want to play the video alone, play the player item built from the mutable composition with only the video track. When you want to play both in sync, play the player item with the original asset.
I'm assuming you want to play the edited versions in sync. For that, you will have to create another AVMutableComposition and add all of the edited tracks. Then call replaceCurrentItemWithPlayerItem: with a player item built from the newly created AVMutableComposition.
If all you are trying to do is edit the different tracks and never need to play them by themselves, you can do this with one single AVMutableComposition. Just add an audio and a video AVMutableCompositionTrack and edit to your heart's content. They will always play in sync no matter how much you edit them separately (assuming your editing logic is correct). Just make sure you don't try to edit while playing; for that, you must create a copy and edit the copy.
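A rough sketch of the audio-only composition described above (fileURL is an assumed name; self.player is your existing AVPlayer; error handling is elided):

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];
    AVAssetTrack *sourceAudio =
        [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

    AVMutableComposition *audioOnly = [AVMutableComposition composition];
    AVMutableCompositionTrack *audioTrack =
        [audioOnly addMutableTrackWithMediaType:AVMediaTypeAudio
                               preferredTrackID:kCMPersistentTrackID_Invalid];

    NSError *error = nil;
    [audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                        ofTrack:sourceAudio
                         atTime:kCMTimeZero
                          error:&error];

    // Swap the audio-only item in; build a video-only composition the same way
    // with AVMediaTypeVideo when you want video by itself.
    [self.player replaceCurrentItemWithPlayerItem:
        [AVPlayerItem playerItemWithAsset:audioOnly]];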

Best way to export a QTMovie with a fade-in and fade-out in the audio

I want to take a QTMovie that I have and export it with the audio fading in and fading out for a predetermined amount of time. I want to do this within Cocoa as much as possible. The movie will likely only have audio in it. My research has turned up a couple of possibilities:
Use the newer Audio Context Insert APIs: http://developer.apple.com/DOCUMENTATION/QuickTime/Conceptual/QT7-2_Update_Guide/NewFeaturesChangesEnhancements/chapter_2_section_11.html. This appears to be the most modern way to accomplish this.
Use the QuickTime audio extraction APIs to pull the audio track out of the movie, process it, and then put the processed audio back into the movie, replacing the original audio.
Am I missing some much easier method?
QuickTime has the notion of tween tracks. A tween track is a track that allows you to modify the properties of another set of tracks (such as the volume).
See "Creating a Tween Track" in the QuickTime docs for an example of how to do this with a QuickTime audio track's volume.
There is also a more complete example called qtsndtween on the Apple Developer website.
Of course, all of this code requires using the QuickTime C APIs. If you can live with building a 32-bit-only application, you can get the underlying QuickTime C handles from a QTMovie, QTTrack, or QTMedia object using the quickTimeMovie, quickTimeTrack, or quickTimeMedia methods, respectively.
Hopefully we'll get all the features of the QuickTime C APIs in the next version of QTKit, whenever that may be.