I would like to open an MP4 web asset in an iOS app at a specific set of time offsets, like chapters; for example, 2:31 into a 10-minute video.
Any ideas on how to accomplish this?
Use an AVAssetReader and set a time range before starting to read frames:
// CMTimeRangeMake takes (start, duration); an infinite duration reads through to the end of the asset.
CMTimeRange timeRange = CMTimeRangeMake(startTime, kCMTimePositiveInfinity);
assetReader.timeRange = timeRange;
Presumably you have an MPMoviePlayerController which you're already using to play the movie. I suggest looking at the initialPlaybackTime and endPlaybackTime properties on the controller to set the start and end points for movie playback.
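A minimal sketch of that approach, assuming movieURL is an NSURL pointing to your MP4 web asset (the 180-second end time is just a hypothetical chapter boundary):

#import <MediaPlayer/MediaPlayer.h>

MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:movieURL];
player.initialPlaybackTime = 151.0;  // start at 2:31, expressed in seconds
player.endPlaybackTime = 180.0;      // hypothetical end of the "chapter"
player.view.frame = self.view.bounds;
[self.view addSubview:player.view];
[player play];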
I have been working on merging videos into a single video using AVMutableComposition and have gotten the required output, but I'm running into two different issues while merging the videos.
1) I couldn't set custom frames for the videos using AVMutableVideoCompositionLayerInstruction.
2) While playing the merged video, the first merged video vanishes (is removed) once it stops, but the other videos stay correctly in their respective locations after they stop.
Please suggest a solution.
1) If you want to change the frame of the merged video, you should set the AVMutableVideoComposition render size; see the sketch after this list.
2) If you play the first and the second video at the same time, the duration of the merged video will be the longer of the two durations. In that case you should handle the difference, for example by placing a kind of placeholder at the end of the first video (this can be its last frame).
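A minimal sketch of point 1, assuming mixComposition is your already-built AVMutableComposition and instruction is an AVMutableVideoCompositionInstruction containing your layer instructions (the 640x480 size and 30 fps are assumptions, not requirements):

#import <AVFoundation/AVFoundation.h>

AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
videoComposition.renderSize = CGSizeMake(640.0, 480.0);   // the output frame you want
videoComposition.frameDuration = CMTimeMake(1, 30);       // 30 fps
videoComposition.instructions = [NSArray arrayWithObject:instruction];

// Hand the video composition to the player item (or AVAssetExportSession) along with the composition.
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:mixComposition];
item.videoComposition = videoComposition;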
I would like to place a CCLabelBMFont object over a playing video. How can I achieve this? I am using MPMoviePlayerController to play the video. Also, can I specify the duration, start time, and end time for which the label will appear?
I'm trying to capture one or more UIImages programmatically using AVFoundation.
I set up the sessions and input devices and everything, but when I try to find explanations on how to actually take the photos, all I get is baffling information about connections and whatnot.
I couldn't find a single example of actually taking photos and saving them to a UIImage for further processing. All the examples use a constant, kCGImagePropertyExifDictionary, which doesn't seem to exist in the iOS 5 SDK.
Can someone please provide me with a code or an explanation from top to bottom on how to take and save an image from the front facing camera to a UIImage using AVFoundation?
Thanks a lot!
To use kCGImagePropertyExifDictionary, you should #import <ImageIO/ImageIO.h>.
All of the other information you seek is inside the AVFoundation Programming guide - particularly the Media Capture section.
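A minimal sketch of the capture step, assuming session is your configured AVCaptureSession with the front camera as input and stillOutput is an AVCaptureStillImageOutput you have already added to it:

#import <AVFoundation/AVFoundation.h>
#import <ImageIO/ImageIO.h>

AVCaptureConnection *connection = [stillOutput connectionWithMediaType:AVMediaTypeVideo];
[stillOutput captureStillImageAsynchronouslyFromConnection:connection
                                          completionHandler:^(CMSampleBufferRef buffer, NSError *error) {
    if (buffer != NULL) {
        NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:buffer];
        UIImage *image = [UIImage imageWithData:jpegData];
        // use `image` for further processing (hop to the main queue for any UI work)
    }
}];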
I have a situation where I'd like to play 2 video clips back to back using an MPMoviePlayerViewController displayed using presentMoviePlayerViewControllerAnimated.
The problem is that the modal view automatically closes itself as soon as the first movie is complete.
Has anyone found a way to do this?
Three options:
You may use MPMoviePlayerController and start playback of the 2nd (Nth) item after the previous one completes. This, however, will introduce a small gap between the videos caused by identification and pre-buffering of the content.
You may use AVQueuePlayer; AVQueuePlayer is a subclass of AVPlayer that you use to play a number of items in sequence. See its reference for more, and the sketch after this list.
You may use AVComposition to compose, at runtime, one video out of the two (or N) you need to play back. Note that this works only on locally stored videos and not on remote ones (streaming or progressive download). Then use AVPlayer for the playback.
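A minimal sketch of option 2, assuming firstURL and secondURL are NSURLs pointing to your two clips:

#import <AVFoundation/AVFoundation.h>

AVPlayerItem *firstItem  = [AVPlayerItem playerItemWithURL:firstURL];
AVPlayerItem *secondItem = [AVPlayerItem playerItemWithURL:secondURL];
AVQueuePlayer *queuePlayer =
    [AVQueuePlayer queuePlayerWithItems:[NSArray arrayWithObjects:firstItem, secondItem, nil]];

// AVQueuePlayer has no built-in UI, so display it via an AVPlayerLayer.
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:queuePlayer];
playerLayer.frame = self.view.bounds;
[self.view.layer addSublayer:playerLayer];
[queuePlayer play];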
It's not possible. If the video assets are in the local file system, consider AVComposition.
I'm currently using an AVPlayer, along with an AVPlayerLayer to play back some video. While playing back the video, I've registered for time updates every 30th of a second during the video. This is used to draw a graph of the acceleration at that point in the video, and have it update along with the video. The graph is using the CMTime from the video, so if I skip to a different portion of the video, the graph immediately represents that point in time in the video with no extra work.
Anywho, as far as I'm aware, if I want to get an interface similar to what the MediaPlayer framework offers, I'm going to have to do that myself.
What I'm wondering is: is there a way to use my AVPlayer with the MediaPlayer framework? (Not that I can see.) Or is there a way to register for incremental time updates with the MediaPlayer framework?
My code, if anyone is interested, follows:
[moviePlayer addPeriodicTimeObserverForInterval:CMTimeMake(1, 30)
                                           queue:dispatch_queue_create("eventQueue", NULL)
                                      usingBlock:^(CMTime time) {
    loopCount = (int)(CMTimeGetSeconds(time) * 30);
    if (loopCount < [dataPointArray count]) {
        dispatch_sync(dispatch_get_main_queue(), ^{
            [graphLayer setNeedsDisplay];
        });
    }
}];
Thanks!
If you're talking about the window chrome displayed by MPMoviePlayer then I'm afraid you are looking at creating this UI yourself.
AFAIK there is no way of achieving the timing behaviour you need using the MediaPlayer framework, which is very much a simple "play some media" framework. You're doing the right thing by using AVFoundation.
Which leaves you needing to create the UI yourself. My suggestion would be to start with a XIB file to create the general layout: a toolbar at the top with a Done button, a large view that acts as the custom playback view (backed by your AVPlayerLayer), and a separate view to contain your controls.
You'll need to write some custom controller code to automatically show/hide the playback controls and toolbar as needed if you want to simulate the MPMoviePlayer UI.
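A minimal sketch of that custom playback view, a UIView backed by an AVPlayerLayer that you can lay out in the XIB next to the toolbar and controls (PlayerView is a hypothetical name; the +layerClass override is the standard way to host an AVPlayerLayer in a view):

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface PlayerView : UIView
@property (nonatomic, retain) AVPlayer *player;
@end

@implementation PlayerView
+ (Class)layerClass {
    return [AVPlayerLayer class];    // back this view with an AVPlayerLayer
}
- (AVPlayer *)player {
    return [(AVPlayerLayer *)self.layer player];
}
- (void)setPlayer:(AVPlayer *)player {
    [(AVPlayerLayer *)self.layer setPlayer:player];
}
@end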
You can use https://bitbucket.org/brentsimmons/ngmovieplayer as a starting point (though it may not have existed at the time you asked).
From the project page: "Replicates much of the behavior of MPMoviePlayerViewController -- but uses AVFoundation."
You might want to look at the AVSynchronizedLayer class. I don't think there's a lot about it in the official programming guide, but you can find bits of info here and there: subfurther, Otter Software.
In O'Reilly's Programming iOS 4 (or 5) there's also a short reference on how to make a square move/stop along a line in sync with the animation.
Another demo (not a lot of code) is shown during WWDC 2011 session Working with Media in AV Foundation.
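A minimal sketch of AVSynchronizedLayer, assuming player is your AVPlayer and graphLayer is the layer whose animations should follow movie time rather than wall-clock time:

#import <AVFoundation/AVFoundation.h>

AVSynchronizedLayer *syncLayer =
    [AVSynchronizedLayer synchronizedLayerWithPlayerItem:player.currentItem];
syncLayer.frame = self.view.bounds;
[syncLayer addSublayer:graphLayer];
[self.view.layer addSublayer:syncLayer];
// Any CAAnimation added to graphLayer now runs on the player item's timeline,
// so it pauses, seeks, and scrubs together with the video.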