I'm currently using an AVPlayer, along with an AVPlayerLayer to play back some video. While playing back the video, I've registered for time updates every 30th of a second during the video. This is used to draw a graph of the acceleration at that point in the video, and have it update along with the video. The graph is using the CMTime from the video, so if I skip to a different portion of the video, the graph immediately represents that point in time in the video with no extra work.
Anywho, as far as I'm aware, if I want to get an interface similar to what the MediaPlayer framework offers, I'm going to have to do that myself.
What I'm wondering is: is there a way to use my AVPlayer with the MediaPlayer framework? (Not that I can see.) Or, is there a way to register for incremental time updates with the MediaPlayer framework?
My code, if anyone is interested, follows:
[moviePlayer addPeriodicTimeObserverForInterval:CMTimeMake(1, 30)
                                          queue:dispatch_queue_create("eventQueue", NULL)
                                     usingBlock:^(CMTime time) {
    loopCount = (int)(CMTimeGetSeconds(time) * 30);
    if (loopCount < [dataPointArray count]) {
        dispatch_sync(dispatch_get_main_queue(), ^{
            [graphLayer setNeedsDisplay];
        });
    }
}];
Thanks!
If you're talking about the window chrome displayed by MPMoviePlayer then I'm afraid you are looking at creating this UI yourself.
AFAIK there is no way of achieving the timing behaviour you need using the MediaPlayer framework, which is very much a simple "play some media" framework. You're doing the right thing by using AVFoundation.
Which leaves you needing to create the UI yourself. My suggestion would be to start with a XIB file to create the general layout: a toolbar at the top with a Done button, a large view that acts as a custom playback view (backed by your AVPlayerLayer), and a separate view to contain your controls.
You'll need to write some custom controller code to automatically show/hide the playback controls and toolbar as needed if you want to simulate the MPMoviePlayer UI.
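For the playback view itself, one common pattern (just a sketch, not something from the original answer; PlayerView is a made-up name) is a UIView whose backing layer is an AVPlayerLayer, which you can then drop into the XIB:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

// Hypothetical playback view: a UIView backed by an AVPlayerLayer.
@interface PlayerView : UIView
@property (nonatomic, strong) AVPlayer *player;
@end

@implementation PlayerView

+ (Class)layerClass {
    return [AVPlayerLayer class];   // back this view with an AVPlayerLayer
}

- (AVPlayer *)player {
    return [(AVPlayerLayer *)self.layer player];
}

- (void)setPlayer:(AVPlayer *)player {
    [(AVPlayerLayer *)self.layer setPlayer:player];
}

@end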
You can use https://bitbucket.org/brentsimmons/ngmovieplayer as a starting point (it may not have existed at the time you asked).
From the project page: "Replicates much of the behavior of MPMoviePlayerViewController -- but uses AVFoundation."
You might want to look at the AVSynchronizedLayer class. I don't think there's a lot about it in the official programming guide, but you can find bits of info here and there: subfurther, Otter Software.
In O'Reilly's Programming iOS 4 (or 5) there's also a short example of how to make a square move/stop along a line in sync with the video.
Another demo (not a lot of code) is shown in the WWDC 2011 session Working with Media in AV Foundation.
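For what it's worth, here is a rough sketch of the AVSynchronizedLayer idea (playerItem, graphLayer and videoView are assumed to already exist in your controller). Sublayers of the synchronized layer have their Core Animation timing interpreted in the player item's timeline, so an animation starting at AVCoreAnimationBeginTimeAtZero tracks the movie through seeks and pauses:

#import <AVFoundation/AVFoundation.h>
#import <QuartzCore/QuartzCore.h>

// Sketch only: the layer is synchronized with the player item's timebase.
AVSynchronizedLayer *syncLayer =
    [AVSynchronizedLayer synchronizedLayerWithPlayerItem:playerItem];
syncLayer.frame = videoView.bounds;

// Example animation that runs in movie time rather than wall-clock time.
CABasicAnimation *slide = [CABasicAnimation animationWithKeyPath:@"position.x"];
slide.fromValue = [NSNumber numberWithFloat:0.0f];
slide.toValue   = [NSNumber numberWithFloat:320.0f];
slide.beginTime = AVCoreAnimationBeginTimeAtZero;   // movie time 0, not layer time
slide.duration  = CMTimeGetSeconds(playerItem.asset.duration);
slide.removedOnCompletion = NO;
slide.fillMode  = kCAFillModeForwards;

[graphLayer addAnimation:slide forKey:@"syncSlide"];
[syncLayer addSublayer:graphLayer];
[videoView.layer addSublayer:syncLayer];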
I'm working on an iOS movie editor project. For this editor, I use MPMoviePlayer to show the video file selected by the user.
I use custom controls, and I have a UISlider that enables the user to move the player's currentTime position. When the user touches the slider, the movie is paused and its currentTime changes along with the UISlider's value.
Everything works perfectly, but now I need to let the user hear the sound at this currentTime position.
For those who know iMovie: when you move your mouse over a movie event, you see the image and hear the sound at that position, and that's what I'd like in my editor.
I've tried calling the player's play method with an NSTimer to stop it after 0.2 seconds, but the result is kind of messy.
Has anyone already managed to do something like this?
Thanks!
Best regards.
Seeking takes time; that's why you've ended up using a timer. The real problem here is that MPMoviePlayerController, while convenient because it gives you controls, is a blunt instrument; it's just a massively simplified convenience layer built on top of AVFoundation. But you don't need the built-in controls, so I would suggest throwing away your entire implementation and getting down to the real stuff, using AVFoundation instead (AVPlayer etc.). Now you have a coherent way to seek and get a notification when the seek has completed (seekToTime:completionHandler:), so you'll be able to start playing as soon as possible. Plus, AVFoundation is the level where you'll be doing all your "editing" anyway.
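To illustrate the idea, here is a hedged sketch of a seek-then-preview with AVPlayer (self.player and the slider are assumed to exist; the tolerance variant of seekToTime: is used for precise scrubbing, and 0.2 s is just the interval from the question):

// Pause, seek exactly to the scrubbed position, then play a short audio snippet once the seek completes.
[self.player pause];

Float64 duration = CMTimeGetSeconds(self.player.currentItem.duration);
CMTime target = CMTimeMakeWithSeconds(slider.value * duration, 600);

[self.player seekToTime:target
        toleranceBefore:kCMTimeZero
         toleranceAfter:kCMTimeZero
      completionHandler:^(BOOL finished) {
          if (!finished) return;
          dispatch_async(dispatch_get_main_queue(), ^{
              [self.player play];
              // Stop again after roughly 0.2 s so the user only hears a snippet at that position.
              dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.2 * NSEC_PER_SEC)),
                             dispatch_get_main_queue(), ^{
                  [self.player pause];
              });
          });
      }];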
I need to achieve an animation effect (like the effects in "Pic Something", "Pic Reveal" and so on) in my app.
What I mean is that I need to implement these tasks:
Task 1: when the user touches a letter, it moves from its current frame (current position) to another frame (target position).
Task 2: when the user touches the letter (in the target position), it comes back to its original position again.
This can be clearly understood if you look at the sample apps.
I couldn't find any samples on the internet either.
Thanks in advance.
Take a look at UIView animation and animation blocks in iOS; that's what you need. With them you can create any animation you like. Here's a nice tutorial.
As for the whole system you described: I would create an NSDictionary of UIView positions keyed by the corresponding UIView tags. That way you always know which place every UIView came from.
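A small sketch of both ideas together (letterView, targetFrame and self.originalFrames are hypothetical names; the dictionary is an NSMutableDictionary owned by the controller): remember each letter's home frame keyed by its tag, then animate between home and target on each touch.

// Remember the letter's original frame the first time it is touched.
NSNumber *key = [NSNumber numberWithInteger:letterView.tag];
if ([self.originalFrames objectForKey:key] == nil) {
    [self.originalFrames setObject:[NSValue valueWithCGRect:letterView.frame] forKey:key];
}

// Task 1 / Task 2: animate to the target frame, or back home if it is already at the target.
CGRect home = [[self.originalFrames objectForKey:key] CGRectValue];
CGRect destination = CGRectEqualToRect(letterView.frame, home) ? targetFrame : home;

[UIView animateWithDuration:0.3
                      delay:0.0
                    options:UIViewAnimationOptionCurveEaseInOut
                 animations:^{
                     letterView.frame = destination;
                 }
                 completion:^(BOOL finished) {
                     // Nothing else to do; the frame itself records which position the letter is in.
                 }];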
I'm trying to capture one or more UIImages programmatically using AVFoundation.
I set up the sessions and input devices and everything, but when I try to find explanations on how to actually take the photos, all I get is muddled information about connections and whatnot.
I couldn't find a single example of actually taking a photo and saving it to a UIImage for further processing. All the examples use the constant kCGImagePropertyExifDictionary, which doesn't seem to exist in the iOS 5 SDK.
Can someone please provide me with code or an explanation, from top to bottom, on how to take an image from the front-facing camera and save it to a UIImage using AVFoundation?
Thanks a lot!
To use kCGImagePropertyExifDictionary, you should #import <ImageIO/ImageIO.h>.
All of the other information you seek is in the AVFoundation Programming Guide, particularly the Media Capture section.
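To flesh that out, here is a hedged sketch of the usual iOS 5-era capture path. It assumes you already have a running AVCaptureSession named session with the front camera as input (myImageView is a placeholder); note that the Exif dictionary is read from the captured sample buffer rather than passed in:

#import <AVFoundation/AVFoundation.h>
#import <ImageIO/ImageIO.h>

// Add a still image output to the existing session.
AVCaptureStillImageOutput *stillOutput = [[AVCaptureStillImageOutput alloc] init];
stillOutput.outputSettings = [NSDictionary dictionaryWithObject:AVVideoCodecJPEG
                                                          forKey:AVVideoCodecKey];
[session addOutput:stillOutput];

// Grab one frame and turn it into a UIImage.
AVCaptureConnection *connection = [stillOutput connectionWithMediaType:AVMediaTypeVideo];
[stillOutput captureStillImageAsynchronouslyFromConnection:connection
                                          completionHandler:^(CMSampleBufferRef buffer, NSError *error) {
    if (buffer == NULL) {
        return;   // handle the error as appropriate
    }

    // Exif metadata, if you need it, travels as an attachment on the sample buffer.
    CFDictionaryRef exif = (CFDictionaryRef)CMGetAttachment(buffer, kCGImagePropertyExifDictionary, NULL);
    (void)exif;   // inspect or copy values out of it here

    NSData *jpegData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:buffer];
    UIImage *image = [UIImage imageWithData:jpegData];

    dispatch_async(dispatch_get_main_queue(), ^{
        // Hand the UIImage off for further processing / display here.
        myImageView.image = image;
    });
}];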
I have a situation where I'd like to play 2 video clips back to back using an MPMoviePlayerViewController displayed using presentMoviePlayerViewControllerAnimated.
The problem is that the modal view automatically closes itself as soon as the first movie is complete.
Has anyone found a way to do this?
Three options:
You may use MPMoviePlayerController and start playback of the 2nd (Nth) item after the previous one is complete. This, however, will introduce a small gap between the videos caused by identification and pre-buffering of the content.
You may use AVQueuePlayer; AVQueuePlayer is a subclass of AVPlayer that you use to play a number of items in sequence. See its reference for more (a short sketch follows this list).
You may use AVComposition to compose, at runtime, one video out of the two (or N) you need to play back. Note that this works only with locally stored videos, not remote content (streaming or progressive download). Then use AVPlayer for the playback.
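For option 2, a minimal sketch (url1/url2 and the hosting view are placeholders):

#import <AVFoundation/AVFoundation.h>

// Queue the two clips; AVQueuePlayer advances to the next item automatically.
AVPlayerItem *first  = [AVPlayerItem playerItemWithURL:url1];
AVPlayerItem *second = [AVPlayerItem playerItemWithURL:url2];
AVQueuePlayer *queuePlayer =
    [AVQueuePlayer queuePlayerWithItems:[NSArray arrayWithObjects:first, second, nil]];

// Show it in your own view instead of the modal MPMoviePlayerViewController.
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:queuePlayer];
playerLayer.frame = self.view.bounds;
[self.view.layer addSublayer:playerLayer];

[queuePlayer play];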
It's not possible with MPMoviePlayerViewController alone. If the video assets are in the local file system, consider AVComposition.
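And for the AVComposition route mentioned in both answers, a rough sketch (again url1/url2 are placeholders and error handling is skipped): stitch the two local assets into a single composition, then play it like any other asset.

#import <AVFoundation/AVFoundation.h>

AVURLAsset *assetOne = [AVURLAsset URLAssetWithURL:url1 options:nil];
AVURLAsset *assetTwo = [AVURLAsset URLAssetWithURL:url2 options:nil];

// Append the second asset after the first inside one composition.
AVMutableComposition *composition = [AVMutableComposition composition];
NSError *error = nil;
[composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetOne.duration)
                     ofAsset:assetOne
                      atTime:kCMTimeZero
                       error:&error];
[composition insertTimeRange:CMTimeRangeMake(kCMTimeZero, assetTwo.duration)
                     ofAsset:assetTwo
                      atTime:composition.duration
                       error:&error];

// Play the composition with AVPlayer (display it via an AVPlayerLayer as above).
AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:composition];
AVPlayer *player = [AVPlayer playerWithPlayerItem:item];
[player play];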
One of the ways to improve the user experience in iOS while showing images is to download them asynchronously, without blocking the main thread, and then display them.
But I want to add something to this:
1. Initially, when there is no image, show a spinner while the async download is in progress.
2. After the download, cache the image on the local iOS disk for later use.
3. After the download, populate the image part of the UIImageView.
4. Don't just plonk the image into the view for the user; slowly fade it in (i.e. from alpha 0.0 to 1.0).
I have been using SDWebImage for some time now. It works well but does not satisfy my 1st requirement (about the spinner) or my 4th.
Is there any help out there to satisfy all of this?
Three20 (http://www.three20.info) has a TTImageView class that satisfies 2-3; you can subclass it, override setImage:, and create the fade animation there (or just modify TTImageView.m directly).
The spinner is easy as well: when you modify TTImageView, you can add a TTActivityView on top and remove it in setImage:.
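Something along these lines, for instance (FadeImageView is a made-up name, a plain UIActivityIndicatorView stands in for the spinner, and this assumes TTImageView exposes setImage: as the answer implies; the same override works on a plain UIImageView):

// Hypothetical subclass: show a spinner until the image arrives, then fade it in.
@interface FadeImageView : TTImageView
@property (nonatomic, retain) UIActivityIndicatorView *spinner;
@end

@implementation FadeImageView

- (void)setImage:(UIImage *)image {
    [self.spinner stopAnimating];
    [self.spinner removeFromSuperview];

    self.alpha = 0.0;
    [super setImage:image];

    // Fade from alpha 0.0 to 1.0 instead of plonking the image straight in.
    [UIView animateWithDuration:0.4 animations:^{
        self.alpha = 1.0;
    }];
}

@end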