Best way to export a QTMovie with a fade-in and fade-out in the audio - objective-c

I want to take a QTMovie that I have and export it with the audio fading in and fading out for a predetermined amount of time. I want to do this within Cocoa as much as possible. The movie will likely only have audio in it. My research has turned up a couple of possibilities:
Use the newer Audio Context Insert APIs. http://developer.apple.com/DOCUMENTATION/QuickTime/Conceptual/QT7-2_Update_Guide/NewFeaturesChangesEnhancements/chapter_2_section_11.html. This appears to be the most modern way to accomplish this.
Use the QuickTime audio extraction APIs to pull the audio track out of the movie, process it, and then put the processed audio back into the movie, replacing the original audio.
Am I missing some much easier method?

QuickTime has the notion of tween tracks. A tween track is a track that modifies properties (such as the volume) of another set of tracks.
See Creating a Tween Track in the QuickTime docs for an example of how to do this with a QuickTime audio track's volume.
There is also a more complete example called qtsndtween on the Apple Developer website.
Of course, all of this code requires using the QuickTime C APIs. If you can live with building a 32-bit-only application, you can get the underlying QuickTime C handles from a QTMovie, QTTrack, or QTMedia object using the quickTimeMovie, quickTimeTrack, or quickTimeMedia methods, respectively.
Hopefully we'll get all the features of the QuickTime C APIs in the next version of QTKit, whenever that may be.
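As a minimal sketch (assuming a 32-bit build linking both QTKit and the QuickTime framework), dropping down to the C handles looks roughly like this; the tween-specific calls are left as comments because they follow the steps in the linked docs:

```objc
// Sketch only: requires a 32-bit target linking QTKit and QuickTime.
#import <QTKit/QTKit.h>
#import <QuickTime/QuickTime.h>

static void AddVolumeFade(QTMovie *qtMovie)
{
    // Underlying C Movie handle (32-bit only).
    Movie movie = [qtMovie quickTimeMovie];

    // Grab the sound track the tween should modify.
    QTTrack *qtAudio = [[qtMovie tracksOfMediaType:QTMediaTypeSound] lastObject];
    Track audioTrack = [qtAudio quickTimeTrack];

    // From here, follow "Creating a Tween Track" in the QuickTime docs:
    //   1. NewMovieTrack() to create the tween track on `movie`,
    //   2. NewTrackMedia() with TweenMediaType for its media,
    //   3. add a tween sample describing the volume ramp (fade in/out),
    //   4. wire it to `audioTrack` as a track modifier input.
}
```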

Related

Show subtitles with an AVFoundation AVPlayer on OS X

I'm trying to display subtitles when playing video using AVFoundation on OS X.
I've looked through the documentation and I can't find a way to enable a subtitle track. The API contains multiple references to subtitle tracks, which leads me to believe that it's supported.
On iOS the method -[AVPlayerItem selectMediaOption:inMediaSelectionGroup:] is used to enable subtitle tracks. This method isn't available in the 10.7 SDK. Is there another way to show subtitles?
EDIT:
QuickTime Player X has subtitle support; for example, on opening this movie the subtitle submenu appears, offers a choice of language, and displays the subtitles when English is chosen. This leads me to believe that they're exposed in the API...
I ran into this same issue myself. I found that unfortunately the only way to do it, other than switching to QTKit, is to make a separate subtitles layer (a CATextLayer) and position it appropriately as a sublayer to the player layer. The idea is that you set up a periodic time observer to trigger every second or so and update the subtitles, along with (and this is optional) some UI element you might have that shows what the elapsed time is in the video.
I created a basic SubRip (.srt) file parser class; you can find it here: https://github.com/sstigler/SubRip-for-Mac . Be sure to check the wiki for documentation. The class is available under the terms of the BSD license.
Another challenge you might run into is how to dynamically adjust the height of the CATextLayer to account for varying lengths of subtitles, and varying widths of the containing view (if you choose to make it user-resizable). I found a great CALayoutManager subclass that does this, and made some revisions to it in order to get it to work for what I was trying to do: https://github.com/sstigler/height-for-width .
I hope this helps.
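For reference, the periodic-observer approach described above might look like the following sketch; `subtitleParser` and its `subtitleTextAtTime:` method are hypothetical stand-ins for whatever parser you use (e.g. the SubRip class linked above):

```objc
#import <AVFoundation/AVFoundation.h>
#import <QuartzCore/QuartzCore.h>

// Position a text layer over the player layer to hold the subtitles.
CATextLayer *subtitleLayer = [CATextLayer layer];
subtitleLayer.alignmentMode = kCAAlignmentCenter;
subtitleLayer.frame = CGRectMake(0, 20, playerLayer.bounds.size.width, 60);
[playerLayer addSublayer:subtitleLayer];

// Fire twice a second; update the subtitles (and, optionally, any
// elapsed-time UI element you might have).
[player addPeriodicTimeObserverForInterval:CMTimeMake(1, 2)
                                     queue:dispatch_get_main_queue()
                                usingBlock:^(CMTime time) {
    NSTimeInterval seconds = CMTimeGetSeconds(time);
    subtitleLayer.string = [subtitleParser subtitleTextAtTime:seconds]; // hypothetical
}];
```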
One way to add subtitles is to add an .scc closed-caption file as an AVMediaTypeClosedCaption track using AVMutableCompositionTrack; the player then handles it automatically. I tried this and it worked. Alternatively, you can add a QuickTime text file (or other text) as an AVMediaTypeText track and the player will show it as subtitles. I don't know which subtitle file format the AVMediaTypeSubtitle track type is meant for.

Is it possible to play multiple clips using presentMoviePlayerViewControllerAnimated?

I have a situation where I'd like to play 2 video clips back to back using an MPMoviePlayerViewController displayed using presentMoviePlayerViewControllerAnimated.
The problem is that the modal view automatically closes itself as soon as the first movie is complete.
Has anyone found a way to do this?
Three options:
You may use MPMoviePlayerController and start playback of the 2nd (Nth) item after the previous one completes. This will, however, introduce a small gap between the videos caused by identification and pre-buffering of the content.
You may use AVQueuePlayer; AVQueuePlayer is a subclass of AVPlayer you use to play a number of items in sequence. See its reference for more.
You may use AVComposition to compose, at runtime, one video out of the two (or N) you need to play back. Note that this works only with locally stored videos, not remote ones (streaming or progressive download). Then use AVPlayer for playback.
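The AVQueuePlayer route (the second option) is only a few lines. A sketch, assuming two clip URLs of your own and that you host the player's AVPlayerLayer in your view hierarchy instead of using MPMoviePlayerViewController:

```objc
#import <AVFoundation/AVFoundation.h>

// Queue the clips in playback order.
NSMutableArray *items = [NSMutableArray array];
for (NSURL *url in [NSArray arrayWithObjects:clip1URL, clip2URL, nil]) {
    [items addObject:[AVPlayerItem playerItemWithURL:url]];
}
AVQueuePlayer *queuePlayer = [AVQueuePlayer queuePlayerWithItems:items];

// Host the video in a layer of your own view hierarchy.
AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:queuePlayer];
layer.frame = containerView.bounds;
[containerView.layer addSublayer:layer];

[queuePlayer play]; // advances to the next item automatically
```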
It's not possible. If the video assets are in the local file system, consider AVComposition.

How can I programmatically pipe a QuickTime movie into a Quartz Composer input?

I'm working on an app that applies Quartz Composer effects to QuickTime movies. Think Photo Booth, except with a QuickTime movie for the input, not a camera. Currently, I load a QuickTime movie as a QTMovie object, then have an NSTimer firing 30 times a second. At some point I'll switch to a CVDisplayLink, but NSTimer is okay for now. Every time the NSTimer fires, the app grabs one frame of the QuickTime movie as an NSImage and passes it to one of the QCRenderer's image inputs. This works, but is extremely slow. I've tried pulling frames from the movie in all of the formats that [QTMovie frameImageAtTime:withAttributes:error:] supports. They are all either really slow or don't work at all.
I'm assuming that the slowness is caused by moving the image data to main memory, then moving it back for QC to work on it.
Unfortunately, using QC's QuickTime movie patch is out of the question for this project, as I need more control of movie playback than that provides. So the question is, how can I move QuickTime movie images into my QCRenderer without leaving VRAM?
Check out the v002 Movie Player QCPlugin, which is open source. In any case, what additional control over playback do you need, exactly?

Modifying video frames with QTKit and OpenGL

I am working on a project where I would like to open a video (on a Mac) with QTKit. That part I can do no problem, but as I am playing it, I would like to edit or modify the video on the fly using OpenGL.
From what I understand, I should be able to intercept the frames and change them before it hits the display, but no matter what I do, I cannot seem to do so.
It sounds like you should have a look at Core Video and the display link mechanism.
You can basically get a callback on a high priority thread with the decoded frame in a CVImageBuffer and do whatever you like with it (including packing it up as a texture for OpenGL processing and display).
Apple provides documentation and demo code snippets on the developer sites.
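A hedged sketch of the display link setup described above; `MyRenderer` and its `renderFrameForTime:` method are placeholders for your own OpenGL frame-handling code:

```objc
#import <CoreVideo/CoreVideo.h>

// Called on a high-priority thread once per display refresh.
static CVReturn DisplayLinkCallback(CVDisplayLinkRef displayLink,
                                    const CVTimeStamp *now,
                                    const CVTimeStamp *outputTime,
                                    CVOptionFlags flagsIn,
                                    CVOptionFlags *flagsOut,
                                    void *context)
{
    MyRenderer *renderer = (MyRenderer *)context;   // hypothetical class
    // Fetch the decoded frame for outputTime (e.g. from a QTVisualContext)
    // as a CVImageBuffer, pack it up as an OpenGL texture, process, display.
    [renderer renderFrameForTime:outputTime];       // hypothetical method
    return kCVReturnSuccess;
}

CVDisplayLinkRef displayLink;
CVDisplayLinkCreateWithActiveCGDisplays(&displayLink);
CVDisplayLinkSetOutputCallback(displayLink, DisplayLinkCallback, renderer);
CVDisplayLinkStart(displayLink);
```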

Playing Audio and Video of a mp4 file separately using AVFoundation Framework

I have developed a media player using AVFoundation for iOS. I am using AVPlayer to play audio-video files (e.g. an mp4 file). It seems quite simple to play the file using AVPlayer by directly calling the play and pause APIs.
Now I want to separate the audio and video and play them as individual entities simultaneously. I want to do this because, I may do some editing to the audio or video track, and then play the file.
I can separate the two using AVAssetTracks, but I don't know how to play the tracks. Also, I would like to play the two tracks simultaneously, so that no A/V sync problem occurs.
Please guide me on how to achieve this, i.e. audio and video rendering with no A/V sync problem.
Thanks..
The easiest way to achieve this would be to have multiple player items. I would create a player item with all the tracks in their original form (i.e. the mp4 file). Then create another asset using AVMutableComposition (a subclass of AVAsset); this allows you to put only certain tracks into the composition (e.g. the audio track only). When you want to play the audio only, switch the player to the player item holding the audio-only composition (AVPlayer's replaceCurrentItemWithPlayerItem:). When you want to play the video only, switch to the player item holding the video-only composition. When you want to play both in sync, play the player item with the original asset.
I'm assuming you want to play the edited versions in sync. For that you will have to create another AVMutableComposition, add all of the edited tracks, and then call replaceCurrentItemWithPlayerItem: with a player item wrapping the newly created AVMutableComposition.
If all you are trying to do is edit the different tracks, and never have to play them by themselves, you can do this with one single AVMutableComposition. Just add an audio and a video AVMutableCompositionTrack and edit to your heart's content. They will always play in sync no matter how much you edit them separately (assuming your editing logic is correct). Just make sure you don't try to edit while playing; for that, you must create a copy and edit the copy.
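A minimal sketch of the audio-only composition described above, assuming `movieURL` points at your mp4 and `player` is your existing AVPlayer (the video-only case is symmetric, with AVMediaTypeVideo):

```objc
#import <AVFoundation/AVFoundation.h>

AVAsset *asset = [AVAsset assetWithURL:movieURL]; // the original mp4

// Build a composition containing just the source audio track.
AVMutableComposition *audioOnly = [AVMutableComposition composition];
AVMutableCompositionTrack *audioTrack =
    [audioOnly addMutableTrackWithMediaType:AVMediaTypeAudio
                           preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *sourceAudio =
    [[asset tracksWithMediaType:AVMediaTypeAudio] lastObject];

NSError *error = nil;
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                    ofTrack:sourceAudio
                     atTime:kCMTimeZero
                      error:&error];

// Swap it in when you want audio-only playback.
[player replaceCurrentItemWithPlayerItem:
            [AVPlayerItem playerItemWithAsset:audioOnly]];
```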