Show subtitles with an AVFoundation AVPlayer on OS X - objective-c

I'm trying to display subtitles when playing video using AVFoundation on OS X.
I've looked through the documentation and I can't find a way to enable a subtitle track. The API contains multiple references to subtitle tracks, which leads me to believe that it's supported.
On iOS the method -[AVPlayerItem selectMediaOption:inMediaSelectionGroup:] is used to enable subtitle tracks. This method isn't available in the 10.7 SDK. Is there another way to show subtitles?
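For reference, here is roughly what that call looks like where it is available (AVMediaCharacteristicLegible is the characteristic for subtitle tracks):

AVMediaSelectionGroup *group =
    [asset mediaSelectionGroupForMediaCharacteristic:AVMediaCharacteristicLegible];
AVMediaSelectionOption *option = [group.options objectAtIndex:0]; // e.g. English
[playerItem selectMediaOption:option inMediaSelectionGroup:group];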
EDIT:
QuickTime Player X has subtitle support: on opening this movie, for example, the subtitle submenu offers a choice of language, and the subtitles are displayed when English is chosen. This leads me to believe that they're accessible through the API...

I ran into this same issue myself. Unfortunately, the only way I found to do it, other than switching to QTKit, is to make a separate subtitle layer (a CATextLayer) and position it appropriately as a sublayer of the player layer. The idea is that you set up a periodic time observer to fire every second or so and update the subtitles, along with (optionally) any UI element you might have that shows the elapsed time in the video.
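A minimal sketch of that wiring, assuming you already have an AVPlayer and its AVPlayerLayer; -subtitleForTime: is a hypothetical helper that looks up the cue text for the current playback time in your parsed subtitle data:

// Overlay a CATextLayer on the player layer and drive it from a
// periodic time observer.
CATextLayer *subtitleLayer = [CATextLayer layer];
subtitleLayer.alignmentMode = kCAAlignmentCenter;
subtitleLayer.frame = CGRectMake(0, 10, playerLayer.bounds.size.width, 40);
[playerLayer addSublayer:subtitleLayer];

__weak typeof(self) weakSelf = self;
// Keep the returned token if you later need to remove the observer.
[player addPeriodicTimeObserverForInterval:CMTimeMake(1, 2)  // twice per second
                                     queue:dispatch_get_main_queue()
                                usingBlock:^(CMTime time) {
    subtitleLayer.string = [weakSelf subtitleForTime:time];  // nil clears the text
}];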
I created a basic SubRip (.srt) file parser class; you can find it here: https://github.com/sstigler/SubRip-for-Mac . Be sure to check the wiki for documentation. The class is available under the terms of the BSD license.
Another challenge you might run into is how to dynamically adjust the height of the CATextLayer to account for varying lengths of subtitles, and varying widths of the containing view (if you choose to make it user-resizable). I found a great CALayoutManager subclass that does this, and made some revisions to it in order to get it to work for what I was trying to do: https://github.com/sstigler/height-for-width .
I hope this helps.

Another way to add the subtitle file is to add an SCC caption file as an AVMediaTypeClosedCaption track to the composition using AVMutableCompositionTrack; the player will then handle it (I tried this and it worked). Alternatively, you can add a QuickTime text file (or any text file) as an AVMediaTypeText track and the player will show the subtitles. I don't know which subtitle file format the AVMediaTypeSubtitle track type is meant for.
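A rough sketch of the closed-caption route described above; movieURL and sccURL are assumed inputs, error handling is omitted, and it assumes AVURLAsset can open the .scc file directly:

AVURLAsset *videoAsset   = [AVURLAsset URLAssetWithURL:movieURL options:nil];
AVURLAsset *captionAsset = [AVURLAsset URLAssetWithURL:sccURL options:nil];

AVMutableComposition *composition = [AVMutableComposition composition];
CMTimeRange range = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);

// Copy the video track into the composition (audio is added the same way).
AVMutableCompositionTrack *video =
    [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                             preferredTrackID:kCMPersistentTrackID_Invalid];
[video insertTimeRange:range
               ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                atTime:kCMTimeZero error:NULL];

// Add the captions as an AVMediaTypeClosedCaption track; the player renders them.
AVMutableCompositionTrack *captions =
    [composition addMutableTrackWithMediaType:AVMediaTypeClosedCaption
                             preferredTrackID:kCMPersistentTrackID_Invalid];
[captions insertTimeRange:range
                  ofTrack:[[captionAsset tracksWithMediaType:AVMediaTypeClosedCaption] objectAtIndex:0]
                   atTime:kCMTimeZero error:NULL];

AVPlayer *player = [AVPlayer playerWithPlayerItem:
                       [AVPlayerItem playerItemWithAsset:composition]];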

Related

How to get first video frame?

I am writing a media player using VLCKit, and I want to generate a preview picture for the video. How can I do that using VLCKit, or perhaps another tool?
P.S. I've already tried AVFoundation and QTKit, but they didn't work: they complain about the video format (.mkv).
You want to use VLCKit's VLCMediaThumbnailer class. It does everything for you.
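A sketch of the delegate-based flow (method names as I recall them from VLCKit's headers, so verify them against your VLCKit version):

VLCMedia *media = [VLCMedia mediaWithURL:videoURL];
VLCMediaThumbnailer *thumbnailer =
    [VLCMediaThumbnailer thumbnailerWithMedia:media andDelegate:self];
[thumbnailer fetchThumbnail];   // asynchronous; results arrive via the delegate

// VLCMediaThumbnailerDelegate callbacks:
- (void)mediaThumbnailer:(VLCMediaThumbnailer *)thumbnailer
      didFinishThumbnail:(CGImageRef)thumbnail {
    // Wrap the CGImage in an NSImage for display.
}

- (void)mediaThumbnailerDidTimeOut:(VLCMediaThumbnailer *)thumbnailer {
    // Thumbnailing failed or took too long; handle the error.
}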

Purpose of SpriteKit sks files

I've watched the WWDC 2014 session (#608 "Best Practices for Building SpriteKit Games") a couple of times and I just want to clarify the purpose behind using .sks files. Am I supposed to put separate assets into each .sks file? Here's a little background on what I'm doing. I'm creating a Mac app that will teach piano students to play chords using a MIDI keyboard. Chords will appear on the screen, the students will play them one by one, and they'll get a score. Here's a mockup of what the app may look like. Side note: for those who know music, we're using the numeric version of chords instead of explicit names like Cm, etc.
Would I have a separate .sks file for each element of the UI? For instance, one for the green timer bar, one for the piano keys, etc. The example they use in the video is a pretty simple one. I am subclassing SKSpriteNode for the timer and the on-screen piano, so how would I handle the resources for those? They are not static objects; they will change either over time (the timer) or due to user input (the keyboard).
I really want to organize my project using best practices. Please help. Thanks in advance.
.sks files are serialized SKScene objects. The intent is to provide something like Interface Builder for constructing SKScene scenes visually. The common use case is to lay out complex backgrounds or levels and define starting positions. In many cases you would only have one file per scene. However, you can also use .sks files to organize and serialize conceptual components of a scene, as demonstrated in the versions of the Apple Adventure sample code released since the .sks format and scene editor were introduced with Xcode 6.
In the screenshot above, you could organize the project into sections that are fairly generic and reusable: say, one file for the keyboard and another for the HUD atop the scene. However, you could also put them all in one file and then duplicate the file for variations on a theme.
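If it helps, .sks files are NSKeyedArchiver archives, so a component file can be loaded directly ("Keyboard.sks" is a hypothetical file holding the piano-keyboard node tree rather than a full scene):

NSString *path = [[NSBundle mainBundle] pathForResource:@"Keyboard" ofType:@"sks"];
SKNode *keyboard = [NSKeyedUnarchiver unarchiveObjectWithFile:path];
[scene addChild:keyboard];   // position it within the scene as needed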

Is it possible play multiple clips using presentMoviePlayerViewControllerAnimated?

I have a situation where I'd like to play 2 video clips back to back using an MPMoviePlayerViewController displayed using presentMoviePlayerViewControllerAnimated.
The problem is that the modal view automatically closes itself as soon as the first movie is complete.
Has anyone found a way to do this?
Three options:
You may use MPMoviePlayerController and start playback of the 2nd (Nth) item after the previous one completes. This will, however, introduce a small gap between the videos caused by identification and pre-buffering of the content.
You may use AVQueuePlayer; AVQueuePlayer is a subclass of AVPlayer used to play a number of items in sequence (see the sketch after this list). See its reference for more.
You may use AVComposition to compose, at runtime, one video out of the two (or N) you need to play back, then use AVPlayer for the playback. Note that this works only with locally stored videos, not with remote content (streaming or progressive download).
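A minimal sketch of the AVQueuePlayer option (firstURL and secondURL are assumed NSURLs):

AVPlayerItem *first  = [AVPlayerItem playerItemWithURL:firstURL];
AVPlayerItem *second = [AVPlayerItem playerItemWithURL:secondURL];
AVQueuePlayer *player = [AVQueuePlayer queuePlayerWithItems:@[first, second]];
[player play];   // advances from one item to the next automatically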
It's not possible with presentMoviePlayerViewControllerAnimated alone. If the video assets are in the local file system, consider AVComposition.

Cocoa: How to morph a drag image while dragging

In Interface Builder.app (and some other cocoa apps), image dragging has a very nice/sexy effect of morphing the drag image while you drag a draggable item out of its window.
For example, in Interface Builder.app:
Show the Library Palette (⇧⌘L, or Tools Menu -> Library)
Drag an item out of the Library palette
NOTE: as you drag the item out of the Library Palette window, it morphs from an image of the original list item to an image of the icon of the dragged item.
I have fully implemented drag and drop in my Application using the normal Cocoa NSDragSource/NSDragDestination facilities.
However, I can't find a hook for doing this image morph while dragging. I'm returning the initial drag image by overriding
-[NSView dragImage:at:offset:event:pasteboard:source:slideBack:]
But this is only called at the beginning of the drag.
How do you signal that you would like to replace the current drag image (ideally using the sexy morph effect)?
You guys beat me to it. :-)
Yes, JLNDragEffectManager is open source (with attribution in your apps, please) and available on my blog. It should work fine as-is with no modification back to 10.5, but I'm not sure back any further. Others linked to it (and it's easily googleable), so to avoid self-congratulatory blog linking, I'll leave it at that.
Issues: One developer commented on (and submitted code to fix) the lack of dragging offset support. I've just not gotten around to posting the update. That's the only outstanding issue I'm aware of.
Improvements: I'd like to add multiple "zones" (say, one per document, so dragging from doc to doc keeps table rows looking like table rows, but anywhere outside doc windows turns them into a file icon a la HFS Promise Drag). Some day ...
Design: The post itself details the reasoning behind the design and the relatively simple morphing effect (cross-fade plus size are animated using basic NSAnimation, etc.). The code (the class as well as the demo app) is thoroughly blocked out and commented.
Won't link to my own post but would love the karma of upvotes for my effort. ;-)
UPDATE: Similar (but better-integrated) functionality is available as of 10.7. If you are targeting 10.7 or higher, it's best to use the new API. JLNDragEffectManager works fine on 10.7, so it can be used for earlier-targeted versions.
JLNDragEffectManager does exactly that. :)
The API does not support this well. Joshua Nozzi gives a method that looks reasonable in this weblog post.
IB's effect isn't that fancy. It's a crossfade and scale. Hold down shift to see it more clearly.
As of 10.7+, the current approach is to use the -enumerateDraggingItemsWithOptions:forView:classes:searchOptions:usingBlock: API on NSDraggingInfo. The documentation is really poor, but the ADC samples like MultiPhotoFrame or TableViewPlayground can give a good idea of how to use the new mechanism.
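A rough sketch of that mechanism, updating the drag image from a destination view's draggingEntered: (morphedImage is an assumed NSImage prepared elsewhere; AppKit animates the swap):

- (NSDragOperation)draggingEntered:(id <NSDraggingInfo>)sender {
    [sender enumerateDraggingItemsWithOptions:NSDraggingItemEnumerationConcurrent
                                      forView:self
                                      classes:@[[NSPasteboardItem class]]
                                searchOptions:@{}
                                   usingBlock:^(NSDraggingItem *item, NSInteger idx, BOOL *stop) {
        // Swap in the new image; the frame is in the view's coordinate space.
        [item setDraggingFrame:NSMakeRect(0, 0, 64, 64) contents:morphedImage];
    }];
    return NSDragOperationCopy;
}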

Best way to export a QTMovie with a fade-in and fade-out in the audio

I want to take a QTMovie that I have and export it with the audio fading in and fading out for a predetermined amount of time. I want to do this within Cocoa as much as possible. The movie will likely only have audio in it. My research has turned up a couple of possibilities:
Use the newer Audio Context Insert APIs: http://developer.apple.com/DOCUMENTATION/QuickTime/Conceptual/QT7-2_Update_Guide/NewFeaturesChangesEnhancements/chapter_2_section_11.html. This appears to be the most modern way to accomplish this.
Use the QuickTime audio extraction APIs to pull the audio track out of the movie, process it, and then put the processed audio back into the movie, replacing the original audio.
Am I missing some much easier method?
QuickTime has the notion of tween tracks. A tween track is a track that allows you to modify properties of another set of tracks (such as their volume).
See Creating a Tween Track in the QuickTime docs for an example of how to do this with a QuickTime audio track's volume.
There is also a more complete example called qtsndtween on the Apple Developer website.
Of course, all of this code requires using the QuickTime C APIs. If you can live with building a 32-bit-only application, you can get the underlying QuickTime C handles from a QTMovie, QTTrack, or QTMedia object using the quickTimeMovie, quickTimeTrack, and quickTimeMedia methods respectively.
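For reference, reaching the C-level types from QTKit looks like this (32-bit only; accessor names from memory, so double-check them against the QTKit headers):

Movie cMovie = [qtMovie quickTimeMovie];   // qtMovie is a QTMovie *
Track cTrack = [qtTrack quickTimeTrack];   // qtTrack is a QTTrack *
Media cMedia = [qtMedia quickTimeMedia];   // qtMedia is a QTMedia *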
Hopefully we'll get all the features of the QuickTime C APIs in the next version of QTKit, whenever that may be.