How to get the first video frame? - objective-c

I am programming a media player using VLCKit. I want to take a preview picture of the video. How can I do that using VLCKit, or maybe with other tools?
P.S. I've already tried AVFoundation and QTKit, but they didn't work; both complain about the video format (.mkv).

You want to use VLCKit's thumbnailer class. It does everything for you.
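Here is a minimal sketch of how that might look, assuming the VLCMediaThumbnailer class and its delegate callbacks (check the headers that ship with your VLCKit version for the exact signatures):

    #import <VLCKit/VLCKit.h>   // MobileVLCKit on iOS

    @interface ThumbnailGrabber : NSObject <VLCMediaThumbnailerDelegate>
    @property (nonatomic, strong) VLCMediaThumbnailer *thumbnailer; // keep a strong reference while fetching
    @end

    @implementation ThumbnailGrabber

    - (void)grabThumbnailForURL:(NSURL *)videoURL
    {
        VLCMedia *media = [VLCMedia mediaWithURL:videoURL];
        self.thumbnailer = [VLCMediaThumbnailer thumbnailerWithMedia:media andDelegate:self];
        [self.thumbnailer fetchThumbnail];
    }

    // Called when the thumbnail is ready.
    - (void)mediaThumbnailer:(VLCMediaThumbnailer *)mediaThumbnailer didFinishThumbnail:(CGImageRef)thumbnail
    {
        // Wrap the CGImage in an NSImage/UIImage and use it as the preview picture.
    }

    // Called if VLC could not produce a thumbnail in time.
    - (void)mediaThumbnailerDidTimeOut:(VLCMediaThumbnailer *)mediaThumbnailer
    {
        // Fall back to a placeholder image.
    }

    @end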

Related

Show subtitles with an AVFoundation AVPlayer on OS X

I'm trying to display subtitles when playing video using AVFoundation on OS X.
I've looked through the documentation and I can't find a way to enable a subtitle track. The API contains multiple references to subtitle tracks, which leads me to believe that it's supported.
On iOS the method -[AVPlayerItem selectMediaOption:inMediaSelectionGroup:] is used to enable subtitle tracks. This method isn't available in the 10.7 SDK. Is there another way to show subtitles?
EDIT:
QuickTime Player X has subtitle support; for example, on opening this movie the subtitle submenu appears to offer a choice of language, and will display them when English is chosen. This leads me to believe that they're included in the API...
I ran into this same issue myself. I found that unfortunately the only way to do it, other than switching to QTKit, is to make a separate subtitles layer (a CATextLayer) and position it appropriately as a sublayer to the player layer. The idea is that you set up a periodic time observer to trigger every second or so and update the subtitles, along with (and this is optional) some UI element you might have that shows what the elapsed time is in the video.
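A rough sketch of that observer setup (the subtitleParser property and -subtitleForTime: are hypothetical stand-ins for whatever .srt parser you use):

    // Subtitle layer positioned over the player layer.
    CATextLayer *subtitleLayer = [CATextLayer layer];
    subtitleLayer.alignmentMode = kCAAlignmentCenter;
    subtitleLayer.frame = CGRectMake(0, 20, playerLayer.bounds.size.width, 60);
    [playerLayer addSublayer:subtitleLayer];

    // Update the subtitle text (and, optionally, an elapsed-time label) twice a second.
    __weak typeof(self) weakSelf = self;
    self.timeObserver = [player addPeriodicTimeObserverForInterval:CMTimeMake(1, 2)
                                                              queue:dispatch_get_main_queue()
                                                         usingBlock:^(CMTime time) {
        NSString *text = [weakSelf.subtitleParser subtitleForTime:time]; // hypothetical helper
        subtitleLayer.string = text ?: @"";
    }];
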
I created a basic SubRip (.srt) file parser class; you can find it here: https://github.com/sstigler/SubRip-for-Mac. Be sure to check the wiki for documentation. The class is available under the terms of the BSD license.
Another challenge you might run into is how to dynamically adjust the height of the CATextLayer to account for varying lengths of subtitles, and varying widths of the containing view (if you choose to make it user-resizable). I found a great CALayoutManager subclass that does this, and made some revisions to it in order to get it to work for what I was trying to do: https://github.com/sstigler/height-for-width .
I hope this helps.
The way to add the subtitle file is to add an SCC subtitle as an AVMediaTypeClosedCaption track to the player item using an AVMutableCompositionTrack, and the player will handle it; I tried it and it worked. Alternatively, add a text file (QuickTime text, or any text) as an AVMediaTypeText track and the player will show the subtitles. I don't know which subtitle file formats the AVMediaTypeSubtitle track is meant for.
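A rough sketch of that composition approach (whether AVURLAsset actually exposes your caption file as an AVMediaTypeClosedCaption track depends on the format, so treat this as an outline rather than a guaranteed recipe):

    AVMutableComposition *composition = [AVMutableComposition composition];

    // Copy the video track from the movie asset into the composition.
    AVAssetTrack *videoTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] firstObject];
    AVMutableCompositionTrack *compVideo =
        [composition addMutableTrackWithMediaType:AVMediaTypeVideo
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    [compVideo insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                       ofTrack:videoTrack atTime:kCMTimeZero error:nil];

    // Copy the closed-caption track from the caption asset (e.g. loaded from the SCC file).
    AVAssetTrack *ccTrack = [[captionAsset tracksWithMediaType:AVMediaTypeClosedCaption] firstObject];
    AVMutableCompositionTrack *compCC =
        [composition addMutableTrackWithMediaType:AVMediaTypeClosedCaption
                                 preferredTrackID:kCMPersistentTrackID_Invalid];
    [compCC insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                    ofTrack:ccTrack atTime:kCMTimeZero error:nil];

    AVPlayer *player = [AVPlayer playerWithPlayerItem:[AVPlayerItem playerItemWithAsset:composition]];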

Is it possible to play audio instead of video from a YouTube video link?

My requirement is something like this: I have a YouTube video URL, and I have to play the audio of that URL instead of the video. Is it possible on iPhone/iPad to play just the audio from a video link? Please guide me.
I don't think you are allowed to by YouTube's terms of use; I am referring to the API terms. However, it is certainly technically possible. One hacky but simple solution would be to make an invisible view that plays the video: you would never see the video, but you should be able to hear the audio; just set the view's alpha low.
However, I do not recommend breaking the terms for the API, as I am sure there would be repercussions. Best of luck with your client.

360 degree video in MPMoviePlayerController

I am trying to develop an iPhone application which needs to show a 360 degree video like the one, and rotate the video as the phone moves. How can I do this? Is it possible to do this with a normal MPMoviePlayerController?
I don't think you can do this with a normal MPMoviePlayerController, but there are several libraries out there to achieve this. Have a look here:
PanoramaGL
Panorama 360
They work with OpenGL and you can embed them in your Objective-C code.
EDIT:
As @Mangesh Vyas kindly pointed out, those are intended for use with fixed images only. However, they might be a suitable starting point for embedding video as well, if you modify the code accordingly. They already handle direction, the accelerometer, etc., so you don't have to implement all that yourself.
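For illustration only, this is roughly the kind of device-motion handling those libraries wrap for you (the -setPanAngle:tiltAngle: call is a hypothetical method on whatever panorama/video view you end up with):

    #import <CoreMotion/CoreMotion.h>

    CMMotionManager *motionManager = [[CMMotionManager alloc] init];
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0;

    [motionManager startDeviceMotionUpdatesToQueue:[NSOperationQueue mainQueue]
                                        withHandler:^(CMDeviceMotion *motion, NSError *error) {
        if (!motion) return;
        // Use the device attitude to decide which part of the 360-degree frame to render.
        [panoramaView setPanAngle:motion.attitude.yaw tiltAngle:motion.attitude.pitch];
    }];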

Crop video from QTKit

I'm using the OSX QTKit sample code from here: http://bit.ly/mAaHGI
I'd like to crop the video, both on the screen and the saved file, to simulate different aspect ratios. What is the best way to do this?
It's a bit more involved than just calling a crop method, but Core Video allows you to manipulate the video stream. You can find the Core Video Programming Guide here:
http://developer.apple.com/library/mac/#documentation/GraphicsImaging/Conceptual/CoreVideo/CVProg_Intro/CVProg_Intro.html
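As a sketch of what that can look like in practice — assuming the sample delivers decompressed frames through a QTCaptureDecompressedVideoOutput delegate — you can wrap each frame in a CIImage and crop it before drawing or re-encoding it (the centered 4:3 rectangle here is just an example):

    - (void)captureOutput:(QTCaptureOutput *)captureOutput
      didOutputVideoFrame:(CVImageBufferRef)videoFrame
         withSampleBuffer:(QTSampleBuffer *)sampleBuffer
           fromConnection:(QTCaptureConnection *)connection
    {
        CIImage *frame = [CIImage imageWithCVImageBuffer:videoFrame];
        CGRect extent = [frame extent];

        // Centered 4:3 crop of the incoming frame.
        CGFloat croppedWidth = extent.size.height * 4.0 / 3.0;
        CGRect cropRect = CGRectMake(extent.origin.x + (extent.size.width - croppedWidth) / 2.0,
                                     extent.origin.y,
                                     croppedWidth,
                                     extent.size.height);
        CIImage *cropped = [frame imageByCroppingToRect:cropRect];

        // Draw `cropped` in the preview view, and render it back into a pixel buffer
        // before handing frames to the movie output if you also want the saved file cropped.
    }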

How to make a color video black and white in the iPhone SDK [duplicate]

In my application I have a color video which I want to turn into a black and white video. Is there any framework in iOS that supports this? If so, how do I apply that effect to a color video?
Can anyone help in this regard?
A bit late here, but you should check out OpenCV.
It can be compiled as a static library for the iPhone and can convert video (frame by frame) to grayscale or two-color.
You can start here. There are other resources for doing this floating around.
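A minimal sketch of the per-frame conversion (this is C++ OpenCV code, so it would live in an Objective-C++ .mm file; getting frames into and out of a cv::Mat is not shown):

    #import <opencv2/opencv.hpp>

    // Convert a single BGR frame to grayscale.
    static cv::Mat GrayscaleFrame(const cv::Mat &colorFrame)
    {
        cv::Mat gray;
        cv::cvtColor(colorFrame, gray, cv::COLOR_BGR2GRAY); // CV_BGR2GRAY in older OpenCV versions
        return gray;
    }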
Hi Lakshmi, as you may know, a video clip is made up of many frames (at least 24 per second), so you would have to use a tool like a movie maker that lets you see the video frame by frame and edit each frame.
See the open-source framework GPUImage: https://github.com/BradLarson/GPUImage/
The GPUImageGrayscaleFilter filter is what you want.
The framework includes sample code, so you can easily learn how to use GPUImageGrayscaleFilter in your own apps.
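A minimal sketch following the framework's SimpleVideoFileFilter pattern (check the current headers and samples for exact class names and setup, and keep strong references to the objects below for the duration of playback):

    GPUImageMovie *movieFile = [[GPUImageMovie alloc] initWithURL:movieURL];
    GPUImageGrayscaleFilter *grayscaleFilter = [[GPUImageGrayscaleFilter alloc] init];

    // Movie -> grayscale filter -> on-screen GPUImageView (filterView lives in your UI).
    [movieFile addTarget:grayscaleFilter];
    [grayscaleFilter addTarget:filterView];

    [movieFile startProcessing];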