Modifying video frames with QTKit and OpenGL - objective-c

I am working on a project where I would like to open a video (on a Mac) with QTKit. That part I can do no problem, but as I am playing it, I would like to edit or modify the video on the fly using OpenGL.
From what I understand, I should be able to intercept the frames and change them before they hit the display, but no matter what I do, I cannot seem to do so.

It sounds like you should have a look at Core Video and the display link mechanism.
You can basically get a callback on a high priority thread with the decoded frame in a CVImageBuffer and do whatever you like with it (including packing it up as a texture for OpenGL processing and display).
Apple provides documentation and demo code snippets on the developer sites.
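A minimal sketch of what that callback can look like, assuming a QTVisualContextRef (an OpenGL texture context) has already been attached to the movie and is passed in as the display link's user-data pointer; the names here are illustrative, not taken from Apple's sample code:

static CVReturn MyDisplayLinkCallback(CVDisplayLinkRef displayLink,
                                      const CVTimeStamp *now,
                                      const CVTimeStamp *outputTime,
                                      CVOptionFlags flagsIn,
                                      CVOptionFlags *flagsOut,
                                      void *context)
{
    // 'context' is assumed to be the QTVisualContextRef attached to the movie.
    QTVisualContextRef visualContext = (QTVisualContextRef)context;
    CVImageBufferRef frame = NULL;

    if (QTVisualContextIsNewImageAvailable(visualContext, outputTime)) {
        QTVisualContextCopyImageForTime(visualContext, NULL, outputTime, &frame);
        if (frame != NULL) {
            // With an OpenGL texture context this buffer is a CVOpenGLTextureRef:
            // bind it as a texture, run it through your shaders/filters, then draw.
            CVBufferRelease(frame);
        }
    }
    QTVisualContextTask(visualContext);   // give the context time for housekeeping
    return kCVReturnSuccess;
}

You register the callback with CVDisplayLinkSetOutputCallback() and start the link with CVDisplayLinkStart(); it then fires once per display refresh on its own high-priority thread.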

Related

Detecting what part of screen/application window is using OpenGL?

I am interested in identifying which part of an application is making use of OpenGL.
Take the example of Chrome, where a YouTube video is played in Flash (and gets rendered via OpenGL). I am interested in detecting only the area of the application where that OpenGL activity is happening.
If the condition is that I need to be inside the application, say by injecting into Chrome, I can do that too.
Let me know if I can clarify the question further.
You tagged your question as Mac OS X. In that case you can simply assume that everything on screen is drawn using OpenGL, because OpenGL is used as the graphics backend for the whole system.
There is a private API which allows you to find the surface on which OpenGL is rendering:
CG_EXTERN CGError CGSGetSurfaceBounds(CGSConnectionID, CGWindowID, CGSSurfaceID, CGRect* bounds);
Using this you can detect the specific area of the application that makes use of OpenGL.
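A rough usage sketch of that call; everything prefixed CGS is private and undocumented, and the typedefs and CGSMainConnectionID() prototype below are reverse-engineered guesses that can break with any OS update:

typedef int CGSConnectionID;
typedef int CGSSurfaceID;
CG_EXTERN CGSConnectionID CGSMainConnectionID(void);
CG_EXTERN CGError CGSGetSurfaceBounds(CGSConnectionID, CGWindowID, CGSSurfaceID, CGRect *bounds);

static void LogOpenGLSurfaceBounds(CGWindowID windowID, CGSSurfaceID surfaceID)
{
    CGRect bounds = CGRectZero;
    CGError err = CGSGetSurfaceBounds(CGSMainConnectionID(), windowID, surfaceID, &bounds);
    if (err == kCGErrorSuccess) {
        // 'bounds' is the rectangle of the OpenGL surface within the window.
        NSLog(@"OpenGL surface bounds: %@", NSStringFromRect(NSRectFromCGRect(bounds)));
    }
}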

ios app question video effects

I am trying to piece together a solution to let users take and edit videos in an app. I have seen the 8mm app and am wondering how they did it... and made it so smooth.
At first I was thinking the effects might have been a series of PNGs strung together like an animated GIF and then placed on top of the real video, but then I am at a loss as to how to merge the images with the video. Also, the app is so smooth that I think it has to be using some low-level Core Media framework, but I am not sure.
Any ideas or advice on where to begin?
Thanks
AVFoundation combined with OpenGL ES 2.0 (with shaders) provides great performance for adding effects to the camera / video in real time (and even better with iOS 5, but I can't say too much due to the NDA).
You should probably read most of the AVFoundation documentation to start with, because there is a lot going on. One method that might be of interest is this one:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection;
which allows you to work directly with blocks of data representing video information coming from the camera. You can then modify this data to change the video information, for example by adding additional content or pictures on top of the video frame. You can use OpenGL ES to do this processing.
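As a rough sketch (assuming the AVCaptureVideoDataOutput has been configured for kCVPixelFormatType_32BGRA), the delegate implementation can pull the pixel buffer out of the sample buffer and either touch the raw bytes directly or hand the buffer to an OpenGL ES texture:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) return;

    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    // Raw BGRA bytes of the current frame; modify them here, or upload the
    // buffer as an OpenGL ES texture and apply your effects in a shader.
    void *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);
    size_t width  = CVPixelBufferGetWidth(pixelBuffer);
    size_t height = CVPixelBufferGetHeight(pixelBuffer);
    // ... process baseAddress (width x height pixels) ...
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}

Keep the processing fast: this callback runs on the capture output's dispatch queue, and dropping frames there is what makes an app feel less smooth.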

360 degree video in MPMoviePlayerController

I am trying to develop an iPhone application which needs to show a 360-degree video like the one, and rotate the video according to the phone's movement. How can I do this? Is it possible to do this with a normal MPMoviePlayerController?
I don't think you can do this with a normal MPMoviePlayerController, but there are several libraries out there to achieve this. Have a look here:
PanoramaGL
Panorama 360
They work with OpenGL and you can embed them in your Objective-C code.
EDIT:
As @Mangesh Vyas kindly pointed out, those are intended for use with fixed images only. However, they might be a suitable starting point for embedding video as well, if you modify the code accordingly. They already handle direction, the accelerometer etc., so you don't have to implement all of that yourself.

Crop video from QTKit

I'm using the OSX QTKit sample code from here: http://bit.ly/mAaHGI
I'd like to crop the video, both on the screen and the saved file, to simulate different aspect ratios. What is the best way to do this?
It's a bit more involved than just calling a crop method, but Core Video allows you to manipulate the video stream. You can find the Core Video Programming Guide here:
http://developer.apple.com/library/mac/#documentation/GraphicsImaging/Conceptual/CoreVideo/CVProg_Intro/CVProg_Intro.html
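One possible approach, sketched here, is to wrap each decoded frame in a CIImage and crop it before drawing; imageBuffer is assumed to come from your Core Video callback, and the 4:3 rect is just an illustrative value:

CIImage *frame = [CIImage imageWithCVImageBuffer:imageBuffer];
CGRect extent = [frame extent];
CGFloat croppedWidth = extent.size.height * 4.0 / 3.0;   // simulate a 4:3 crop
CGRect cropRect = CGRectMake(extent.origin.x + (extent.size.width - croppedWidth) / 2.0,
                             extent.origin.y,
                             croppedWidth,
                             extent.size.height);
CIImage *cropped = [frame imageByCroppingToRect:cropRect];
// Draw 'cropped' into your view's CIContext / OpenGL context as usual.

That only covers the on-screen display; to get the same crop into the saved file you would have to re-encode the cropped frames when exporting.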

Best way to export a QTMovie with a fade-in and fade-out in the audio

I want to take a QTMovie that I have and export it with the audio fading in and fading out for a predetermined amount of time. I want to do this within Cocoa as much as possible. The movie will likely only have audio in it. My research has turned up a couple of possibilities:
Use the newer Audio Context Insert APIs: http://developer.apple.com/DOCUMENTATION/QuickTime/Conceptual/QT7-2_Update_Guide/NewFeaturesChangesEnhancements/chapter_2_section_11.html. This appears to be the most modern way to accomplish this.
Use the QuickTime audio extraction APIs to pull out the audio track of the movie, process it, and then put the processed audio back into the movie, replacing the original audio.
Am I missing some much easier method?
QuickTime has the notion of tween tracks. A tween track is a track that allows you to modify the properties of another track (such as its volume).
See Creating a Tween Track in the QuickTime docs for an example of how to do this with a QuickTime audio track's volume.
There is also a more complete example called qtsndtween on the Apple Developer website.
Of course, all of this code requires using the QuickTime C APIs. If you can live with building a 32-bit-only application, you can get the underlying QuickTime C handles from a QTMovie, QTTrack, or QTMedia object using the quickTimeMovie, quickTimeTrack, or quickTimeMedia methods respectively.
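For example (a sketch, assuming a 32-bit target and a movie that already has a sound track; movieURL is an assumed input):

QTMovie *qtMovie = [QTMovie movieWithURL:movieURL error:NULL];
QTTrack *audioTrack = [[qtMovie tracksOfMediaType:QTMediaTypeSound] lastObject];

Movie movie = [qtMovie quickTimeMovie];       // underlying QuickTime C Movie handle
Track track = [audioTrack quickTimeTrack];    // underlying QuickTime C Track handle
// 'movie' and 'track' can now be passed to the QuickTime C tween APIs.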
Hopefully we'll get all the features of the QuickTime C APIs in the next version of QTKit, whenever that may be.