Audio equivalent of QTMovieView? - objective-c

Does anyone know of an audio equivalent of QTMovieView? Something that allows playback and scrubbing of audio files; essentially QTMovieView without the movie...

A QTMovieView with its height set to [myMovieView controllerBarHeight], so that only the controller bar (the play controls and scrubber) is visible.
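A minimal sketch of that idea, assuming a QTMovieView outlet named myMovieView and an illustrative file path:

```objc
#import <QTKit/QTKit.h>

// Load an audio file into a QTMovie and collapse the view down to
// the controller bar, which still provides play/pause and scrubbing.
NSError *error = nil;
QTMovie *movie = [QTMovie movieWithFile:@"/path/to/audio.m4a" error:&error];
if (movie) {
    [myMovieView setMovie:movie];

    // Shrink the view to the controller bar's height so no empty
    // video area is displayed above it.
    NSRect frame = [myMovieView frame];
    frame.size.height = [myMovieView controllerBarHeight];
    [myMovieView setFrame:frame];
}
```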

Related

Can we directly use UserMedia stream as texture source in webgl?

I have seen a sample that uses a video tag to send image data from the webcam into a WebGL texture. It needs to create a video tag, and on each frame it has to check and update the texture, which does not sound very efficient. I'm curious whether we can use the stream from getUserMedia directly as the texture source, or whether there is any other way to let a shader access the webcam as a texture without creating a video tag.
Or is it not possible?
Yes, there are plenty of examples: https://www.chromeexperiments.com/webcam-input,webgl?page=0.

How to monitor audio on iPhone headphones from AVCaptureSession?

I want to be able to monitor audio on headphones before and during the capture of video.
I have an AVCaptureSession set up to capture video and audio.
My idea is to hook an AVCaptureAudioDataOutput instance up to the AVCaptureSession for this and process the CMSampleBufferRefs with a class implementing the AVCaptureAudioDataOutputSampleBufferDelegate protocol.
But I am not sure how to route the audio to the headphones from there.
What would be the most straightforward way to do this (highest-level frameworks, general approach)?
I ended up implementing this with an Audio Unit, the Remote I/O audio unit to be precise.
Apple's aurioTouch sample code provides a clear example of how to do this.
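A rough sketch of the Remote I/O setup, assuming the capture delegate queues samples somewhere the render callback can read them (the queueing itself is omitted):

```objc
#import <AudioUnit/AudioUnit.h>

// Render callback: runs on the audio thread. Fill ioData with
// inNumberFrames of the audio previously queued by the
// AVCaptureAudioDataOutputSampleBufferDelegate.
static OSStatus RenderCallback(void *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp,
                               UInt32 inBusNumber,
                               UInt32 inNumberFrames,
                               AudioBufferList *ioData)
{
    // Copy captured samples into ioData->mBuffers here.
    return noErr;
}

static AudioUnit StartRemoteIOUnit(void)
{
    AudioComponentDescription desc = {
        .componentType         = kAudioUnitType_Output,
        .componentSubType      = kAudioUnitSubType_RemoteIO, // iOS I/O unit
        .componentManufacturer = kAudioUnitManufacturer_Apple,
    };
    AudioComponent comp = AudioComponentFindNext(NULL, &desc);
    AudioUnit unit = NULL;
    AudioComponentInstanceNew(comp, &unit);

    // Attach the render callback to the output bus's input scope.
    AURenderCallbackStruct callback = { RenderCallback, NULL };
    AudioUnitSetProperty(unit, kAudioUnitProperty_SetRenderCallback,
                         kAudioUnitScope_Input, 0,
                         &callback, sizeof(callback));

    AudioUnitInitialize(unit);
    AudioOutputUnitStart(unit);
    return unit;
}
```

aurioTouch shows the same structure in full, including format negotiation, which is skipped here.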

How can I determine the pixel aspect ratio of a media file using AVFoundation on Lion?

As part of a 64 bit Objective-C program I need to determine some parameters of media files on Lion. For example for a video file, what is the pixel aspect ratio and is the video anamorphic? I've been searching the AVFoundation API with no luck. Any ideas on how to determine this information?
Thanks,
Barrie
Although I am no AVFoundation expert, I would start by looking at the AVVideoWidthKey and AVVideoHeightKey dictionary keys in the video settings; see http://developer.apple.com/library/IOs/#documentation/AVFoundation/Reference/AVFoundation_Constants/Reference/reference.html
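An alternative sketch (untested, and the extension key lookup is an assumption worth verifying): the pixel aspect ratio of the *source* material lives in the video track's format description, whereas AVVideoWidthKey/AVVideoHeightKey describe encoding output settings.

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>

// fileURL is assumed to point at the media file being inspected.
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];
NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *track = [videoTracks objectAtIndex:0];

for (id descObject in [track formatDescriptions]) {
    CMVideoFormatDescriptionRef fmt =
        (CMVideoFormatDescriptionRef)descObject;

    // A horizontal/vertical spacing other than 1:1 here indicates
    // anamorphic material.
    CFDictionaryRef par = CMFormatDescriptionGetExtension(fmt,
        kCMFormatDescriptionExtension_PixelAspectRatio);
    if (par != NULL) {
        NSLog(@"pixel aspect ratio: %@", (NSDictionary *)par);
    }
}
```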

Modifying video frames with QTKit and OpenGL

I am working on a project where I would like to open a video (on a Mac) with QTKit. That part I can do without a problem, but as the video plays I would like to edit or modify it on the fly using OpenGL.
From what I understand, I should be able to intercept the frames and change them before they hit the display, but no matter what I do, I cannot seem to manage it.
It sounds like you should have a look at Core Video and the display link mechanism.
You can basically get a callback on a high priority thread with the decoded frame in a CVImageBuffer and do whatever you like with it (including packing it up as a texture for OpenGL processing and display).
Apple provides documentation and demo code snippets on the developer sites.
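The display link setup can be sketched as follows; fetching the actual decoded frame (e.g. from a QTVisualContext) is left as a comment because it depends on how the movie is wired up:

```objc
#import <CoreVideo/CoreVideo.h>

// Called once per display refresh on a high-priority thread.
static CVReturn DisplayLinkCallback(CVDisplayLinkRef displayLink,
                                    const CVTimeStamp *inNow,
                                    const CVTimeStamp *inOutputTime,
                                    CVOptionFlags flagsIn,
                                    CVOptionFlags *flagsOut,
                                    void *context)
{
    // Grab the current decoded frame as a CVImageBuffer here, upload
    // it as an OpenGL texture, run your shader pass, and draw.
    return kCVReturnSuccess;
}

static CVDisplayLinkRef StartDisplayLink(void)
{
    CVDisplayLinkRef displayLink = NULL;
    CVDisplayLinkCreateWithActiveCGDisplays(&displayLink);
    CVDisplayLinkSetOutputCallback(displayLink, DisplayLinkCallback, NULL);
    CVDisplayLinkStart(displayLink);
    return displayLink;
}
```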

Best way to export a QTMovie with a fade-in and fade-out in the audio

I want to take a QTMovie that I have and export it with the audio fading in and fading out for a predetermined amount of time. I want to do this within Cocoa as much as possible. The movie will likely only have audio in it. My research has turned up a couple of possibilities:
1. Use the newer Audio Context Insert APIs: http://developer.apple.com/DOCUMENTATION/QuickTime/Conceptual/QT7-2_Update_Guide/NewFeaturesChangesEnhancements/chapter_2_section_11.html. This appears to be the most modern way to accomplish this.
2. Use the QuickTime audio extraction APIs to pull out the audio track of the movie, process it, and then put the processed audio back into the movie, replacing the original audio.
Am I missing some much easier method?
QuickTime has the notion of tween tracks. A tween track is a track that allows you to modify the properties of another set of tracks (such as the volume).
See Creating a Tween Track in the QuickTime docs for an example of how to do this with a QuickTime audio track's volume.
There is also a more complete example called qtsndtween on the Apple Developer website.
Of course, all of this code requires using the QuickTime C APIs. If you can live with building a 32-bit-only application, you can get the underlying QuickTime C handles from a QTMovie, QTTrack, or QTMedia object using the "quickTimeMovie", "quickTimeTrack", or "quickTimeMedia" methods respectively.
Hopefully we'll get all the features of the QuickTime C APIs in the next version of QTKit, whenever that may be.
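A minimal sketch of that bridge (32-bit only): drop from QTKit down to the C handles, which is the level where the tween-track APIs operate.

```objc
#import <QTKit/QTKit.h>
#import <QuickTime/QuickTime.h>

// Return the underlying C Movie handle for a QTKit movie object.
// From here the QuickTime C APIs (track creation, tween media, etc.)
// can be applied directly.
static Movie UnderlyingMovie(QTMovie *qtMovie)
{
    return [qtMovie quickTimeMovie];
}

// Likewise for an individual track.
static Track UnderlyingTrack(QTTrack *qtTrack)
{
    return [qtTrack quickTimeTrack];
}
```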