Play video on Cocoa application - objective-c

I have tried to use AVPlayer to play video in my Cocoa application. However, it turns out that AVPlayer can play only a restricted set of video types; for example, it will not play .avi or .flv files. I suppose AVPlayer plays only the video types that QuickTime plays (i.e., quite a few!).
So I am looking for alternatives: I need my software to play the widest possible variety of videos, since I cannot know in advance which kinds of video my users will open.
I am on OS X 10.10 (Yosemite). What alternatives are available to me?
Thank you very much indeed

You are correct in thinking AVPlayer only plays QuickTime-compatible media.
Things I can think of, off the top of my head (which may / may not be suitable, but I don't know what other restrictions you have):
Offer to transcode the video using ffmpeg or similar.
See if "Perian" helps (I'm not sure the AVPlayer machinery will see it). Perian is an older 32-bit QuickTime plugin that adds many more codecs. It's deprecated and more or less going away, so it's not a long-term solution... if it works at all.
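The transcoding suggestion is straightforward to wire up: shell out to the ffmpeg binary and re-encode the incompatible file into an H.264/AAC MP4, which AVPlayer handles natively. A minimal sketch of building that command line (the flags are standard ffmpeg options; the file names are hypothetical):

```python
# Sketch: build an ffmpeg command line that transcodes an arbitrary
# input (e.g. .avi or .flv) into H.264/AAC in an MP4 container,
# which AVPlayer can play. File names here are hypothetical.
def build_transcode_cmd(src, dst):
    return [
        "ffmpeg",
        "-i", src,                  # input file, any format ffmpeg knows
        "-c:v", "libx264",          # re-encode video as H.264
        "-c:a", "aac",              # re-encode audio as AAC
        "-movflags", "+faststart",  # move the moov atom up front so playback can start early
        dst,                        # output container inferred from the extension
    ]

cmd = build_transcode_cmd("input.flv", "output.mp4")
print(" ".join(cmd))
```

The command would then be run with something like `subprocess.run(cmd, check=True)` (or `NSTask` from a Cocoa app), assuming an ffmpeg binary is bundled or on the PATH.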

Related

Does MobileVLCKit support the AirPlay option on iOS?

I have implemented VLC to play live streaming video, and now I want to play it on Apple TV via the AirPlay option from my streaming screen.
Can you please help me out?
Thanks
From your question it's not clear what you mean by implementing VLC. Apple provides a framework for streaming live video (AVPlayer), which supports Picture in Picture and AirPlay.
To make things clearer, please elaborate on what exactly you are trying to do.
Support for AirPlay will be available in libvlc and VLCKit 4.0 next year. It will allow you to play anything from any source on Apple TV and, if needed, will convert the media on the fly. Thus, AirPlay support will match what's possible with Chromecast right now.
With VLCKit 3, there is no good way to do it. You could use display mirroring, but that has a bad impact on performance and video quality. Audio-only output via AirPlay / RAOP works fine and the quality is good; it even supports multichannel now.

Mac OS X equivalent for DirectShow, GraphEdit

New to Mac OS X, familiar with Windows. Windows has DirectShow, a good number of built-in filters, COM programming, and GraphEdit for very fast prototyping and snooping on the graphs you've constructed in code.
I'm now about to move to the Mac to work with cameras, webcams, microphones, color spaces, files, splitting, synchronization, rendering, file reading, file saving, and many of the things I've come to take for granted with DirectShow when putting together applications for live performance. On the Mac side, so far I've found ... nothing! Either I don't know where to look, or I'm having the toughest time reconciling the Mac's reputation for ease of handling media with a coherent programmatic ability to get in there and start messin' with media-manipulatin' building blocks.
I've seen some weak suggestions to use gstreamer or some library for QT but I can't bring myself to believe that this is the Apple way to go. And I've come across some QuickTime documentation but I'm not looking to do transitions, sprites, broadcasting, ...
Having a brain trained on DirectShow means I don't even know how Apple thinks about providing DirectShow-like functionality. That means I don't know the right keywords and don't even know where to look. Books? Bought a few. Now I might be able to write some code that can edit your sister's wedding video (if I can't make decent headway on this topic I may next be asking what that'd be worth to you), but for identifying what filters are available and how to string them together ... nothing. Suggestions?
Video handling is going through a huge transition on the Mac at the moment. QuickTime is very old, but also big and powerful, so it's been undergoing an incremental replacement process for the past 5 years or so.
That said, QTKit is the QuickTime subset (capture, playback, format conversion and basic video editing) which is supported going forward. The legacy QuickTime APIs are still there for the moment, and probably will remain at least until its major features are available elsewhere, but are 32-bit only. For some involved video stuff you may end up needing to use it in places.
At the moment, iOS is ahead of the Mac because it could start from scratch with AV Foundation. The future of the Mac media frameworks will probably either be AV Foundation directly (with QTKit being a lightweight shim over the top) or an extension of QTKit that looks very similar.
For audio there's Core Audio which is on Mac and iOS and isn't going away any time soon. It's quite powerful but somewhat obtuse in places. Luckily online support is very good; the mailing list is an essential resource.
For filters and frame-level processing you've got Core Video as someone else mentioned, as well as Core Image. For motion graphics there's Quartz Composer which includes a graphical editor and a plugin architecture to add your own patches. For programmatic procedural animation and easily mixing rendering models (OpenGL, Quartz, video, etc.) there's Core Animation.
In addition to all of these, of course there's no reason you can't use open source libraries where the built-in stuff doesn't do what you want.
To address your comment below:
In QuickTime (and QTKit), individual data types like audio and video are represented as tracks. It may not be immediately clear that QuickTime can open audio as well as video file formats. A common way to combine audio and video would be:
Create a QTMovie with your video file.
Create a QTMovie with your audio file.
Take the QTTrack object representing the audio and add it to the QTMovie with the video in it.
Flatten the movie, so it doesn't simply contain a reference to the other movie but actually contains the audio data.
Write the movie to disk.
Here's an example from Blender. You'll see how the A/V muxing is done in the end_qt function. There's also some use of Core Audio in there (AudioConverter*). (There's some classic QuickTime export code in quicktime_export.c but it doesn't seem to do audio.)

How to compress video data while taking video from the camera?

Is there any way to compress video data while capturing it from the camera? There is a huge difference in size between video taken with the camera and video from the photo library. I want to reduce the memory used when taking video from the camera. Is there any way?
Thanks
I filed a bug report with Apple on this matter; you could do the same, since it seems the more reports Apple gets from developers, the faster they fix things.
No matter what videoQuality level you set on the UIImagePickerController, it always defaults to High when recording from the camera. Videos chosen from the user library respect your choice and compress really well with the hardware H.264 encoder present on the 3GS and up.
You can use FFmpeg to capture video directly from the camera, compress it, and store it to a file.
FFmpeg is also available as a standalone console application that doesn't need any other DLLs.
Of course, this isn't Objective-C, but it can be very useful in your case.
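Whatever tool does the re-encoding, the size reduction comes from capping the bitrate (and optionally the resolution). A hedged sketch of assembling such an FFmpeg invocation in Python (the flags are standard ffmpeg options; the paths and bitrate values are hypothetical):

```python
import subprocess


def build_compress_cmd(src, dst, video_bitrate="1000k", audio_bitrate="128k"):
    """Build an ffmpeg command that re-encodes a capture at a reduced
    bitrate to shrink the file. Running it requires an ffmpeg binary
    on the PATH; src/dst are hypothetical paths."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libx264", "-b:v", video_bitrate,  # cap the video bitrate
        "-c:a", "aac", "-b:a", audio_bitrate,      # cap the audio bitrate
        dst,
    ]


cmd = build_compress_cmd("camera.mov", "compressed.mp4")
print(" ".join(cmd))
# To actually run it: subprocess.run(cmd, check=True)
```

Lowering `-b:v` (or adding a scale filter) trades quality for size; the values above are only illustrative starting points.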

How to programmatically test for audio sync

I have a multimedia application that, among other things, converts video using FFmpeg. Video conversion being the pain that it is, I have in my test suites some tests that check our ability to convert various video formats, with emphasis on sample videos known not to work.
A common problem we've noticed from users is that some videos end up with their audio desynched after being processed, and I am looking for a way to check this in my tests.
Extracting the audio portion of the resulting videos is not a problem.
My best idea so far would be to check the offset of the first non-silence at both the beginning and end and compare each between the two videos, but I'm hoping someone smart has a better idea.
The application language/environment is Java, but since this is for testing, I'm free to use any toolset.
The basic problem is likely that the video and audio are different lengths. Extract the audio and test its length vs. the video length. If they are significantly different (more than maybe .05 sec, I'm not really sure what is detectable as "off"), then there's a problem.
To fix it, re-encode the audio to match the video length, and then put the audio and video back into a container format.
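The length comparison above is easy to automate: extract the two track durations (e.g. with ffprobe) and assert they agree within a tolerance. A minimal sketch of the comparison logic, assuming the durations have already been measured; the 0.05-second tolerance echoes the answer's guess at what is noticeable:

```python
def durations_in_sync(video_duration, audio_duration, tolerance=0.05):
    """Flag a likely A/V desync when the track lengths differ by more
    than `tolerance` seconds (0.05 s is a guess at what listeners can
    detect, per the answer above)."""
    return abs(video_duration - audio_duration) <= tolerance


# In a real test the durations would come from ffprobe, e.g.:
#   ffprobe -v error -select_streams v:0 \
#           -show_entries stream=duration -of csv=p=0 out.mp4
assert durations_in_sync(120.00, 120.03)      # 30 ms drift: acceptable
assert not durations_in_sync(120.00, 120.40)  # 400 ms drift: flag it
```

Since the test environment is Java, the same check is a one-line `Math.abs(videoDur - audioDur) <= tolerance` after parsing ffprobe's output.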

Video Codecs supported in UPnP AV

I may just be confused about how this all works... in which case, please explain it to me. But what video codecs are supported by the UPnP AV standard? Or is it on a per-device (client) basis? I want to create an app to send video data to a UPnP device (Xbox, PS3, etc.) but am not really sure which video codec I should target... It can be anything; I just want to know if there's a way of knowing that it will work on everything.
Edit: Ok, so I will clarify that I will be able to choose whatever video format I want, and once I do that will be all I plan on supporting, therefore I don't need transcoding. My main point was that I was hoping there was some "standard" format used that would be supported on ALL devices so that I could just pick that and be done with it... Obviously this is not the case... but is there any sort of unofficial codec that most devices support? Is there a list of devices and supported codecs anywhere?
Also, how does DLNA fit into this? If I understand correctly, it's sort of a subset of UPnP AV (plus some other stuff), and most UPnP devices I've seen are also DLNA compliant... so would just using whatever codecs DLNA supports be a way to have common ground?
Doug is right, it depends on the client device.
You could build it so that your server transcodes files on the fly to make them available on the UPnP server, though. It would be easier just to choose a file format that is compatible with all of your devices (if the list is small enough for that to be possible).
For example, you cannot play H.264-encoded video in an MKV container on Xbox 360 or PS3 right now. However, you can transcode the files to a format that IS supported. There are many guides available online for transcoding these files for the PS3/360 and for what formats each device supports.
Here are a couple example guides:
Xbox 360 Conversion Guide
PS3 Conversion Guide
From what I understand, it really depends on which codecs the device has installed, so it can be anything supported by your device.
I know, for example, that the PS3 supports DivX and the Xbox does not (unless you have Windows 7 to transcode for you).
For DLNA devices, some video formats are mandatory and some are optional.
Home Devices
Mandatory: MPEG2
Optional: MPEG1, MPEG4, WMV9
Mobile/Handheld Devices
Mandatory: MPEG4 AVC (with AAC LC associated audio)
Optional: VC1, H.263, MPEG4 Part 2, MPEG2, MPEG4 AVC (BSAC or other associated audio)
Any other video codec not mentioned here is, in my understanding, optional.
check it here:
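Encoding the mandatory-profile list above as data makes the original point concrete: no single codec is mandatory for both DLNA device classes, so a "works everywhere" choice doesn't exist on paper. A small sketch (the dictionary mirrors the list above; the function name is hypothetical):

```python
# Mandatory video formats per DLNA device class, per the list above.
DLNA_MANDATORY = {
    "home": {"MPEG2"},
    "mobile": {"MPEG4 AVC"},
}


def mandatory_everywhere(codec):
    """True only if every DLNA device class is required to support `codec`."""
    return all(codec in required for required in DLNA_MANDATORY.values())


# MPEG2 is mandatory for home devices but not mobile ones, so even it
# fails the "guaranteed on all devices" test.
print(mandatory_everywhere("MPEG2"))
```

In practice this means you either pick a codec for the devices you actually target, or transcode per device as the other answers suggest.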