MPMoviePlayerController H.264 and Multiple Audio Streams - objective-c

I am trying to find some information on how to support multiple audio streams from a single H.264/MPEG4 video file.
So far I have found very little information when googling, so I was wondering if anybody has any information that may shed some light.
I would like to display the video and then have a choice of which audio stream to play from the H.264 file.
Anybody?

MPMoviePlayer cannot be used to play a movie with multiple audio streams.
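One alternative route, not mentioned in the original answer, is AVFoundation: copy the video track plus one chosen audio track into an AVMutableComposition and play that with AVPlayer. A rough sketch in Swift (the question is tagged objective-c, but the idea is the same; the file path and chosen track index are placeholders):

```swift
import AVFoundation

let asset = AVURLAsset(url: URL(fileURLWithPath: "/path/to/movie.mp4"))
let composition = AVMutableComposition()
let range = CMTimeRange(start: .zero, duration: asset.duration)

do {
    // Copy the video track into the composition.
    if let videoTrack = asset.tracks(withMediaType: .video).first,
       let compVideo = composition.addMutableTrack(withMediaType: .video,
                                                   preferredTrackID: kCMPersistentTrackID_Invalid) {
        try compVideo.insertTimeRange(range, of: videoTrack, at: .zero)
    }

    // Copy whichever audio stream the user selected; index 1 is purely illustrative.
    let audioTracks = asset.tracks(withMediaType: .audio)
    if audioTracks.count > 1,
       let compAudio = composition.addMutableTrack(withMediaType: .audio,
                                                   preferredTrackID: kCMPersistentTrackID_Invalid) {
        try compAudio.insertTimeRange(range, of: audioTracks[1], at: .zero)
    }
} catch {
    print("Failed to build composition: \(error)")
}

let player = AVPlayer(playerItem: AVPlayerItem(asset: composition))
```

Switching audio streams then amounts to rebuilding the composition with a different audio track.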

Related

how to get Dash segments of .mp4 video file

I have an mp4 video file which I need to load on my page. I am using MSE for that, but I don't know how I can get my video in segments with .m4s extensions, with header.m4s as the parent segment storing all the information about my video file. Please help.
I believe that if a video is embedded on a website, it can be downloaded.
The only thing you could do is make it difficult to download.
This might be helpful. It says using a Flash video is a good option to make downloading videos a bit difficult. I have never used it, but you could give it a try.
To protect the video, you should probably not try to artificially obfuscate the video loading. MPEG-DASH supports encrypted MP4 video and Common Encryption (CENC); that could be something you look into.

Play video on Cocoa application

I have tried to use AVPlayer to play a video with my Cocoa application. However, it turns out that AVPlayer is capable of playing only a very restricted variety of video types. For example, .avi files and .flv files will not be played by AVPlayer. I suppose that AVPlayer will play only those video types that QuickTime will play (i.e., quite a few!).
So I was looking for alternatives: I need my software to play the largest possible variety of videos, since I cannot know in advance which kinds of videos my users will open.
I have Mac OS X 10.10 (Yosemite). What alternatives are available to me?
Thank you very much indeed
You are correct in thinking AVPlayer only plays QuickTime-compatible media.
Things I can think of off the top of my head (which may or may not be suitable; I don't know what other restrictions you have):
Offer to transcode the video using ffmpeg or similar (see the sketch after these suggestions).
See if "Perian" helps (I'm not sure the AVPlayer machinery will see it). Perian is an older QuickTime (32-bit) plugin that includes many more codecs. It's deprecated and more or less going away, so it's not a long-term solution... if it works at all.

How to compress Video data while taking video from camera?

Is there any way to compress the video data while recording from the camera? There is a huge difference in size between video recorded with the camera and video from the photo library. I want to reduce the memory used while taking video from the camera. Is there any way?
Thanks
I filed a bug report with Apple on this matter; you could do the same, as it seems the more reports from developers, the faster they fix things up.
No matter what videoQuality level you set on the UIImagePickerController, it always defaults to High when recording from the camera. Videos chosen from the user's library respect your choice and compress really well with the hardware H.264 encoder present on the 3GS and up.
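A workaround not mentioned above is to re-export the captured movie at a lower-quality preset with AVAssetExportSession once recording finishes. A rough Swift sketch, with the file URLs as placeholders:

```swift
import AVFoundation

// Re-encode a captured movie with a lower-quality preset to shrink it.
func compress(input: URL, output: URL, completion: @escaping (Bool) -> Void) {
    let asset = AVURLAsset(url: input)
    guard let export = AVAssetExportSession(asset: asset,
                                            presetName: AVAssetExportPresetMediumQuality) else {
        completion(false)
        return
    }
    export.outputURL = output
    export.outputFileType = .mp4
    export.exportAsynchronously {
        completion(export.status == .completed)
    }
}
```

This does not avoid the large temporary recording, but it reduces what you keep or upload afterwards.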
You can use FFmpeg to get video directly from the camera, compress it, and store it in a file.
Also, FFmpeg is a standalone console application, and it doesn't need any other DLLs.
Of course, this isn't Objective-C, but it can be very useful in your case.

Streaming encoded video in Adobe AIR Application

I am developing a desktop application in Adobe AIR that will be used to stream the user's camera video to a Wowza media server. I want to encode the video on the fly, meaning transmit H.264-encoded video instead of the default Flash Player encoding, for quality purposes. Is there any way to do this?
Waiting for help from people around,
Rick
H.264 encoding is usually done in native C or C++ code because it is a CPU-intensive set of algorithms. The source code for x264 can give you an idea of the code required, but it is a tough read if you start from scratch. Here is a book to get you started, or you can read the original AVC standard if you suffer from insomnia.

Is there a way to stream audio from MIC and play that stream in Silverlight

So I want to stream the audio from a mic using NAudio and then pass that stream to WCF, which a Silverlight app can consume to broadcast the live audio. I want the latency to be as low as possible.
Any suggestions? Or if someone has already done it, please point me to the source. Thanks in advance.
What you are asking is certainly possible, but it will be a fair amount of work to do.
NAudio can handle capturing the microphone audio.
At the Silverlight end you can play custom audio formats (in this case PCM) using a custom media element streaming source. See this one: http://code.msdn.microsoft.com/wavmss
I suspect latency would not be very good. You can reduce it by keeping the buffer sizes small. Also bear in mind that WAV is not a very efficient format to be sending over the network.
To have latency as low as possible, you should use netTcpBinding and stream your audio in binary format. I would use a MemoryStream for this and play with the buffer size to figure out what gives the best performance. Also, try different audio formats; this too depends on the audio quality you expect.