Extracting a video from a streaming server with AVFoundation in iOS? - objective-c

I was able to display a video in a UIView using an NSURL from a webpage, but I need to find a way to stream a video from a web server to my iPhone using AVFoundation. I am a little confused about what I have to do to make this work. Any pointers would be helpful. Thank you.

If you just need simple movie playback, take a look at MPMoviePlayerController. For more advanced needs, start with AVPlayer.
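For example, here is a minimal AVPlayer sketch; the view controller name and stream URL are hypothetical placeholders for your own:

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface StreamViewController : UIViewController
// Keep a strong reference so the player isn't deallocated during playback
@property (nonatomic, strong) AVPlayer *player;
@end

@implementation StreamViewController
- (void)viewDidLoad {
    [super viewDidLoad];
    // Hypothetical stream URL; replace with your server's URL (HLS works well on iOS)
    NSURL *streamURL = [NSURL URLWithString:@"https://example.com/stream.m3u8"];
    self.player = [AVPlayer playerWithURL:streamURL];

    AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:self.player];
    playerLayer.frame = self.view.bounds;
    playerLayer.videoGravity = AVLayerVideoGravityResizeAspect;
    [self.view.layer addSublayer:playerLayer];

    [self.player play];
}
@end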

Related

Agora WebRTC Change video call view

I am working on a react-native project which uses Agora.io for video calling.
In a video call it shows my camera feed fullscreen and the receiver's feed as a thumbnail, which is the opposite of the correct layout.
I want to know: is this just the way Agora works, or is it possible to fix it? Even on their website the images are laid out that way (see the image on their home page).
I appreciate any help with fixing this.
It sounds like you are rendering the local video stream in the larger view. You need to switch this: render the remote video stream in the larger view and the local video stream in the thumbnail view.

MPMoviePlayer video quality is not the same as the original video quality on YouTube

I am trying to play YouTube videos in an iOS app, and playback works fine. The issue is quality: the video quality in MPMoviePlayer is not the same as it is on YouTube.
I have tried different approaches, like YTPlayerView, LBYouTube, etc., but did not find a solution.
Please help me if anyone has a solution for this.
Thanks in advance.
I read this somewhere, and it works fine.
Try LBYouTubePlayerViewController. It is a subclass of MPMoviePlayerViewController.
LBYouTubeView is a small view that is able to display YouTube videos in an MPMoviePlayerController, and it gives you the choice between high quality and standard quality.
LBYouTubeView doesn't use UIWebView, which makes it faster.
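A rough sketch of how that might look; the initializer and the quality constant here are assumptions based on the project's README and may differ between versions of LBYouTubeView:

#import <MediaPlayer/MediaPlayer.h>
#import "LBYouTubePlayerViewController.h" // from the LBYouTubeView project

// Assumed API: an initializer taking a YouTube URL and a quality constant
NSURL *videoURL = [NSURL URLWithString:@"https://www.youtube.com/watch?v=VIDEO_ID"];
LBYouTubePlayerViewController *controller =
    [[LBYouTubePlayerViewController alloc] initWithYouTubeURL:videoURL
                                                      quality:LBYouTubeVideoQualityLarge];
[self presentMoviePlayerViewControllerAnimated:controller];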

Playing back a CMSampleBuffer

I'm trying to create an application that streams video and audio to another computer.
On the "server" side, I'm able to capture video and audio (using AVCaptureSession), preview them, and send them over the network using the capture delegate, then reconstruct everything on the other side.
On the "client" side I then have a CMSampleBuffer that contains audio and video, and I can't find a way to play it back. I've looked at AVPlayer and AVCaptureSession, but I don't understand how to feed either one a CMSampleBuffer as input.
Any ideas or links? Thank you!
You could try AVSampleBufferDisplayLayer.
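For the video buffers, a minimal sketch (assuming sampleBuffer is a playable video CMSampleBuffer arriving from your network code, enqueued into a layer you set up once):

#import <AVFoundation/AVFoundation.h>

// One-time setup: add the display layer to your view hierarchy
AVSampleBufferDisplayLayer *displayLayer = [[AVSampleBufferDisplayLayer alloc] init];
displayLayer.frame = self.view.bounds;
displayLayer.videoGravity = AVLayerVideoGravityResizeAspect;
[self.view.layer addSublayer:displayLayer];

// Then, for each video CMSampleBuffer received from the network:
if (displayLayer.status == AVQueuedSampleBufferRenderingStatusFailed) {
    [displayLayer flush]; // recover from a rendering failure before enqueuing more
}
[displayLayer enqueueSampleBuffer:sampleBuffer];

Note that AVSampleBufferDisplayLayer only renders video; the audio buffers need a separate playback path (for example an Audio Queue).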

How to embed a video player in NSView?

I want to embed a video player in an NSView. I also want to be able to play different files by passing in their paths. Is this possible in a Mac application?
For QTKit, the QTKit Application Tutorial - Creating a Simple QTKit Media Player Application takes you through the basic steps.
Take a look at the AVFoundation framework, now available in 10.7; AVPlayer looks like a good start.
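A minimal AVPlayer sketch for a layer-backed NSView; the container view outlet and the file paths are hypothetical:

#import <Cocoa/Cocoa.h>
#import <AVFoundation/AVFoundation.h>
#import <QuartzCore/QuartzCore.h>

// containerView is the NSView you want to embed the player in (hypothetical outlet)
NSView *containerView = self.playerContainerView;
containerView.wantsLayer = YES; // the view must be layer-backed

AVPlayer *player = [AVPlayer playerWithURL:[NSURL fileURLWithPath:@"/path/to/movie.mov"]];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
playerLayer.frame = containerView.bounds;
playerLayer.autoresizingMask = kCALayerWidthSizable | kCALayerHeightSizable;
[containerView.layer addSublayer:playerLayer];
[player play];

// Play a different file later by swapping the player item:
NSURL *nextURL = [NSURL fileURLWithPath:@"/path/to/other.mp4"];
[player replaceCurrentItemWithPlayerItem:[AVPlayerItem playerItemWithURL:nextURL]];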

Play video files on iPad

I am looking for some sample code to help me learn how to play movie files (.mp4) on the iPad.
Can anyone help? I can only find an iPhone sample, which doesn't work.
See the AVFoundation framework, particularly AVPlayer. There’s also a higher-level class called MPMoviePlayerController that offers simple video playback (it’s easier to use but cannot be customized).
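A minimal MPMoviePlayerController sketch; the view controller name and the bundled file name are hypothetical:

#import <UIKit/UIKit.h>
#import <MediaPlayer/MediaPlayer.h>

@interface MovieViewController : UIViewController
// Keep a strong reference so the player isn't deallocated mid-playback
@property (nonatomic, strong) MPMoviePlayerController *moviePlayer;
@end

@implementation MovieViewController
- (void)viewDidLoad {
    [super viewDidLoad];
    // Hypothetical bundled movie; replace with your own .mp4
    NSURL *movieURL = [[NSBundle mainBundle] URLForResource:@"sample" withExtension:@"mp4"];
    self.moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:movieURL];
    self.moviePlayer.view.frame = self.view.bounds;
    [self.view addSubview:self.moviePlayer.view];
    [self.moviePlayer prepareToPlay];
    [self.moviePlayer play];
}
@end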