Force MPMoviePlayer Youtube video - objective-c

I'm trying to open an MPMoviePlayer when the web view finishes loading. The problem is that the MPMoviePlayer opens, then closes, and gives me this error:
_itemFailedToPlayToEnd: {
kind = 1;
new = 2;
old = 0;
}
I'm using the following code, which I found by searching on the internet. Why won't it play the YouTube video?
- (void)webViewDidFinishLoad:(UIWebView *)_webView {
    NSURL *movieURL = [NSURL URLWithString:@"http://www.youtube.com/embed/AhE5dsO8xF4"];
    MPMoviePlayerViewController *mp = [[MPMoviePlayerViewController alloc] initWithContentURL:movieURL];
    [self presentMoviePlayerViewControllerAnimated:mp];
    [mp.moviePlayer play];
}
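For what it's worth, MPMoviePlayerController only handles direct media files or HLS streams, so pointing it at a YouTube embed page fails with exactly this _itemFailedToPlayToEnd error. A common workaround is to load the embed URL in a UIWebView instead; a minimal sketch, not the asker's code (the webView outlet is an assumption):

- (void)viewDidLoad {
    [super viewDidLoad];
    // Let the web view render YouTube's own embedded player instead of
    // handing the page URL to MPMoviePlayerController.
    NSString *embedHTML = @"<iframe width=\"300\" height=\"250\" src=\"http://www.youtube.com/embed/AhE5dsO8xF4\" frameborder=\"0\" allowfullscreen></iframe>";
    [self.webView loadHTMLString:embedHTML baseURL:nil];
}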

Related

tvOS AVPlayerViewController: how to display a subtitles tab on the swipe-down menu

In tvOS, in the menu that is displayed when a user swipes down on the remote (the one that shows "Subtitles, Audio & Info" in other movie apps), how can I create another tab with buttons?
Below is my code:
AVMutableMetadataItem *titleMetadataItem = [[AVMutableMetadataItem alloc] init];
titleMetadataItem.locale = NSLocale.currentLocale;
titleMetadataItem.key = AVMetadataCommonKeyTitle;
titleMetadataItem.keySpace = AVMetadataKeySpaceCommon;
titleMetadataItem.value = @"The Title";
NSArray *externalMetadata = [[NSArray alloc] initWithObjects:titleMetadataItem, nil];
_player.player.currentItem.externalMetadata = externalMetadata;
Can someone please tell me how I can create buttons in the swipe-down menu of an AVPlayerViewController so that a user can toggle subtitles on or off? I do not have the SRT files embedded in the video; instead I have a separate subtitle parser and I display the text on a label. I was able to get the Info section to show with text, but is there any way to add buttons?
Or: how can I add a subtitle option to the video?
This does not work:
_player.requiresFullSubtitles = YES;
Thanks!
AVPlayerViewController only loads subtitles if they're embedded in the HLS stream, and that is also the only supported way to get the "Subtitles" tab to appear in the swipe-down menu.
I built a utility for dynamically adding VTT subtitles to HLS (m3u8) streams called ACHLSCaptioningServer (https://github.com/acotilla91/ACHLSCaptioningServer).
Note: if you only have access to SRT files, you'll need to find a way to convert them to VTT.
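If you need such a conversion, the format difference is small: WebVTT adds a WEBVTT header and uses a period instead of a comma as the millisecond separator. A minimal sketch of that conversion (naive, and only meant for simple SRT files):

// Naive SRT -> WebVTT conversion: prepend the WEBVTT header and switch the
// millisecond separator in timestamp lines from "," to "." with a regex, so
// commas inside the subtitle text are left alone.
- (NSString *)vttStringFromSRTString:(NSString *)srt {
    NSRegularExpression *timestamps =
        [NSRegularExpression regularExpressionWithPattern:@"(\\d{2}:\\d{2}:\\d{2}),(\\d{3})"
                                                  options:0
                                                    error:nil];
    NSString *converted = [timestamps stringByReplacingMatchesInString:srt
                                                                options:0
                                                                  range:NSMakeRange(0, srt.length)
                                                           withTemplate:@"$1.$2"];
    return [@"WEBVTT\n\n" stringByAppendingString:converted];
}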
How to use:
NSURL *origStreamURL = your_original_stream_url;
NSURL *vttFileURL = your_vtt_file_url;
NSURL *streamURL = [[ACHLSCaptioningServer sharedInstance] getCaptionedHLSStreamFromStream:origStreamURL vttURL:vttFileURL];
//
// play stream
//
AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:streamURL options:nil];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:videoAsset];
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerViewController *avPlayerController = [[AVPlayerViewController alloc] initWithNibName:nil bundle:nil];
[avPlayerController setPlayer:player];
[avPlayerController.view setFrame:self.view.frame];
[self addChildViewController:avPlayerController];
[self.view addSubview:avPlayerController.view];
[avPlayerController didMoveToParentViewController:self];

Streaming Audio only from a link

I have gone through many YouTube tutorials and looked everywhere. I want to create a live streaming app: when I tap the Play button it should play the streaming audio, and when I tap the Stop button it should stop. Here is my code. On my device no sound actually plays, and I don't know what I'm doing wrong.
I have added #import <AVFoundation/AVFoundation.h> in both the .m and the .h file.
-(IBAction)LiveStream3:(id)sender {
    NSString *stream = @"http://audio2.radioreference.com/il_chicago_police2.mp3";
    NSURL *url = [NSURL URLWithString:stream];
    myAudio = [[AVAudioPlayer alloc] initWithContentsOfURL:url fileTypeHint:stream error:nil];
    myAudio.numberOfLoops = 0;
    [myAudio play];
    [streamTitle setText:@"Chicago"];
    NSLog(@"Playing");
}

-(IBAction)stopAudio:(id)sender {
    [myAudio stop];
    [streamTitle setText:@""];
    NSLog(@"Stop");
}
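One likely issue: AVAudioPlayer is meant for local files, and initWithContentsOfURL: generally won't stream from a remote http URL, so nothing plays. For a remote stream AVPlayer is the usual choice; a minimal sketch under that assumption (the streamPlayer property is made up for illustration):

@property (nonatomic, strong) AVPlayer *streamPlayer; // declared in the interface

-(IBAction)LiveStream3:(id)sender {
    // AVPlayer buffers and plays remote audio URLs directly.
    NSURL *url = [NSURL URLWithString:@"http://audio2.radioreference.com/il_chicago_police2.mp3"];
    self.streamPlayer = [AVPlayer playerWithURL:url];
    [self.streamPlayer play];
    [streamTitle setText:@"Chicago"];
}

-(IBAction)stopAudio:(id)sender {
    [self.streamPlayer pause];
    [streamTitle setText:@""];
}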

AVFoundation - why can't I get the video orientation right

I am using AVCaptureSession to capture video from a device's camera and then using AVAssetWriterInput and AVAssetTrack to compress/resize the video before uploading it to a server. The final videos will be viewed on the web via an HTML5 video element.
I'm running into multiple issues trying to get the orientation of the video correct. My app only supports landscape orientation and all captured videos should be in landscape orientation. However, I would like to allow the user to hold their device in either landscape direction (i.e. home button on either the left or the right hand side).
I am able to make the video preview show in the correct orientation with the following line of code
_previewLayer.connection.videoOrientation = UIDevice.currentDevice.orientation;
The problems start when processing the video via AVAssetWriterInput and friends. The result does not seem to account for the left vs. right landscape mode the video was captured in. IOW, sometimes the video comes out upside down. After some googling I found many people suggesting that the following line of code would solve this issue
writerInput.transform = videoTrack.preferredTransform;
...but this doesn't seem to work. After a bit of debugging I found that videoTrack.preferredTransform is always the same value, regardless of the orientation the video was captured in.
I tried manually tracking what orientation the video was captured in and setting the writerInput.transform to CGAffineTransformMakeRotation(M_PI) as needed. Which solved the problem!!!
...sorta
When I viewed the results on the device this solution worked as expected. Videos were right-side-up regardless of left vs. right orientation while recording. Unfortunately, when I viewed the exact same videos in another browser (chrome on a mac book) they were all upside-down!?!?!?
What am I doing wrong?
EDIT
Here's some code, in case it's helpful...
-(void)compressFile:(NSURL*)inUrl;
{
    NSString* fileName = [@"compressed." stringByAppendingString:inUrl.lastPathComponent];
    NSError* error;
    NSURL* outUrl = [PlatformHelper getFilePath:fileName error:&error];

    NSDictionary* compressionSettings = @{ AVVideoProfileLevelKey: AVVideoProfileLevelH264Main31,
                                           AVVideoAverageBitRateKey: [NSNumber numberWithInt:2500000],
                                           AVVideoMaxKeyFrameIntervalKey: [NSNumber numberWithInt:30] };

    NSDictionary* videoSettings = @{ AVVideoCodecKey: AVVideoCodecH264,
                                     AVVideoWidthKey: [NSNumber numberWithInt:1280],
                                     AVVideoHeightKey: [NSNumber numberWithInt:720],
                                     AVVideoScalingModeKey: AVVideoScalingModeResizeAspectFill,
                                     AVVideoCompressionPropertiesKey: compressionSettings };

    NSDictionary* videoOptions = @{ (id)kCVPixelBufferPixelFormatTypeKey: [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] };

    AVAssetWriterInput* writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
    writerInput.expectsMediaDataInRealTime = YES;

    AVAssetWriter* assetWriter = [AVAssetWriter assetWriterWithURL:outUrl fileType:AVFileTypeMPEG4 error:&error];
    assetWriter.shouldOptimizeForNetworkUse = YES;
    [assetWriter addInput:writerInput];

    AVURLAsset* asset = [AVURLAsset URLAssetWithURL:inUrl options:nil];
    AVAssetTrack* videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    // !!! this line does not work as expected and causes all sorts of issues (videos display sideways in some cases) !!!
    //writerInput.transform = videoTrack.preferredTransform;

    AVAssetReaderTrackOutput* readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:videoOptions];
    AVAssetReader* assetReader = [AVAssetReader assetReaderWithAsset:asset error:&error];
    [assetReader addOutput:readerOutput];

    [assetWriter startWriting];
    [assetWriter startSessionAtSourceTime:kCMTimeZero];
    [assetReader startReading];

    [writerInput requestMediaDataWhenReadyOnQueue:_processingQueue usingBlock:
    ^{
        /* snip */
    }];
}
The problem is that modifying the writerInput.transform property only adds a tag in the video file metadata which instructs the video player to rotate the file during playback. That's why the videos play in the correct orientation on your device (I'm guessing they also play correctly in a Quicktime player as well).
The pixel buffers captured by the camera are still laid out in the orientation in which they were captured. Many video players will not check for the preferred orientation metadata tag and will just play the file in the native pixel orientation.
If you want the user to be able to record video holding the phone in either landscape mode, you need to rectify this at the AVCaptureSession level before compression by performing a transform on the CVPixelBuffer of each video frame. This Apple Q&A covers it (look at the AVCaptureVideoDataOutput documentation as well):
https://developer.apple.com/library/ios/qa/qa1744/_index.html
Investigating the link above is the correct way to solve your problem. An alternate fast n' dirty way to solve the same problem would be to lock the recording UI of your app into only one landscape orientation and then to rotate all of your videos server-side using ffmpeg.
In case it's helpful for anyone, here's the code I ended up with. I ended up having to do the work on the video as it was being captured instead of as a post processing step. This is a helper class that manages the capture.
Interface
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>

@interface VideoCaptureManager : NSObject<AVCaptureVideoDataOutputSampleBufferDelegate>
{
    AVCaptureSession* _captureSession;
    AVCaptureVideoPreviewLayer* _previewLayer;
    AVCaptureVideoDataOutput* _videoOut;
    AVCaptureDevice* _videoDevice;
    AVCaptureDeviceInput* _videoIn;
    dispatch_queue_t _videoProcessingQueue;

    AVAssetWriter* _assetWriter;
    AVAssetWriterInput* _writerInput;

    BOOL _isCapturing;
    NSString* _gameId;
    NSString* _authToken;
}

-(void)setSettings:(NSString*)gameId authToken:(NSString*)authToken;
-(void)setOrientation:(AVCaptureVideoOrientation)orientation;
-(AVCaptureVideoPreviewLayer*)getPreviewLayer;
-(void)startPreview;
-(void)stopPreview;
-(void)startCapture;
-(void)stopCapture;

@end
Implementation (w/ a bit of editing and a few little TODO's)
@implementation VideoCaptureManager

-(id)init;
{
    self = [super init];
    if (self) {
        NSError* error;
        _videoProcessingQueue = dispatch_queue_create("VideoQueue", DISPATCH_QUEUE_SERIAL);
        _captureSession = [AVCaptureSession new];
        _videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

        _previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:_captureSession];
        [_previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];

        _videoOut = [AVCaptureVideoDataOutput new];
        _videoOut.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey: [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] };
        _videoOut.alwaysDiscardsLateVideoFrames = YES;

        _videoIn = [AVCaptureDeviceInput deviceInputWithDevice:_videoDevice error:&error];
        // handle errors here

        [_captureSession addInput:_videoIn];
        [_captureSession addOutput:_videoOut];
    }
    return self;
}

-(void)setOrientation:(AVCaptureVideoOrientation)orientation;
{
    _previewLayer.connection.videoOrientation = orientation;
    for (AVCaptureConnection* item in _videoOut.connections) {
        item.videoOrientation = orientation;
    }
}

-(AVCaptureVideoPreviewLayer*)getPreviewLayer;
{
    return _previewLayer;
}

-(void)startPreview;
{
    [_captureSession startRunning];
}

-(void)stopPreview;
{
    [_captureSession stopRunning];
}

-(void)startCapture;
{
    if (_isCapturing) return;

    NSURL* url = /* put code here to create your output url */ nil;

    NSDictionary* compressionSettings = @{ AVVideoProfileLevelKey: AVVideoProfileLevelH264Main31,
                                           AVVideoAverageBitRateKey: [NSNumber numberWithInt:2500000],
                                           AVVideoMaxKeyFrameIntervalKey: [NSNumber numberWithInt:1],
                                         };

    NSDictionary* videoSettings = @{ AVVideoCodecKey: AVVideoCodecH264,
                                     AVVideoWidthKey: [NSNumber numberWithInt:1280],
                                     AVVideoHeightKey: [NSNumber numberWithInt:720],
                                     AVVideoScalingModeKey: AVVideoScalingModeResizeAspectFill,
                                     AVVideoCompressionPropertiesKey: compressionSettings
                                   };

    _writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
    _writerInput.expectsMediaDataInRealTime = YES;

    NSError* error;
    _assetWriter = [AVAssetWriter assetWriterWithURL:url fileType:AVFileTypeMPEG4 error:&error];
    // handle errors
    _assetWriter.shouldOptimizeForNetworkUse = YES;
    [_assetWriter addInput:_writerInput];

    [_videoOut setSampleBufferDelegate:self queue:_videoProcessingQueue];

    _isCapturing = YES;
}

-(void)stopCapture;
{
    if (!_isCapturing) return;

    [_videoOut setSampleBufferDelegate:nil queue:nil]; // TODO: seems like there could be a race condition between this line and the next (could end up trying to write a buffer after calling writingFinished)

    dispatch_async(_videoProcessingQueue, ^{
        [_assetWriter finishWritingWithCompletionHandler:^{
            [self writingFinished];
        }];
    });
}

-(void)writingFinished;
{
    // TODO: need to check _assetWriter.status to make sure everything completed successfully
    // do whatever post processing you need here
}

-(void)captureOutput:(AVCaptureOutput*)captureOutput didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection;
{
    NSLog(@"Video frame was dropped.");
}

-(void)captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    if (_assetWriter.status != AVAssetWriterStatusWriting) {
        CMTime lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        [_assetWriter startWriting]; // TODO: need to check the return value (a bool)
        [_assetWriter startSessionAtSourceTime:lastSampleTime];
    }

    if (!_writerInput.readyForMoreMediaData || ![_writerInput appendSampleBuffer:sampleBuffer]) {
        NSLog(@"Failed to write video buffer to output.");
    }
}

@end
For compressing/resizing the video, we can use AVAssetExportSession.
We can upload a video of up to 3.30 minutes in duration.
If the video is longer than 3.30 minutes, it will show a memory warning.
As we are not applying any transform to the video here, the video stays as it was recorded.
Below is the sample code for compressing the video.
We can check the video size before compression and after compression.
-(void)trimVideoWithURL:(NSURL *)inputURL {
    NSString *path1 = [inputURL path];
    NSData *data = [[NSFileManager defaultManager] contentsAtPath:path1];
    NSLog(@"size before compress video is %lu", (unsigned long)data.length);

    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPreset640x480];

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *outputURL = paths[0];
    NSFileManager *manager = [NSFileManager defaultManager];
    [manager createDirectoryAtPath:outputURL withIntermediateDirectories:YES attributes:nil error:nil];
    outputURL = [outputURL stringByAppendingPathComponent:@"output.mp4"];
    fullPath = [NSURL URLWithString:outputURL];

    // Remove existing file
    [manager removeItemAtPath:outputURL error:nil];

    exportSession.outputURL = [NSURL fileURLWithPath:outputURL];
    exportSession.shouldOptimizeForNetworkUse = YES;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;

    CMTime start = CMTimeMakeWithSeconds(1.0, 600);
    CMTime duration = CMTimeMakeWithSeconds(1.0, 600);
    CMTimeRange range = CMTimeRangeMake(start, duration);
    exportSession.timeRange = range;

    [exportSession exportAsynchronouslyWithCompletionHandler:^(void)
    {
        switch (exportSession.status) {
            case AVAssetExportSessionStatusCompleted: {
                NSString *path = [fullPath path];
                NSData *data = [[NSFileManager defaultManager] contentsAtPath:path];
                NSLog(@"size after compress video is %lu", (unsigned long)data.length);
                NSLog(@"Export Complete %d %@", exportSession.status, exportSession.error);
                /*
                 Do your necessary stuff here after compression
                 */
            }
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Failed: %@", exportSession.error);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Canceled: %@", exportSession.error);
                break;
            default:
                break;
        }
    }];
}
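For illustration, the method above could be driven like this (the input URL is just an assumption for the sketch):

// Compress/trim a freshly recorded clip and log the before/after sizes.
NSURL *recordedURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"recorded.mov"]];
[self trimVideoWithURL:recordedURL];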

Display Image and play corresponding audio

I'd like to display an image and play the corresponding audio file, but it plays the audio before displaying the image. I can't figure out what is wrong.
-(IBAction)playButton:(id)sender {
    UIImage *imageA = [UIImage imageNamed:@"Image1.png"];
    UIImage *imageB = [UIImage imageNamed:@"Image2.png"];
    int randomAlphaNum = arc4random() % 2;
    NSLog(@"%i", randomAlphaNum);
    switch (randomAlphaNum) {
        case 0:
            imageView.image = imageA;
            for (int i = 1; i <= 5; i++) {
                NSURL *url = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/audioA.wav", [[NSBundle mainBundle] resourcePath]]];
                NSError *error;
                audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
                audioPlayer.volume = 0.5;
                if (audioPlayer == nil)
                    NSLog(@"An audio error occurred: \"%@\"", error);
                else
                {
                    [audioPlayer play];
                }
                sleep(2);
            }
            return;
            break;
    }
}

- (void)viewDidLoad
{
    [super viewDidLoad];
}

- (void)viewDidUnload
{
    [super viewDidUnload];
}
First of all I wouldn't recommend using sleep as it's going to chew up your main thread and make your UI unresponsive. I'd actually be curious if removing it alleviates your issue. There's a chance the app is flying through your code, playing the audio and is hitting the sleep command before the app can display the image on the screen. What happens if you remove sleep and only iterate through the for loop once?
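If you do need the clips to play back to back, one non-blocking approach is to let AVAudioPlayer's delegate start the next clip when the previous one finishes, instead of looping with sleep. A rough sketch under that assumption (playCount and maxPlays are made-up ivars; the controller must adopt AVAudioPlayerDelegate):

- (void)playNextClip {
    // Load the clip from the bundle and start it; the delegate callback below
    // decides whether another pass is needed.
    NSURL *url = [[NSBundle mainBundle] URLForResource:@"audioA" withExtension:@"wav"];
    NSError *error = nil;
    audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
    audioPlayer.delegate = self;
    audioPlayer.volume = 0.5;
    [audioPlayer play];
}

- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag {
    if (++playCount < maxPlays) {
        [self playNextClip]; // keeps the UI responsive between plays
    }
}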

How to play a Vimeo video in iOS?

I've tried searching the web, but I couldn't find a topic less than a year old regarding this problem, therefore:
How can I play a Vimeo video in an iOS App?
EDIT1: When using the solution I'm sometimes getting this HTTP response from Vimeo
Why?
This is my way of playing a Vimeo video inside an app.
I am using an iframe to load the Vimeo video inside my app.
Follow these steps and you can do it too.
Create a UIWebView and connect it to your .h file. Mine is _webView.
Add this method to your .m file.
-(void)embedVimeo {
    NSString *embedHTML = @"<iframe width=\"300\" height=\"250\" src=\"http://www.vimeo.com/embed/rOPI5LDo7mg\" frameborder=\"0\" allowfullscreen></iframe>";
    NSString *html = [NSString stringWithFormat:embedHTML];
    [_webView loadHTMLString:html baseURL:nil];
    [self.view addSubview:_webView];
}
I am using the embed code from the Vimeo video. (I hope you know what that is.)
Call this method inside your viewDidLoad:
[self embedVimeo];
Run the app and you will see the video in your view. This works perfectly for me and I think it will help you too.
You can use YTVimeoExtractor; it works fine for me.
You can use this code
NSString *htmlStringToLoad = [NSString stringWithFormat:@"http://player.vimeo.com/video/%@?title=0&byline=0&portrait=0\%%22%%20width=\%%22%0.0f\%%22%%20height=\%%22%0.0f\%%22%%20frameborder=\%%230\%%22", videoID];
[aWebView loadRequest:[NSURLRequest requestWithURL:[NSURL URLWithString:htmlStringToLoad]]];
Please try this. It works for me, with just a few lines of code.
- (void)viewDidLoad
{
    [super viewDidLoad];
    vimeoHelper = [[VimeoHelper alloc] init];
    [vimeoHelper getVimeoRedirectUrlWithUrl:@"http://vimeo.com/52760742" delegate:(id)self];
}

- (void)finishedGetVimeoURL:(NSString *)url
{
    _moviePlayerController = [[MPMoviePlayerViewController alloc] initWithContentURL:[NSURL URLWithString:url]];
    [self presentViewController:_moviePlayerController animated:NO completion:nil];
}
Use the code below and it will work fine:
NSMutableString *html = [[NSMutableString alloc] initWithCapacity:1];
[html appendString:@"<html><head>"];
[html appendString:@"<style type=\"text/css\">"];
[html appendString:@"body {"];
[html appendString:@"background-color: transparent;"];
[html appendString:@"color: white;"];
[html appendString:@"}"];
[html appendString:@"</style>"];
[html appendString:@"</head><body style=\"margin:0\">"];
[html appendString:@"<iframe src=\"//player.vimeo.com/video/84403700?autoplay=1&loop=1\" width=\"1024\" height=\"768\" frameborder=\"0\" webkitallowfullscreen mozallowfullscreen allowfullscreen></iframe>"];
[html appendString:@"</body></html>"];
[viewWeb loadHTMLString:html baseURL:urlMovie];
I've used this code:
NSString *embedSr = @"<iframe width=\"304\" height=\"350\" src=\"http://player.vimeo.com/video/... \" frameborder=\"0\" allowfullscreen></iframe>";
[[self WebView] loadHTMLString:embedSr baseURL:nil];
I've tried the universal player; it works on a device with iOS 5 but fails on iOS 4.2 with an iPhone 3G. I don't know why. Here's the link to embed it.
Or you can embed it manually from the Vimeo site: click Embed, and configure the options as you wish.