AVPlayer Dynamic Volume control - objective-c

How can I change the volume of an AVPlayer dynamically? I mean, I want to mute the volume every time a button is pressed, but the code below only sets the volume once, when the player item is created. How do I change it at runtime?
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[self myAssetURL] options:nil];
NSArray *audioTracks = [asset tracksWithMediaType:AVMediaTypeAudio];
NSMutableArray *allAudioParams = [NSMutableArray array];
for (AVAssetTrack *track in audioTracks) {
    AVMutableAudioMixInputParameters *audioInputParams = [AVMutableAudioMixInputParameters audioMixInputParameters];
    [audioInputParams setVolume:0.0 atTime:kCMTimeZero];
    [audioInputParams setTrackID:[track trackID]];
    [allAudioParams addObject:audioInputParams];
}
AVMutableAudioMix *audioZeroMix = [AVMutableAudioMix audioMix];
[audioZeroMix setInputParameters:allAudioParams];

AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset];
[playerItem setAudioMix:audioZeroMix];

AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
self.mPlayer = player;
[mPlayer play];

You can send playerItem new instances of AVMutableAudioMix during playback to change levels dynamically. Just link your button to an action method that creates a new AVMutableAudioMix instance (like you have done above) with the appropriate values, and use playerItem's setAudioMix: method to set the new mix values. (If you're working across methods, don't forget to save a reference to your playerItem instance to access it later.)
(N.B. setAudioMix: isn't mentioned explicitly in the AVPlayerItem docs because it is a synthesized setter for the audioMix property.)
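For example, a mute toggle action could look like the following minimal sketch. It assumes the item was stored in a playerItem property and that a muted BOOL property tracks the current state (both names are illustrative, not from the question):

// Minimal sketch of a mute toggle; self.playerItem and self.muted are
// illustrative property names, not from the original question.
- (IBAction)muteButtonPressed:(id)sender {
    self.muted = !self.muted;
    float volume = self.muted ? 0.0f : 1.0f;

    NSMutableArray *allAudioParams = [NSMutableArray array];
    for (AVAssetTrack *track in [[self.playerItem asset] tracksWithMediaType:AVMediaTypeAudio]) {
        AVMutableAudioMixInputParameters *params =
            [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
        [params setVolume:volume atTime:kCMTimeZero];
        [allAudioParams addObject:params];
    }
    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    [audioMix setInputParameters:allAudioParams];
    [self.playerItem setAudioMix:audioMix]; // takes effect immediately during playback
}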

Related

Video sequence in Sprite Kit leads to messy interruptions

I want to create a video sequence in Sprite Kit and use the following code:
@interface VideScreenNode ()
@end

@implementation VideScreenNode

- (void)setupVideoSequence
{
    AVPlayerItem *intro  = [AVPlayerItem playerItemWithURL:[self geturlFromFileName:@"Video1" andType:@"mp4"]];
    AVPlayerItem *video1 = [AVPlayerItem playerItemWithURL:[self geturlFromFileName:@"Video2" andType:@"mp4"]];
    AVPlayerItem *video2 = [AVPlayerItem playerItemWithURL:[self geturlFromFileName:@"Video3" andType:@"mp4"]];
    AVPlayerItem *video3 = [AVPlayerItem playerItemWithURL:[self geturlFromFileName:@"Video4" andType:@"mp4"]];
    AVPlayerItem *video4 = [AVPlayerItem playerItemWithURL:[self geturlFromFileName:@"Video5" andType:@"mp4"]];
    AVPlayerItem *outro  = [AVPlayerItem playerItemWithURL:[self geturlFromFileName:@"Video6" andType:@"mp4"]];
    AVQueuePlayer *queuePlayer = [[AVQueuePlayer alloc] initWithItems:@[intro, video1, video2, video3, video4, outro]];
    SKVideoNode *sequenceNode = [[SKVideoNode alloc] initWithAVPlayer:queuePlayer];
    sequenceNode.position = CGPointMake(512, 384);
    [sequenceNode play];
    [self addChild:sequenceNode];
}

#pragma mark - helper

- (NSURL *)geturlFromFileName:(NSString *)name
                      andType:(NSString *)type
{
    return [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:name ofType:type]];
}

@end
The video sequence works just fine, but there is a small interruption between each clip. How do I achieve a seamless transition?
Thanks in advance.
You want to add in a call on the AVQueuePlayer that you have, such as:
[queuePlayer prerollAtRate:1.0f completionHandler:^(BOOL finished) { /* ... */ }];
This loads the files into memory and prepares them to play. It will have some performance impact, depending on the size of the clips, but it should smooth out the transitions you are seeing. Watch out, though: if the player's status property is not AVPlayerStatusReadyToPlay, the preroll will fail.
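A minimal sketch of the readiness check plus preroll, assuming the queuePlayer variable from the question (in practice you would reach this point from a KVO callback on the status property):

// Sketch: preroll only once the player reports it is ready to play;
// prerolling earlier fails with finished == NO.
if (queuePlayer.status == AVPlayerStatusReadyToPlay) {
    [queuePlayer prerollAtRate:1.0f completionHandler:^(BOOL finished) {
        if (finished) {
            [queuePlayer play]; // buffers are primed, so playback starts cleanly
        }
    }];
}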

iOS: AVPlayerItem to AVAsset

I have two AVAssets, and I apply changes via a videoComposition and an audioMix on an AVPlayerItem. Afterwards, when I take the asset back out of the AVPlayerItem, the videoComposition and audioMix are not applied.
I want the resulting asset to have both the videoComposition and the audioMix applied.
Here's the code.
+ (AVAsset *)InitAsset:(AVAsset *)asset AtTime:(double)start ToTime:(double)end {
    CGFloat colorComponents[4] = {1.0, 1.0, 1.0, 0.0};
    // Create an AVMutableComposition. This object will hold our multiple AVMutableCompositionTracks.
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];
    // Create the first AVMutableCompositionTrack by adding a new track to the AVMutableComposition.
    AVMutableCompositionTrack *masterTrack =
        [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                    preferredTrackID:kCMPersistentTrackID_Invalid];
    // Insert the asset's video into the new track at kCMTimeZero so the video plays from the start of the track.
    // (Note: CMTimeRangeMake takes a start time and a *duration*, not an end time.)
    [masterTrack insertTimeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(start, 1), CMTimeMakeWithSeconds(end, 1))
                         ofTrack:[[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                          atTime:kCMTimeZero
                           error:nil];
    // Each video layer instruction
    AVMutableVideoCompositionLayerInstruction *masterLayerInstruction =
        [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:masterTrack];
    [masterLayerInstruction setOpacity:1.0f atTime:kCMTimeZero];
    [masterLayerInstruction setOpacityRampFromStartOpacity:1.0f
                                              toEndOpacity:0.0
                                                 timeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(end, 1), CMTimeMakeWithSeconds(end + ANIMATION_FADE_TIME, 1))];
    // The AVMutableVideoCompositionInstruction holds the array of AVMutableVideoCompositionLayerInstruction objects.
    // Its time range should be at least as long as the duration of the longest asset.
    AVMutableVideoCompositionInstruction *MainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    [MainInstruction setTimeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(end + ANIMATION_FADE_TIME, 1))];
    [MainInstruction setLayerInstructions:[NSArray arrayWithObjects:masterLayerInstruction, nil]];
    [MainInstruction setBackgroundColor:CGColorCreate(CGColorSpaceCreateDeviceRGB(), colorComponents)];
    // The AVMutableVideoComposition can hold multiple AVMutableVideoCompositionInstructions (only one here).
    // You can use multiple instructions to add layered effects such as fades and transitions,
    // but make sure their time ranges don't overlap.
    AVMutableVideoComposition *MainCompositionInst = [AVMutableVideoComposition videoComposition];
    MainCompositionInst.instructions = [NSArray arrayWithObject:MainInstruction];
    MainCompositionInst.frameDuration = CMTimeMake(1, 30);
    MainCompositionInst.renderSize = CGSizeMake(1280, 720);
    AVMutableCompositionTrack *masterAudio =
        [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                    preferredTrackID:kCMPersistentTrackID_Invalid];
    [masterAudio insertTimeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(end + ANIMATION_FADE_TIME, 1))
                         ofTrack:[[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                          atTime:kCMTimeZero
                           error:nil];
    // Each audio mix input parameter
    AVMutableAudioMixInputParameters *masterAudioMix =
        [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:masterAudio];
    [masterAudioMix setVolume:1.0f atTime:kCMTimeZero];
    [masterAudioMix setVolumeRampFromStartVolume:1.0f
                                     toEndVolume:0.0f
                                       timeRange:CMTimeRangeMake(CMTimeMakeWithSeconds(end, 1), CMTimeMakeWithSeconds(end + ANIMATION_FADE_TIME, 1))];
    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    audioMix.inputParameters = [NSArray arrayWithObjects:masterAudioMix, nil];
    // Finally, add the newly created AVMutableComposition to an AVPlayerItem and play it using AVPlayer.
    AVPlayerItem *item = [AVPlayerItem playerItemWithAsset:mixComposition];
    item.videoComposition = MainCompositionInst;
    item.audioMix = audioMix;
    return [item asset];
}
Does anyone have any idea?
Best regards.
Use AVAssetExportSession...
"An AVAssetExportSession object transcodes the contents of an AVAsset source object to create an output of the form described by a specified export preset."
Use AVAssetExportSession's audioMix and videoComposition properties:
audioMix
Indicates whether non-default audio mixing is enabled for export, and supplies the parameters for audio mixing.
@property(nonatomic, copy) AVAudioMix *audioMix
videoComposition
Indicates whether video composition is enabled for export, and supplies the instructions for video composition.
@property(nonatomic, copy) AVVideoComposition *videoComposition
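As a minimal sketch, the end of the method above could export the composition instead of returning [item asset]; the outputURL here is illustrative, not from the question:

// Sketch: bake the composition and mix into a new file instead of
// returning [item asset]; outputURL is an illustrative file URL.
AVAssetExportSession *exporter =
    [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                     presetName:AVAssetExportPresetHighestQuality];
exporter.videoComposition = MainCompositionInst;
exporter.audioMix = audioMix;
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.outputURL = outputURL;
[exporter exportAsynchronouslyWithCompletionHandler:^{
    if (exporter.status == AVAssetExportSessionStatusCompleted) {
        // The file at outputURL now has the composition and mix rendered in,
        // so an AVURLAsset created from it carries both.
        AVAsset *result = [AVURLAsset URLAssetWithURL:outputURL options:nil];
    }
}];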

AVMutableComposition Black Screen on Replay After 3rd clip

In my app I am attempting to merge video clips dynamically. I have two properties which hold the previous recording and then the next recording to be appended to the end. This is basically acting as a pause function, and then the user can play it back on the fly.
I make the app write the first clip to "video.mp4" in the docs directory. This is set as the previous recording. Then I write the next clip to "video2.mp4", set it to the next recording, and merge them together using AVMutableComposition:
AVMutableComposition *mashVideosTogether = [AVMutableComposition composition];
NSError *error;
if ([mashVideosTogether insertTimeRange:CMTimeRangeMake(kCMTimeZero, [self.previousRecording duration])
                                ofAsset:self.previousRecording
                                 atTime:kCMTimeZero
                                  error:&error]) NSLog(@"Successfully added one");
else NSLog(@"error: %@", error.description);
if ([mashVideosTogether insertTimeRange:CMTimeRangeMake(kCMTimeZero, [self.nextRecording duration])
                                ofAsset:self.nextRecording
                                 atTime:CMTimeAdd(kCMTimeZero, [self.previousRecording duration])
                                  error:&error]) NSLog(@"Success on 2");
else NSLog(@"error: %@", error.description);
This appends the first and second videos. I then export the video to "combined.mp4", and when this is successfully finished I then delete the file at "video.mp4", and export the combined video to "video.mp4" (so at this point the combined video exists in two places). This plays fine in my player. If the user clicks record again, the newly combined video at "video.mp4" is set as the previous recording, and the newly recorded clip is set as next recording, and the whole process is repeated. They are appended and exported all over again to repeat the process.
However, once I add a third (or more) clip, the earlier clips in the created composition go black on playback. Their duration is kept, but there is no video or sound. Basically, any time I create a new composition from an old composition, the old composition plays back blank; the only things preserved are its duration and the newly recorded clip. Is this data lost when the composition is made into another composition? Do I need to manually add them as tracks? Any help is appreciated!
SOLVED
I read over Apple's AVEditDemo, and it seems my original assumption was correct: when I was appending videos together using AVMutableComposition alone (i.e., not creating individual composition tracks and inserting the clips into them), the data for those tracks was lost when the composition was added to another composition.
So I just created individual tracks for audio and video of every clip to merge them and now I have a working setup where I can shoot videos dynamically, stop, then begin shooting again and they will be concatenated on the fly.
if (self.previousRecording && self.nextRecording) {
    NSArray *assetArray = [NSArray arrayWithObjects:
                           self.previousRecording, self.nextRecording, nil];
    NSURL *fileURL = [self getURLWithPathComponent:@"combined.mp4"];
    AVMutableComposition *mashVideosTogether = [AVMutableComposition composition];
    NSError *error;
    CMTime nextClipStartTime = kCMTimeZero;
    AVMutableCompositionTrack *compositionVideoTrack =
        [mashVideosTogether addMutableTrackWithMediaType:AVMediaTypeVideo
                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    AVMutableCompositionTrack *compositionAudioTrack =
        [mashVideosTogether addMutableTrackWithMediaType:AVMediaTypeAudio
                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    for (int i = 0; i < [assetArray count]; i++) {
        AVURLAsset *asset = [assetArray objectAtIndex:i];
        CMTimeRange timeRangeInAsset = CMTimeRangeMake(kCMTimeZero, [asset duration]);
        AVAssetTrack *clipVideoTrack =
            [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
        [compositionVideoTrack insertTimeRange:timeRangeInAsset
                                       ofTrack:clipVideoTrack atTime:nextClipStartTime error:nil];
        AVAssetTrack *clipAudioTrack =
            [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
        [compositionAudioTrack insertTimeRange:timeRangeInAsset
                                       ofTrack:clipAudioTrack atTime:nextClipStartTime error:nil];
        nextClipStartTime = CMTimeAdd(nextClipStartTime, timeRangeInAsset.duration);
    }
    //do exports down here and then reset previous recording, etc.
}
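The export step hinted at in the final comment might look like the sketch below. The preset choice and the cleanup comments are assumptions, not the poster's actual code; fileURL and mashVideosTogether come from the block above:

// Sketch of the export referenced by the "do exports down here" comment.
AVAssetExportSession *exporter =
    [[AVAssetExportSession alloc] initWithAsset:mashVideosTogether
                                     presetName:AVAssetExportPresetHighestQuality];
exporter.outputURL = fileURL;
exporter.outputFileType = AVFileTypeMPEG4;
[exporter exportAsynchronouslyWithCompletionHandler:^{
    if (exporter.status == AVAssetExportSessionStatusCompleted) {
        // Overwrite video.mp4 with the combined clip, then reset
        // previousRecording / nextRecording for the next segment.
    }
}];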

Compositing 2 videos on top of each other with alpha

AVFoundation allows you to "compose" 2 assets (2 videos) as 2 "tracks", just like in Final Cut Pro, for example.
The theory says I can have 2 videos on top of each other, with alpha, and see both.
Either I'm doing something wrong, or there's a bug somewhere, because the following test code, although a bit messy, should clearly show 2 videos, yet I only see one, as seen here: http://lockerz.com/s/172403384 -- the "blue" square is IMG_1388.m4v.
For whatever reason, IMG_1383.MOV is never shown.
NSError *error = nil;
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES], AVURLAssetPreferPreciseDurationAndTimingKey, nil];
AVMutableComposition *composition = [AVMutableComposition composition];
CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(4, 1));
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];

// Track B
NSURL *urlVideo2 = [NSURL URLWithString:@"file://localhost/Users/me/Movies/Temp/IMG_1388.m4v"];
AVAsset *video2 = [AVURLAsset URLAssetWithURL:urlVideo2 options:options];
AVMutableCompositionTrack *videoTrack2 = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:0];
NSArray *videoAssetTracks2 = [video2 tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoAssetTrack2 = ([videoAssetTracks2 count] > 0 ? [videoAssetTracks2 objectAtIndex:0] : nil);
[videoTrack2 insertTimeRange:timeRange ofTrack:videoAssetTrack2 atTime:kCMTimeZero error:&error];
AVMutableVideoCompositionLayerInstruction *to = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoAssetTrack2];
[to setOpacity:.5 atTime:kCMTimeZero];
[to setTransform:CGAffineTransformScale(videoAssetTrack2.preferredTransform, .5, .5) atTime:kCMTimeZero];

// Track A
NSURL *urlVideo = [NSURL URLWithString:@"file://localhost/Users/me/Movies/Temp/IMG_1383.MOV"];
AVURLAsset *video = [AVURLAsset URLAssetWithURL:urlVideo options:options];
AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:1];
NSArray *videoAssetTracks = [video tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoAssetTrack = ([videoAssetTracks count] > 0 ? [videoAssetTracks objectAtIndex:0] : nil);
[videoTrack insertTimeRange:timeRange ofTrack:videoAssetTrack atTime:kCMTimeZero error:nil];
AVMutableVideoCompositionLayerInstruction *from = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoAssetTrack];
[from setOpacity:.5 atTime:kCMTimeZero];

// Video Composition
AVMutableVideoCompositionInstruction *transition = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
transition.backgroundColor = [[UIColor clearColor] CGColor];
transition.timeRange = timeRange;
transition.layerInstructions = [NSArray arrayWithObjects:to, from, nil];
videoComposition.instructions = [NSArray arrayWithObjects:transition, nil];
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.renderSize = CGSizeMake(480, 360);

// Export
NSURL *outputURL = [NSURL URLWithString:@"file://localhost/Users/me/Movies/Temp/export.MOV"];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:[[composition copy] autorelease] presetName:AVAssetExportPresetHighestQuality];
[exportSession setOutputFileType:@"com.apple.quicktime-movie"];
exportSession.outputURL = outputURL;
exportSession.videoComposition = videoComposition;
[exportSession exportAsynchronouslyWithCompletionHandler:nil];

// Player
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:composition];
playerItem.videoComposition = videoComposition;
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
AVPlayerLayer *playerLayer = [AVPlayerLayer playerLayerWithPlayer:player];
The "goal" of this code is to "record" the camera input (video 1) and the opengl output (video 2). I also tried to "compose" them "directly" with Buffers and all that, but I was as well unsuccessful :( Turns out AVFoundation is way less trivial than I thought.
It looks good, except this part:
AVMutableVideoCompositionLayerInstruction *from = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoAssetTrack];
AVMutableVideoCompositionLayerInstruction *to = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoAssetTrack2];
You need to use videoTrack and videoTrack2 to build the layer instructions, i.e., the tracks added to composition, instead of the original assets videoAssetTrack and videoAssetTrack2.
Also, adding a transformation to rotate the video is a bit trickier (like anything in AVFoundation beyond the basics), so I've just commented out that line to make it play the 2 videos.
This is your code with the modifications:
NSError *error = nil;
NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithBool:YES], AVURLAssetPreferPreciseDurationAndTimingKey, nil];
AVMutableComposition *composition = [AVMutableComposition composition];
CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(4, 1));
AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];

// Track B
NSURL *urlVideo2 = [[NSBundle mainBundle] URLForResource:@"b" withExtension:@"mov"];
AVAsset *video2 = [AVURLAsset URLAssetWithURL:urlVideo2 options:options];
AVMutableCompositionTrack *videoTrack2 = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:0];
NSArray *videoAssetTracks2 = [video2 tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoAssetTrack2 = ([videoAssetTracks2 count] > 0 ? [videoAssetTracks2 objectAtIndex:0] : nil);
[videoTrack2 insertTimeRange:timeRange ofTrack:videoAssetTrack2 atTime:kCMTimeZero error:&error];
// Build the layer instruction from the composition track, not the source asset track.
AVMutableVideoCompositionLayerInstruction *to = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack2];
[to setOpacity:.5 atTime:kCMTimeZero];
//[to setTransform:CGAffineTransformScale(videoAssetTrack2.preferredTransform, .5, .5) atTime:kCMTimeZero];

// Track A
NSURL *urlVideo = [[NSBundle mainBundle] URLForResource:@"a" withExtension:@"mov"];
AVURLAsset *video = [AVURLAsset URLAssetWithURL:urlVideo options:options];
AVMutableCompositionTrack *videoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:1];
NSArray *videoAssetTracks = [video tracksWithMediaType:AVMediaTypeVideo];
AVAssetTrack *videoAssetTrack = ([videoAssetTracks count] > 0 ? [videoAssetTracks objectAtIndex:0] : nil);
[videoTrack insertTimeRange:timeRange ofTrack:videoAssetTrack atTime:kCMTimeZero error:nil];
AVMutableVideoCompositionLayerInstruction *from = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
[from setOpacity:.5 atTime:kCMTimeZero];

// Video Composition
AVMutableVideoCompositionInstruction *transition = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
transition.backgroundColor = [[UIColor clearColor] CGColor];
transition.timeRange = timeRange;
transition.layerInstructions = [NSArray arrayWithObjects:to, from, nil];
videoComposition.instructions = [NSArray arrayWithObjects:transition, nil];
videoComposition.frameDuration = CMTimeMake(1, 30);
videoComposition.renderSize = composition.naturalSize; // CGSizeMake(480, 360);
I think you've got it wrong.
A video file may have multiple streams of data. For example, if it's a video with sound, the file will have 2 streams: the audio stream and the video stream. Another example is a surround-sound video file, which may include 5 or more audio streams and 1 video stream.
As with audio, most video container formats (mov, mp4, etc.) support multiple video streams in one file, but this doesn't mean that the streams have any relation to each other; they are just stored in the same container. If you open such a file with QuickTime, for example, you will get as many windows as there are video streams in the file.
Anyhow, the video streams will not get 'mixed' this way.
What you're trying to achieve is related to signal processing of the video stream, and I really recommend reading more about it.
If you don't really need to 'mix' the video data together into a file, you might want to display both videos on top of each other using MPMediaPlayers. Keep in mind that dealing with video data is usually a CPU-intensive problem, which you might (sometimes) not be able to solve using today's iOS devices.
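For the overlay route, here is a minimal sketch using two AVPlayerLayer instances rather than movie player controllers (only one MPMoviePlayerController can play at a time, so AVPlayerLayer is the safer choice here); urlVideoA and urlVideoB are illustrative:

// Sketch: overlay two videos at playback time instead of mixing them.
// urlVideoA/urlVideoB are illustrative file URLs, not from the question.
AVPlayer *playerA = [AVPlayer playerWithURL:urlVideoA];
AVPlayer *playerB = [AVPlayer playerWithURL:urlVideoB];

AVPlayerLayer *layerA = [AVPlayerLayer playerLayerWithPlayer:playerA];
AVPlayerLayer *layerB = [AVPlayerLayer playerLayerWithPlayer:playerB];
layerA.frame = self.view.bounds;
layerB.frame = self.view.bounds;
layerB.opacity = 0.5f; // make the top layer semi-transparent

[self.view.layer addSublayer:layerA];
[self.view.layer addSublayer:layerB];
[playerA play];
[playerB play];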

Change video url for MPMoviePlayerController instance rather than allocating new one

I have an MPMoviePlayerController named myMoviePlayer; I allocate and initialize it when my app loads:
NSString *moviePath = [bundle pathForResource:[movieName uppercaseString] ofType:@"mov" inDirectory:@"Videos"];
if (moviePath)
{
    NSURL *movieURL = [NSURL fileURLWithPath:moviePath];
    myMoviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:movieURL];
    [myUI.view setFrame:CGRectMake(80, 80, 600, 350)];
    [self.view addSubview:myMoviePlayer.view];
    myMoviePlayer.shouldAutoplay = NO;
}
There are two views in my app named imageView and videoView. I need to hide myMoviePlayer in imageView and display it again when my UI view is videoView.
Each time I show a movie, movieName will be different.
Right now, I am allocating and initializing myVideoPlayer each time my view changes to the movie view. Is it possible to set a new video url to myMoviePlayer without allocating it again?
Yes, there is:
[myMoviePlayer setContentURL:[NSURL URLWithString:aMovieUrl]];
Just set the contentURL property of the MPMoviePlayerController instance.
Sharmain, I got your problem...
You need to set the contentURL and then call the play method of MPMoviePlayerController:
[myPlayer setContentURL:xyz];
[myPlayer play];
Enjoy!
NSString *path = [[NSBundle mainBundle] pathForResource:@"myVideo" ofType:@"mp4"];
self.myPlayer = [[MPMoviePlayerController alloc] init];
self.myPlayer.view.frame = CGRectMake(0, 124, 768, 900);
self.myPlayer.shouldAutoplay = YES;
self.myPlayer.controlStyle = MPMovieControlStyleNone;
self.myPlayer.repeatMode = MPMovieRepeatModeOne;
self.myPlayer.fullscreen = YES;
self.myPlayer.movieSourceType = MPMovieSourceTypeFile;
self.myPlayer.scalingMode = MPMovieScalingModeAspectFit;
[self.view addSubview:myPlayer.view];
[myPlayer setContentURL:[NSURL fileURLWithPath:path]];
[myPlayer play];