AVAssetTrack: combining videos not working in iOS 7 - objective-c

I can successfully combine 2 videos into 1 on iOS 6, but I don't know what happens on iOS 7. I get an array error like this:
Terminating app due to uncaught exception 'NSRangeException', reason: '*** -[__NSArrayM objectAtIndex:]: index 0 beyond bounds for empty array'
This is part of the code I wrote. I have marked where the error occurs. What should I do?
NSMutableArray *array = [[NSMutableArray alloc] init];
for (int kk = 0; kk < trackRecordingVideoName + 1; kk++)
{
    [array addObject:[[self userPath] stringByAppendingPathComponent:[NSString stringWithFormat:@"%@%d.mp4", recordingVideoName, kk]]];
    NSLog(@"%d is added", kk);
}
videoPathArray = [[NSArray alloc] initWithArray:array];
[videoPathArray retain];
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionVideoTrack = [composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
CMTime startTime = kCMTimeZero;
NSLog(@"videoPathArray.count is %d", videoPathArray.count);
for (NSInteger i = 0; i < videoPathArray.count; i++) {
    NSLog(@"For loop now is %d and name is %@", i, [videoPathArray objectAtIndex:i]);
    NSString *path = (NSString *)[videoPathArray objectAtIndex:i];
    NSURL *url = [[NSURL alloc] initFileURLWithPath:path];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
    [url release];
    //*************************************
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]; // This is the error
    //*************************************
    if (i == 0)
    {
        [compositionVideoTrack setPreferredTransform:videoTrack.preferredTransform];
    }
    Boolean ok = [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, [asset duration]) ofTrack:videoTrack atTime:startTime error:nil];
    if (ok)
    { NSLog(@"can combine in for loop"); }
    else { NSLog(@"cannot combine in for loop"); }
    startTime = CMTimeAdd(startTime, [asset duration]);
}

Check whether the URL is right; it should point to a real video file. Your AVURLAsset doesn't contain any video media type.
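To make that check concrete, here is a minimal defensive sketch of the loop body from the question. The file-existence test and the empty-array guard are my additions, not part of the original code; on iOS 7 it may also help to load the asset's "tracks" key with loadValuesAsynchronouslyForKeys: before reading it.

NSString *path = (NSString *)[videoPathArray objectAtIndex:i];
if (![[NSFileManager defaultManager] fileExistsAtPath:path]) {
    NSLog(@"no file at %@ - skipping", path); // wrong path, or the clip was never written
    continue;
}
NSURL *url = [NSURL fileURLWithPath:path];
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
if ([videoTracks count] == 0) {
    NSLog(@"asset at %@ exposes no video track - skipping", path);
    continue;
}
AVAssetTrack *videoTrack = [videoTracks objectAtIndex:0]; // safe: array is non-empty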

Related

AVAsset is not working with HLS streaming file

I am trying to play an m3u8 file and an mp3 file simultaneously:
NSURL *audioURL = [NSURL URLWithString:@"https://XXX.de/XXX.mp3"];
AVAsset *audioAsset = [AVAsset assetWithURL:audioURL];
NSURL *videoURL = [NSURL URLWithString:@"https://XXX.de/XXX.m3u8"];
AVAsset *videoAsset = [AVAsset assetWithURL:videoURL];
NSError *error;
AVMutableComposition *mixAsset = [[AVMutableComposition alloc] init];
AVMutableCompositionTrack *audioTrack = [mixAsset addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, audioAsset.duration) ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:&error];
AVMutableCompositionTrack *videoTrack = [mixAsset addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:&error];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:mixAsset];
movie = [AVPlayer playerWithPlayerItem:playerItem];
But it is not working. I get the following error:
*** -[__NSArrayM objectAtIndex:]: index 0 beyond bounds for empty array
It works with mp4 files, but not with the m3u8 file.
The Apple docs say that AVAsset works with local or remote URLs:
https://developer.apple.com/documentation/avfoundation/avasset?language=objc
I wonder if m3u8 is basically not supported.
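Whatever the root cause, guarding the track lookups would make the failure visible instead of crashing; a minimal sketch of my own, using the same audioAsset and videoAsset as above:

NSArray *audioTracks = [audioAsset tracksWithMediaType:AVMediaTypeAudio];
NSArray *videoTracks = [videoAsset tracksWithMediaType:AVMediaTypeVideo];
if ([audioTracks count] == 0 || [videoTracks count] == 0) {
    // One of the assets vends no tracks - for the m3u8 this is the empty
    // array that makes objectAtIndex:0 throw NSRangeException.
    NSLog(@"audio tracks: %lu, video tracks: %lu",
          (unsigned long)[audioTracks count], (unsigned long)[videoTracks count]);
    return;
}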

How to merge audio into our video file

I have an application in which I need to merge an audio file into the video recorded by an AVCaptureSession, so that both the audio of the recorded movie and the merged audio can be heard.
I am able to merge the audio into the video with AVComposition and it works fine. But the problem is that the original audio cannot be heard. Here is my code:
NSString *resourceAudioName = [NSString stringWithFormat:@"%@_audio", getTitle];
NSURL *audio_inputFileUrl = [[NSBundle mainBundle] URLForResource:resourceAudioName withExtension:@"mp3"];
NSString *video_inputFilePath = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
video_inputFilePath = [video_inputFilePath stringByAppendingPathComponent:@"movie1.mp4"];
self.outputFilePath = [NSHomeDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"/Documents/OutPutMovie-%@.mp4", [NSDate date]]];
NSURL *outputFileUrl = [NSURL fileURLWithPath:self.outputFilePath];
if (audio_inputFileUrl) {
    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    NSURL *video_inputFileUrl = [NSURL fileURLWithPath:video_inputFilePath];
    CMTime nextClipStartTime = kCMTimeZero;
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:video_inputFileUrl options:nil];
    NSArray *videoAssetTracks = [videoAsset tracksWithMediaType:AVMediaTypeVideo];
    AVAssetTrack *videoAssetTrack = ([videoAssetTracks count] > 0 ? [videoAssetTracks objectAtIndex:0] : nil);
    CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero, videoAsset.duration);
    AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:videoAssetTrack atTime:nextClipStartTime error:nil];
    AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audio_inputFileUrl options:nil];
    NSArray *audioAssetTracks = [audioAsset tracksWithMediaType:AVMediaTypeAudio];
    AVAssetTrack *audioAssetTrack = ([audioAssetTracks count] > 0 ? [audioAssetTracks objectAtIndex:0] : nil);
    CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);
    AVMutableCompositionTrack *b_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [b_compositionAudioTrack insertTimeRange:audio_timeRange ofTrack:audioAssetTrack atTime:nextClipStartTime error:nil];
    AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
    _assetExport.outputFileType = @"com.apple.quicktime-movie";
    _assetExport.outputURL = outputFileUrl;
    [_assetExport exportAsynchronouslyWithCompletionHandler:
     ^(void) {
         if (AVAssetExportSessionStatusCompleted == _assetExport.status) {
             [videoAsset release];
             [audioAsset release];
             [_assetExport release];
             [self performSelectorOnMainThread:@selector(moveNextView) withObject:nil waitUntilDone:YES];
         }
     }];
}
- (void)mergeAndSave
{
    // Create an AVMutableComposition object which will hold our multiple
    // AVMutableCompositionTracks - that is, our video and audio files.
    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    // First load your audio file using AVURLAsset. Make sure you give the correct path.
    NSURL *audio_url = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"kick" ofType:@"mp3"]];
    self.audioAsset = [[AVURLAsset alloc] initWithURL:audio_url options:nil];
    CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, self.audioAsset.duration);
    // Create the first AVMutableCompositionTrack containing our audio and add it to the composition.
    AVMutableCompositionTrack *b_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [b_compositionAudioTrack insertTimeRange:audio_timeRange ofTrack:[[self.audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:kCMTimeZero error:nil];
    // Now load the video file.
    NSURL *video_url = mediaUrl;
    self.videoAsset = [[AVURLAsset alloc] initWithURL:video_url options:nil];
    CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero, self.audioAsset.duration);
    // Create the second AVMutableCompositionTrack containing our video and add it to the composition.
    AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[self.videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:nil];
    // Decide the path where you want to store the final merged video.
    NSArray *dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *docsDir = [dirPaths objectAtIndex:0];
    NSString *outputFilePath = [docsDir stringByAppendingPathComponent:@"video.mov"];
    NSURL *outputFileUrl = [NSURL fileURLWithPath:outputFilePath];
    if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
        [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];
    // Create an AVAssetExportSession that will save the final video at the specified path.
    AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
    _assetExport.outputFileType = @"com.apple.quicktime-movie";
    _assetExport.outputURL = outputFileUrl;
    [_assetExport exportAsynchronouslyWithCompletionHandler:
     ^(void) {
         dispatch_async(dispatch_get_main_queue(), ^{
             [self exportDidFinish:_assetExport];
         });
     }];
}

- (void)exportDidFinish:(AVAssetExportSession *)session
{
    if (session.status == AVAssetExportSessionStatusCompleted) {
        NSURL *outputURL = session.outputURL;
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputURL]) {
            [library writeVideoAtPathToSavedPhotosAlbum:outputURL
                                        completionBlock:^(NSURL *assetURL, NSError *error) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    if (error) {
                        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error" message:@"Video Saving Failed" delegate:nil cancelButtonTitle:@"Ok" otherButtonTitles:nil];
                        [alert show];
                    } else {
                        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Video Saved" message:@"Saved To Photo Album" delegate:self cancelButtonTitle:@"Ok" otherButtonTitles:nil];
                        [alert show];
                        //[self loadMoviePlayer:outputURL];
                    }
                });
            }];
        }
    }
    self.audioAsset = nil;
    self.videoAsset = nil;
    //[activityView stopAnimating];
    //[activityView setHidden:YES];
}
Try this.
I believe you have to use AVMutableAudioMix in order to mix more than one audio track. With your approach, only the audioAsset gets added to the composition. There is a WWDC 2010 video about this, which explains how to do it. I have tried to implement it here, without success. Hopefully someone can help us fix it.
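For what it's worth, the usual shape of that approach looks like the sketch below: add the video's own audio track to the composition alongside the mp3, then attach an AVMutableAudioMix to the export. The names (mixComposition, videoAsset, _assetExport) are taken from the question; the volume value is a placeholder.

// Add the video's own audio track as a second composition track,
// so both it and the mp3 are present in the mix.
NSArray *movieAudioTracks = [videoAsset tracksWithMediaType:AVMediaTypeAudio];
if ([movieAudioTracks count] > 0) {
    AVMutableCompositionTrack *movieAudio =
        [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                    preferredTrackID:kCMPersistentTrackID_Invalid];
    [movieAudio insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                        ofTrack:[movieAudioTracks objectAtIndex:0]
                         atTime:kCMTimeZero error:nil];

    // Per-track volume via AVMutableAudioMix.
    AVMutableAudioMixInputParameters *params =
        [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:movieAudio];
    [params setVolume:1.0 atTime:kCMTimeZero]; // placeholder volume

    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    audioMix.inputParameters = [NSArray arrayWithObject:params];

    // Attach to the export session (or to an AVPlayerItem's audioMix for playback).
    _assetExport.audioMix = audioMix;
}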

Unable to run the code on device

I am trying to merge videos and the code runs fine on the simulator. I am able to merge videos, but when I run the same code on a device it gives me this exception:
Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[__NSArrayM insertObject:atIndex:]: object cannot be nil'
The code is here:
AVMutableComposition *mixComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
NSError *error = nil;
NSMutableArray *timeRanges = [NSMutableArray arrayWithCapacity:videoClipPaths.count];
NSMutableArray *tracks = [NSMutableArray arrayWithCapacity:videoClipPaths.count];
for (int i = 0; i < [videoClipPaths count]; i++) {
    AVURLAsset *assetClip = [AVURLAsset URLAssetWithURL:[videoClipPaths objectAtIndex:i] options:nil];
    AVAssetTrack *clipVideoTrackB = [[assetClip tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    [timeRanges addObject:[NSValue valueWithCMTimeRange:CMTimeRangeMake(kCMTimeZero, assetClip.duration)]];
    [tracks addObject:clipVideoTrackB];
}
NSLog(@"HELLO: %@", timeRanges);
[compositionTrack insertTimeRanges:timeRanges ofTracks:tracks atTime:kCMTimeZero error:&error];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
// NSParameterAssert(exporter != nil);
NSArray *t;
NSString *u;
t = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
u = [t objectAtIndex:0];
NSString *finalPath = [u stringByAppendingPathComponent:@"final.mov"];
NSURL *lastURL = [NSURL fileURLWithPath:finalPath];
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.outputURL = lastURL;
[exporter exportAsynchronouslyWithCompletionHandler:^(void) {
    switch (exporter.status) {
        case AVAssetExportSessionStatusFailed:
            NSLog(@"exporting failed");
            [SVProgressHUD dismiss];
            break;
        case AVAssetExportSessionStatusCompleted:
            NSLog(@"exporting completed");
            UISaveVideoAtPathToSavedPhotosAlbum(finalPath, self, nil, NULL);
            [SVProgressHUD dismiss];
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"export cancelled");
            break;
    }
}];
Most likely your app is using some file that is resident on your Mac but that you have not installed, and properly addressed, on the phone.
But, in any event, log "tracks" as well as "timeRanges".
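A sketch of that diagnostic (my addition, using the names from the question), which also skips clips whose video track is missing instead of handing nil to the arrays:

for (int i = 0; i < [videoClipPaths count]; i++) {
    AVURLAsset *assetClip = [AVURLAsset URLAssetWithURL:[videoClipPaths objectAtIndex:i] options:nil];
    NSArray *clipVideoTracks = [assetClip tracksWithMediaType:AVMediaTypeVideo];
    if ([clipVideoTracks count] == 0) {
        // On the device the file may simply not exist at that URL.
        NSLog(@"clip %d has no video track: %@", i, [videoClipPaths objectAtIndex:i]);
        continue;
    }
    [timeRanges addObject:[NSValue valueWithCMTimeRange:CMTimeRangeMake(kCMTimeZero, assetClip.duration)]];
    [tracks addObject:[clipVideoTracks objectAtIndex:0]];
}
NSLog(@"timeRanges: %@", timeRanges);
NSLog(@"tracks: %@", tracks);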

Concatenating two videos in iOS

I am trying to merge two videos, but it always throws this exception:
-[NSURL tracksWithMediaType:]: unrecognized selector sent to instance 0x935cf10
2012-08-09 16:26:59.492 videoTest[3920:17903] Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[NSURL tracksWithMediaType:]:
Here is the code:
AVMutableComposition *mixComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *compositionTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
NSError *error = nil;
NSMutableArray *timeRanges = [NSMutableArray arrayWithCapacity:videoClipPaths.count];
NSMutableArray *tracks = [NSMutableArray arrayWithCapacity:videoClipPaths.count];
for (int i = 0; i < [videoClipPaths count]; i++) {
    AVURLAsset *assetClip = [videoClipPaths objectAtIndex:i];
    AVAssetTrack *clipVideoTrackB = [[assetClip tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    [timeRanges addObject:[NSValue valueWithCMTimeRange:CMTimeRangeMake(kCMTimeZero, assetClip.duration)]];
    [tracks addObject:clipVideoTrackB];
}
[compositionTrack insertTimeRanges:timeRanges ofTracks:tracks atTime:kCMTimeZero error:&error];
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPreset1280x720];
NSParameterAssert(exporter != nil);
NSArray *t;
NSString *u;
t = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
u = [t objectAtIndex:0];
NSString *finalPath = [u stringByAppendingPathComponent:@"final.mov"];
NSURL *lastURL = [NSURL fileURLWithPath:finalPath];
exporter.outputFileType = AVFileTypeQuickTimeMovie;
exporter.outputURL = lastURL;
[exporter exportAsynchronouslyWithCompletionHandler:^(void) {
    switch (exporter.status) {
        case AVAssetExportSessionStatusFailed:
            NSLog(@"exporting failed");
            break;
        case AVAssetExportSessionStatusCompleted:
            NSLog(@"exporting completed");
            //UISaveVideoAtPathToSavedPhotosAlbum(filePath, self, nil, NULL);
            break;
        case AVAssetExportSessionStatusCancelled:
            NSLog(@"export cancelled");
            break;
    }
}];
I fixed it by replacing this code:
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPreset1280x720];
NSParameterAssert(exporter != nil);
with:
AVAssetExportSession *exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
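A plausible reason that swap helps (an assumption of mine; the answer doesn't explain it) is that AVAssetExportPreset1280x720 was not compatible with the composition, so the initializer returned nil and the NSParameterAssert fired. Compatibility can be checked up front:

// Pre-flight check, with mixComposition as in the question.
NSArray *presets = [AVAssetExportSession exportPresetsCompatibleWithAsset:mixComposition];
NSLog(@"compatible presets: %@", presets);
if ([presets containsObject:AVAssetExportPreset1280x720]) {
    // the 720p preset is safe to use for this asset
}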
assetClip is declared as an AVURLAsset, but it looks like you're assigning an NSURL object to it. Then you call tracksWithMediaType: on it, which is a method NSURL doesn't have. That's why you're getting "unrecognized selector".
I'm not very familiar with this particular family of classes, but it might fix your problem if you replace
AVURLAsset *assetClip = [videoClipPaths objectAtIndex:i];
with
AVURLAsset *assetClip = [AVURLAsset URLAssetWithURL:[videoClipPaths objectAtIndex:i] options:nil];

Integrate a selected music file into video

I am developing a video recording application and I would like to be able to integrate a music file selected by the user from the iPod library. Please share your input on how I can achieve this requirement. Sample code would be helpful.
Finally succeeded in integrating a selected music file into the video.
Using AVAssetExportSession and AVMutableComposition we can merge video and audio together.
Thanks for the updates, all of you!!
// This method merges the audio and video.
- (void)mergeAudioAtUrl:(NSURL *)audioUrl withVideoAtUrl:(NSURL *)videoUrl toUrl:(NSURL *)outputUrl
{
    //_imageCaptureCount = [_imagesArray count]*100;
    AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:audioUrl options:nil];
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:videoUrl options:nil];
    AVMutableComposition *mixComposition = [AVMutableComposition composition];
    if ([[audioAsset tracksWithMediaType:AVMediaTypeAudio] count])
    {
        AVMutableCompositionTrack *compositionCommentaryTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                                             preferredTrackID:kCMPersistentTrackID_Invalid];
        [compositionCommentaryTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                            ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                                             atTime:kCMTimeZero error:nil];
    }
    AVMutableCompositionTrack *compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                                   preferredTrackID:kCMPersistentTrackID_Invalid];
    [compositionVideoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, videoAsset.duration)
                                   ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                                    atTime:kCMTimeZero error:nil];
    [audioAsset release];
    [videoAsset release];
    AVAssetExportSession *_assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                                          presetName:AVAssetExportPresetPassthrough];
    NSURL *exportUrl = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/Documents/%@Video.mp4", NSHomeDirectory(), pcNameString]];
    if ([[NSFileManager defaultManager] fileExistsAtPath:[NSString stringWithFormat:@"%@/Documents/%@Video.mp4", NSHomeDirectory(), pcNameString]])
    {
        [[NSFileManager defaultManager] removeItemAtPath:[NSString stringWithFormat:@"%@/Documents/%@Video.mp4", NSHomeDirectory(), pcNameString] error:nil];
    }
    _assetExport.outputFileType = @"com.apple.quicktime-movie"; //com.apple.m4v-video
    _assetExport.outputURL = exportUrl;
    _assetExport.shouldOptimizeForNetworkUse = YES;
    [_assetExport exportAsynchronouslyWithCompletionHandler:
     ^(void) {
         // your completion code here
         // NSLog(@"completed");
         removeProgresBarFlag = YES;
         /* NSString *savedVideoFilePath = [NSString stringWithFormat:@"%@/Documents/PC%d.mp4", NSHomeDirectory(), [videosListArray count]];
         if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(savedVideoFilePath))
         {
             [[UIApplication sharedApplication] beginIgnoringInteractionEvents];
             UISaveVideoAtPathToSavedPhotosAlbum(savedVideoFilePath, self, nil, nil);
             [[UIApplication sharedApplication] endIgnoringInteractionEvents];
         } */
         NSMutableDictionary *videoDetailDict = [[NSMutableDictionary alloc] initWithCapacity:0];
         NSStringEncoding encoding;
         NSError *error;
         NSString *persistentID;
         NSFileManager *fileManager = [NSFileManager defaultManager];
         if ([fileManager fileExistsAtPath:PRESENTSONGS_FILE_PATH])
             persistentID = [NSString stringWithContentsOfFile:PRESENTSONGS_FILE_PATH usedEncoding:&encoding error:&error];
         else
             persistentID = @"";
         [videoDetailDict setObject:persistentID forKey:KSong];
         if ([fileManager fileExistsAtPath:PRESENTIMAGES_FILE])
         {
             NSMutableArray *currentImagesArray = [[NSMutableArray alloc] initWithContentsOfFile:IMAGESDATA_FILE_PATH];
             NSMutableArray *durationArray = [[NSMutableArray alloc] initWithContentsOfFile:[NSString stringWithFormat:@"%@/videoduration.plist", [[NSBundle mainBundle] resourcePath]]];
             [videoDetailDict setObject:[durationArray objectAtIndex:[currentImagesArray count]-1] forKey:KfileSize];
             [durationArray release];
             [videoDetailDict setObject:currentImagesArray forKey:KImagesList];
             if ([fileManager fileExistsAtPath:TEMPVIDEO_FILE_PATH])
                 [fileManager removeItemAtPath:TEMPVIDEO_FILE_PATH error:nil];
             NSString *mainPath;
             mainPath = [NSString stringWithFormat:@"%@/Documents/%@File", NSHomeDirectory(), pcNameString];
             if ([fileManager fileExistsAtPath:mainPath])
                 [fileManager removeItemAtPath:mainPath error:nil];
             [fileManager createDirectoryAtPath:mainPath withIntermediateDirectories:NO attributes:nil error:nil];
             for (int i = 0; i < [currentImagesArray count]; i++)
             {
                 [fileManager copyItemAtPath:[NSString stringWithFormat:@"%@%@", PRESENTIMAGES_FILE, [currentImagesArray objectAtIndex:i]] toPath:[NSString stringWithFormat:@"%@/%@", mainPath, [currentImagesArray objectAtIndex:i]] error:nil];
             }
             [currentImagesArray release];
         }
         if ([fileManager fileExistsAtPath:KMESSAGE_FILEPATH])
         {
             NSMutableDictionary *currentMessageDictioanry = [[NSMutableDictionary alloc] initWithContentsOfFile:KMESSAGE_FILEPATH];
             [videoDetailDict setObject:currentMessageDictioanry forKey:Kmessage];
             [currentMessageDictioanry release];
         }
         [videoDetailDict setObject:pcNameString forKey:KPostCardName]; //[NSString stringWithFormat:@"PostCard Video%d", [videosListArray count]]
         //[videosListArray insertObject:videoDetailDict atIndex:0];
         [videosListArray addObject:videoDetailDict];
         [videoDetailDict release];
         [videosListArray writeToFile:VIDEOS_FILE_PATH atomically:YES];
     }];
}