Getting progress information continuously with AVAssetExportSession - objective-c

I have an issue where I'm using AVAssetExportSession to do some video conversions.
I've been trying to show the export progress in a UIProgressView, but I can't seem to achieve that with this code. Could I get suggestions on how I might be able to do this?
(Look at the commented-out code; that's how I'm updating the progress bar, but it doesn't work well.)
NSArray *compatiblePresets = [AVAssetExportSession exportPresetsCompatibleWithAsset:composition];
if ([compatiblePresets containsObject:AVAssetExportPresetHighestQuality]) {
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc]
initWithAsset:composition presetName:AVAssetExportPresetHighestQuality];
NSIndexPath *indexPath = [NSIndexPath indexPathForRow:0 inSection:1];
UITableViewCell *cell = (UITableViewCell *)[(UITableView *)self.view cellForRowAtIndexPath:indexPath];
// UIProgressView *prog = [[UIProgressView alloc] initWithProgressViewStyle:UIProgressViewStyleBar];
// [cell.contentView addSubview:prog];
//
exportSession.outputURL = [NSURL fileURLWithPath:[[ShowDAO userDocumentDirectory] stringByAppendingString:exportFilename]];
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
CMTime start = CMTimeMakeWithSeconds(0, 1);
CMTime duration = CMTimeMakeWithSeconds(1000, 1);
CMTimeRange range = CMTimeRangeMake(start, duration);
exportSession.timeRange = range;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
switch ([exportSession status]) {
case AVAssetExportSessionStatusCompleted:
NSLog(#"Export Completed");
[DSBezelActivityView removeViewAnimated:YES];
//delete unused video file
[[NSFileManager defaultManager] removeItemAtPath: [[ShowDAO userDocumentDirectory] stringByAppendingString:videoFilename] error: NULL];
break;
case AVAssetExportSessionStatusFailed:
NSLog(#"Export failed: %#", [[exportSession error] localizedDescription]);
break;
case AVAssetExportSessionStatusCancelled:
NSLog(#"Export cancelled");
break;
default:
break;
}
}];
// while (exportSession.progress <=0.1){
// NSLog(#"prog : %f",exportSession.progress);
// [prog setProgress:exportSession.progress];
//// sleep(1);
// }
[exportSession release];

You're blocking the main thread with the commented code. You should make the exportSession a property of the class you're doing this in. Then I'd suggest having an NSTimer which calls a class method periodically which then updates the progressView until the export session is done. I hope this is clear, if not I can provide some code.
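For example, here is a minimal sketch of that timer approach. It assumes exportSession, progressView and progressTimer have been made properties of the class (those names are mine, not from the question):
// In the method that starts the export: poll the export session's progress
// on the main run loop and push it into the progress view.
self.progressTimer = [NSTimer scheduledTimerWithTimeInterval:0.5
                                                      target:self
                                                    selector:@selector(updateExportProgress:)
                                                    userInfo:nil
                                                     repeats:YES];

- (void)updateExportProgress:(NSTimer *)timer {
    [self.progressView setProgress:self.exportSession.progress animated:YES];
    if (self.exportSession.status != AVAssetExportSessionStatusExporting &&
        self.exportSession.status != AVAssetExportSessionStatusWaiting) {
        // The export completed, failed or was cancelled; stop polling.
        [timer invalidate];
        self.progressTimer = nil;
    }
}
Since the timer fires on the main run loop, the UIProgressView update already happens on the right thread.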

This is what I am doing.
__block BOOL goOnFlag = YES;
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    int count = 0;
    // Poll the export session roughly once per second until its progress reaches 1.0.
    while (fabs(avAssetExportSession.progress - 1.0) > 0.01 && goOnFlag) {
        NSLog(@"loading... : %f", avAssetExportSession.progress);
        sleep(1);
        if (avAssetExportSession.progress == 0)
            count++;
        // If no progress has been made after ~8 seconds, give up and cancel the export.
        if (count > 8) {
            [self async_main:^{
                [avAssetExportSession cancelExport];
                goOnFlag = NO;
                NSLog(@"save failed");
            }];
        }
    }
});
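If you drive a UIProgressView from a background loop like this, remember that UIKit must only be touched on the main thread. A small hedged addition (progressView is assumed to be an accessible property or ivar):
// Inside the polling loop above: push the progress value to the UI on the main thread.
dispatch_async(dispatch_get_main_queue(), ^{
    [progressView setProgress:avAssetExportSession.progress animated:YES];
});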

Related

Simple speech-to-text on iPhone

I am trying to develop an app that converts speech to text; I want the spoken words to appear in a text field. I searched Google and found some sample code, but it wasn't useful. I also found a raywenderlich link that mentions an API used for speech recognition, but I can't get it working.
Could anyone please share a tutorial with a sample project? It would be very useful to me.
Thanks in advance!
- (void) viewDidAppear:(BOOL)animated {
_recognizer = [[SFSpeechRecognizer alloc] initWithLocale:[NSLocale localeWithLocaleIdentifier:@"en-US"]];
[_recognizer setDelegate:self];
[SFSpeechRecognizer requestAuthorization:^(SFSpeechRecognizerAuthorizationStatus authStatus) {
switch (authStatus) {
case SFSpeechRecognizerAuthorizationStatusAuthorized:
//User gave access to speech recognition
NSLog(#"Authorized");
break;
case SFSpeechRecognizerAuthorizationStatusDenied:
//User denied access to speech recognition
NSLog(#"SFSpeechRecognizerAuthorizationStatusDenied");
break;
case SFSpeechRecognizerAuthorizationStatusRestricted:
//Speech recognition restricted on this device
NSLog(#"SFSpeechRecognizerAuthorizationStatusRestricted");
break;
case SFSpeechRecognizerAuthorizationStatusNotDetermined:
//Speech recognition not yet authorized
break;
default:
NSLog(#"Default");
break;
}
}];
audioEngine = [[AVAudioEngine alloc] init];
_speechSynthesizer = [[AVSpeechSynthesizer alloc] init];
[_speechSynthesizer setDelegate:self];
}
-(void)startRecording
{
[self clearLogs:nil];
NSError * outError;
AVAudioSession *audioSession = [AVAudioSession sharedInstance];
[audioSession setCategory:AVAudioSessionCategoryRecord error:&outError];
[audioSession setMode:AVAudioSessionModeMeasurement error:&outError];
[audioSession setActive:true withOptions:AVAudioSessionSetActiveOptionNotifyOthersOnDeactivation error:&outError];
request2 = [[SFSpeechAudioBufferRecognitionRequest alloc] init];
inputNode = [audioEngine inputNode];
if (request2 == nil) {
NSLog(#"Unable to created a SFSpeechAudioBufferRecognitionRequest object");
}
if (inputNode == nil) {
NSLog(#"Unable to created a inputNode object");
}
request2.shouldReportPartialResults = true;
_currentTask = [_recognizer recognitionTaskWithRequest:request2
delegate:self];
[inputNode installTapOnBus:0 bufferSize:4096 format:[inputNode outputFormatForBus:0] block:^(AVAudioPCMBuffer *buffer, AVAudioTime *when){
NSLog(#"Block tap!");
[request2 appendAudioPCMBuffer:buffer];
}];
[audioEngine prepare];
[audioEngine startAndReturnError:&outError];
NSLog(#"Error %#", outError);
}
- (void)speechRecognitionTask:(SFSpeechRecognitionTask *)task didFinishRecognition:(SFSpeechRecognitionResult *)result {
NSLog(#"speechRecognitionTask:(SFSpeechRecognitionTask *)task didFinishRecognition");
NSString * translatedString = [[[result bestTranscription] formattedString] stringByTrimmingCharactersInSet:[NSCharacterSet whitespaceAndNewlineCharacterSet]];
[self log:translatedString];
if ([result isFinal]) {
[audioEngine stop];
[inputNode removeTapOnBus:0];
_currentTask = nil;
request2 = nil;
}
}
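Since request2.shouldReportPartialResults is set to true, intermediate transcriptions can also be picked up through the SFSpeechRecognitionTaskDelegate, and you will want a way to stop a session. A rough sketch under those assumptions (stopRecording is my own counterpart to the startRecording method above, not something from the question):
// Partial results arrive here while the user is still speaking.
- (void)speechRecognitionTask:(SFSpeechRecognitionTask *)task didHypothesizeTranscription:(SFTranscription *)transcription {
    [self log:[transcription formattedString]];
}

// Hypothetical counterpart to startRecording: stop capturing and let the request finish.
- (void)stopRecording {
    [audioEngine stop];
    [inputNode removeTapOnBus:0];
    [request2 endAudio]; // tell the recognizer that no more audio is coming
}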

Screen recording from a GLKView

I'm trying to record the screen while the user interacts with a GLKView. The video file is created with the correct length, but it shows only a black screen.
I've subclassed GLKView, added a pan gesture recognizer to it, and whenever the user does something I draw points on my view (it's more complicated than that, but you get the idea).
Here is how I initialise my video:
NSError *error = nil;
NSURL *url = [NSURL fileURLWithPath:@"/Users/Dimillian/Documents/DEV/movie.mp4"];
[[NSFileManager defaultManager]removeItemAtURL:url error:nil];
self.assetWriter = [[AVAssetWriter alloc] initWithURL:url fileType:AVFileTypeAppleM4V error:&error];
if (error != nil)
{
NSLog(#"Error: %#", error);
}
NSMutableDictionary * outputSettings = [[NSMutableDictionary alloc] init];
[outputSettings setObject: AVVideoCodecH264 forKey: AVVideoCodecKey];
[outputSettings setObject: [NSNumber numberWithInt: 954] forKey: AVVideoWidthKey];
[outputSettings setObject: [NSNumber numberWithInt: 608] forKey: AVVideoHeightKey];
self.assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
self.assetWriterVideoInput.expectsMediaDataInRealTime = YES;
// You need to use BGRA for the video in order to get realtime encoding. I use a color-swizzling shader to line up glReadPixels' normal RGBA output with the movie input's BGRA.
NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
[NSNumber numberWithInt:954], kCVPixelBufferWidthKey,
[NSNumber numberWithInt:608], kCVPixelBufferHeightKey,
nil];
self.assetWriterPixelBufferInput = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:
self.assetWriterVideoInput sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
[self.assetWriter addInput:self.assetWriterVideoInput];
self.startTime = [NSDate date];
self.lastTime = CMTimeMakeWithSeconds([[NSDate date] timeIntervalSinceDate:self.startTime],120);
[self.assetWriter startWriting];
[self.assetWriter startSessionAtSourceTime:kCMTimeZero];
}
Now here is a short version of my recogniser
- (void)pan:(UIPanGestureRecognizer *)p {
// Prepare vertex to be added on screen according to user input
[self setNeedsDisplay];
}
Now here is my drawrect method
- (void)drawRect:(CGRect)rect
{
glClearColor(1, 1, 1, 1.0f);
glClear(GL_COLOR_BUFFER_BIT);
[effect prepareToDraw];
//removed code about vertex drawing
[self capturePixels];
}
And finally my capturePixels function
- (void)capturePixels
{
glFinish();
CVPixelBufferRef pixel_buffer = NULL;
CVReturn status = CVPixelBufferPoolCreatePixelBuffer (NULL, self.assetWriterPixelBufferInput.pixelBufferPool, &pixel_buffer);
if ((pixel_buffer == NULL) || (status != kCVReturnSuccess))
{
NSLog(#"%d", status);
NSLog(#"VIDEO FAILED");
return;
}
else
{
CVPixelBufferLockBaseAddress(pixel_buffer, 0);
glReadPixels(0, 0, 954, 608, GL_RGBA, GL_UNSIGNED_BYTE, CVPixelBufferGetBaseAddress(pixel_buffer));
}
// May need to add a check here, because if two consecutive times with the same value are added to the movie, it aborts recording
CMTime currentTime = CMTimeMakeWithSeconds([[NSDate date] timeIntervalSinceDate:self.startTime],120);
if(![self.assetWriterPixelBufferInput appendPixelBuffer:pixel_buffer withPresentationTime:currentTime])
{
NSLog(#"Problem appending pixel buffer at time: %lld", currentTime.value);
}
else
{
NSLog(#"%#", pixel_buffer);
NSLog(#"Recorded pixel buffer at time: %lld", currentTime.value);
self.lastTime = currentTime;
}
CVPixelBufferUnlockBaseAddress(pixel_buffer, 0);
CVPixelBufferRelease(pixel_buffer);
}
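As the comment in capturePixels notes, appending two buffers with the same presentation time aborts the recording. Here is a minimal sketch of that guard, assuming the same self.lastTime, self.assetWriterVideoInput and self.assetWriterPixelBufferInput properties as above; it is only an illustration of the check, not a fix for the black frames:
// Hypothetical guard before appending: drop frames that would repeat a
// presentation time or that arrive while the writer input cannot accept data.
CMTime currentTime = CMTimeMakeWithSeconds([[NSDate date] timeIntervalSinceDate:self.startTime], 120);
if (CMTimeCompare(currentTime, self.lastTime) <= 0 ||
    !self.assetWriterVideoInput.readyForMoreMediaData)
{
    // Same (or earlier) timestamp as the last appended frame, or the input is busy.
    CVPixelBufferUnlockBaseAddress(pixel_buffer, 0);
    CVPixelBufferRelease(pixel_buffer);
    return;
}
if ([self.assetWriterPixelBufferInput appendPixelBuffer:pixel_buffer withPresentationTime:currentTime]) {
    self.lastTime = currentTime;
}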
I have another function to close the video input.
- (void)tearDownGL
{
NSLog(#"Tear down");
[self.assetWriterVideoInput markAsFinished];
[self.assetWriter endSessionAtSourceTime:self.lastTime];
[self.assetWriter finishWritingWithCompletionHandler:^{
NSLog(#"finish video");
}];
[EAGLContext setCurrentContext:context];
glDeleteBuffers(1, &vertexBuffer);
glDeleteVertexArraysOES(1, &vertexArray);
effect = nil;
glFinish();
if ([EAGLContext currentContext] == context) {
[EAGLContext setCurrentContext:nil];
}
context = nil;
}
This seems to work, as I get no errors, and at the end the video has the correct length, but it's only black...
I'm nowhere near an expert in OpenGL; it's only a tiny part of my iOS application and I want to learn it. I'm doing my best, and thanks to the posts from @BradLarson (OpenGL ES 2.0 to Video on iPad/iPhone) I've been able to make progress, but I'm really stuck now.

AVAssetWriter sometimes fails with status AVAssetWriterStatusFailed. Seems random

I'm writing an MP4 video file with an AVAssetWriter, using an AVAssetWriterInputPixelBufferAdaptor.
The source is a video from a UIImagePickerController, either freshly captured from the camera or taken from the asset library. Quality right now is UIImagePickerControllerQualityTypeMedium.
Sometimes the writer fails. Its status is AVAssetWriterStatusFailed and the AVAssetWriter object's error property is:
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed"
UserInfo=0xf5d8990 {NSLocalizedFailureReason=An unknown error occurred (-536870210),
NSUnderlyingError=0x4dd8e0 "The operation couldn’t be completed. (OSStatus error -536870210.)",
NSLocalizedDescription=The operation could not be completed
The error occurs in approximately 20% of the runs. It seems to fail more frequently on iPhone 4 / 4S than on iPhone 5.
It also occurs more frequently if the source video quality is higher.
Using UIImagePickerControllerQualityTypeLow the error doesn't happen so often.
Using UIImagePickerControllerQualityTypeHigh, the error happens a little more frequently.
I have also noticed something else:
It seems to come in waves. When it fails, the following runs will often fail too, even if I delete the app and reinstall it. That leaves me wondering whether my program leaks some memory, and whether that memory could somehow stay alive even after the app is killed (is that even possible?).
Here is the code I use to render my video:
- (void)writeVideo
{
offlineRenderingInProgress = YES;
/* --- Writer Setup --- */
[locationQueue cancelAllOperations];
[self stopWithoutRewinding];
NSError *writerError = nil;
BOOL success;
success = [[NSFileManager defaultManager] removeItemAtURL:self.outputURL error:nil];
// DLog(@"Url: %@, success: %i, error: %@", self.outputURL, success, fileError);
writer = [AVAssetWriter assetWriterWithURL:self.outputURL fileType:(NSString *)kUTTypeQuickTimeMovie error:&writerError];
//writer.shouldOptimizeForNetworkUse = NO;
if (writerError) {
DLog(#"Writer error: %#", writerError);
return;
}
float bitsPerPixel;
CMVideoDimensions dimensions = CMVideoFormatDescriptionGetDimensions((__bridge CMVideoFormatDescriptionRef)([readerVideoOutput.videoTracks[0] formatDescriptions][0]));
int numPixels = dimensions.width * dimensions.height;
int bitsPerSecond;
// Assume that lower-than-SD resolutions are intended for streaming, and use a lower bitrate
if ( numPixels < (640 * 480) )
bitsPerPixel = 4.05; // This bitrate matches the quality produced by AVCaptureSessionPresetMedium or Low.
else
bitsPerPixel = 11.4; // This bitrate matches the quality produced by AVCaptureSessionPresetHigh.
bitsPerSecond = numPixels * bitsPerPixel;
NSDictionary *videoCompressionSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
[NSNumber numberWithFloat:videoSize.width], AVVideoWidthKey,
[NSNumber numberWithInteger:videoSize.height], AVVideoHeightKey,
[NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInteger:30], AVVideoMaxKeyFrameIntervalKey,
nil], AVVideoCompressionPropertiesKey,
nil];
writerVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoCompressionSettings];
writerVideoInput.transform = movie.preferredTransform;
writerVideoInput.expectsMediaDataInRealTime = YES;
[writer addInput:writerVideoInput];
NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_32ARGB], kCVPixelBufferPixelFormatTypeKey, nil];
writerPixelAdaptor = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerVideoInput
sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];
BOOL couldStart = [writer startWriting];
if (!couldStart) {
DLog(#"Could not start AVAssetWriter!");
abort = YES;
[locationQueue cancelAllOperations];
return;
}
[self configureFilters];
CIContext *offlineRenderContext = [CIContext contextWithOptions:@{kCIContextUseSoftwareRenderer : @NO}];
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
if (!self.canEdit) {
[self createVideoReaderWithAsset:movie timeRange:CMTimeRangeFromTimeToTime(kCMTimeZero, kCMTimePositiveInfinity) forOfflineRender:YES];
} else {
[self createVideoReaderWithAsset:movie timeRange:CMTimeRangeWithNOVideoRangeInDuration(self.thumbnailEditView.range, movie.duration) forOfflineRender:YES];
}
CMTime startOffset = reader.timeRange.start;
DLog(#"startOffset: %llu", startOffset.value);
[self.thumbnailEditView removeFromSuperview];
// self.thumbnailEditView = nil;
[glLayer removeFromSuperlayer];
glLayer = nil;
[playerView removeFromSuperview];
playerView = nil;
glContext = nil;
[writerVideoInput requestMediaDataWhenReadyOnQueue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0) usingBlock:^{
@try {
BOOL didWriteSomething = NO;
DLog(#"Preparing to write...");
while ([writerVideoInput isReadyForMoreMediaData]) {
if (abort) {
NSLog(#"Abort == YES");
[locationQueue cancelAllOperations];
[writerVideoInput markAsFinished];
videoConvertCompletionBlock(NO, writer.error.localizedDescription);
}
if (writer.status == AVAssetWriterStatusFailed) {
DLog(#"Writer.status: AVAssetWriterStatusFailed, error: %#", writer.error);
[[NSUserDefaults standardUserDefaults] setObject:[NSNumber numberWithInt:1] forKey:#"QualityOverride"];
[[NSUserDefaults standardUserDefaults] synchronize];
abort = YES;
[locationQueue cancelAllOperations];
videoConvertCompletionBlock(NO, writer.error.localizedDescription);
return;
DLog(#"Source file exists: %i", [[NSFileManager defaultManager] fileExistsAtPath:movie.URL.relativePath]);
}
DLog(#"Writing started...");
CMSampleBufferRef buffer = nil;
if (reader.status != AVAssetReaderStatusUnknown) {
if (reader.status == AVAssetReaderStatusReading) {
buffer = [readerVideoOutput copyNextSampleBuffer];
if (didWriteSomething == NO) {
DLog(#"Copying sample buffers...");
}
}
if (!buffer) {
[writerVideoInput markAsFinished];
DLog(#"Finished...");
CGColorSpaceRelease(colorSpace);
[self offlineRenderingDidFinish];
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
[writer finishWriting];
if (writer.error != nil) {
DLog(#"Error: %#", writer.error);
} else {
DLog(#"Succes!");
}
if (writer.status == AVAssetWriterStatusCompleted) {
videoConvertCompletionBlock(YES, nil);
}
else {
abort = YES;
videoConvertCompletionBlock(NO, writer.error.localizedDescription);
}
});
return;
}
didWriteSomething = YES;
}
else {
DLog(#"Still waiting...");
//Reader just needs a moment to get ready...
continue;
}
CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(buffer);
if (pixelBuffer == NULL) {
DLog(#"Pixelbuffer == NULL");
continue;
}
//DLog(#"Sample call back! Pixelbuffer: %lu", CVPixelBufferGetHeight(pixelBuffer));
//NSDictionary *options = [NSDictionary dictionaryWithObject:(__bridge id)CGColorSpaceCreateDeviceRGB() forKey:kCIImageColorSpace];
CIImage *ciimage = [CIImage imageWithCVPixelBuffer:pixelBuffer options:nil];
CIImage *outputImage = [self filteredImageWithImage:ciimage];
CVPixelBufferRef outPixelBuffer = NULL;
CVReturn status;
CFDictionaryRef empty; // empty value for attr value.
CFMutableDictionaryRef attrs;
empty = CFDictionaryCreate(kCFAllocatorDefault, // our empty IOSurface properties dictionary
NULL,
NULL,
0,
&kCFTypeDictionaryKeyCallBacks,
&kCFTypeDictionaryValueCallBacks);
attrs = CFDictionaryCreateMutable(kCFAllocatorDefault,
1,
&kCFTypeDictionaryKeyCallBacks,
&kCFTypeDictionaryValueCallBacks);
CFDictionarySetValue(attrs,
kCVPixelBufferIOSurfacePropertiesKey,
empty);
CFDictionarySetValue(attrs,
kCVPixelBufferCGImageCompatibilityKey,
(__bridge const void *)([NSNumber numberWithBool:YES]));
CFDictionarySetValue(attrs,
kCVPixelBufferCGBitmapContextCompatibilityKey,
(__bridge const void *)([NSNumber numberWithBool:YES]));
status = CVPixelBufferCreate(kCFAllocatorDefault, ciimage.extent.size.width, ciimage.extent.size.height, kCVPixelFormatType_32BGRA, attrs, &outPixelBuffer);
//DLog(#"Output image size: %f, %f, pixelbuffer height: %lu", outputImage.extent.size.width, outputImage.extent.size.height, CVPixelBufferGetHeight(outPixelBuffer));
if (status != kCVReturnSuccess) {
DLog(#"Couldn't allocate output pixelBufferRef!");
continue;
}
[offlineRenderContext render:outputImage toCVPixelBuffer:outPixelBuffer bounds:outputImage.extent colorSpace:colorSpace];
CMTime currentSourceTime = CMSampleBufferGetPresentationTimeStamp(buffer);
CMTime currentTime = CMTimeSubtract(currentSourceTime, startOffset);
CMTime duration = reader.timeRange.duration;
if (CMTIME_IS_POSITIVE_INFINITY(duration)) {
duration = movie.duration;
}
CMTime durationConverted = CMTimeConvertScale(duration, currentTime.timescale, kCMTimeRoundingMethod_Default);
float durationFloat = (float)durationConverted.value;
float progress = ((float) currentTime.value) / durationFloat;
//DLog(#"duration : %f, progress: %f", durationFloat, progress);
[self updateOfflineRenderProgress:progress];
if (pixelBuffer != NULL && writerVideoInput.readyForMoreMediaData) {
[writerPixelAdaptor appendPixelBuffer:outPixelBuffer withPresentationTime:currentTime];
} else {
continue;
}
if (writer.status == AVAssetWriterStatusWriting) {
DLog(#"Writer.status: AVAssetWriterStatusWriting");
}
CFRelease(buffer);
CVPixelBufferRelease(outPixelBuffer);
}
}
@catch (NSException *exception) {
DLog(@"Catching exception: %@", exception);
}
}];
}
OK, I think I solved it myself. The culprit was this line:
[writerVideoInput requestMediaDataWhenReadyOnQueue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0) usingBlock:^{ ....
The global queue I was passing is a concurrent queue. This allows a new callback to be made before the previous one is finished. The asset writer is not designed to be written to from more than one thread at a time.
Creating and using a new serial queue seems to remedy the problem:
assetWriterQueue = dispatch_queue_create("AssetWriterQueue", DISPATCH_QUEUE_SERIAL);
[writerVideoInput requestMediaDataWhenReadyOnQueue:assetWriterQueue usingBlock:^{...
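For context, a minimal sketch of the usual drain loop on such a serial queue might look like the following; it reuses the writer, writerVideoInput and readerVideoOutput names from the question and is only an outline, not the full filtering pipeline shown above:
// Sketch: request media data on a serial queue and drain the reader until it runs dry.
dispatch_queue_t assetWriterQueue = dispatch_queue_create("AssetWriterQueue", DISPATCH_QUEUE_SERIAL);
[writerVideoInput requestMediaDataWhenReadyOnQueue:assetWriterQueue usingBlock:^{
    while ([writerVideoInput isReadyForMoreMediaData]) {
        CMSampleBufferRef buffer = [readerVideoOutput copyNextSampleBuffer];
        if (buffer == NULL) {
            // No more samples: finish exactly once, still on the serial queue.
            [writerVideoInput markAsFinished];
            [writer finishWritingWithCompletionHandler:^{
                NSLog(@"Writer finished with status %ld", (long)writer.status);
            }];
            break;
        }
        [writerVideoInput appendSampleBuffer:buffer];
        CFRelease(buffer);
    }
}];
Because the queue is serial, the block can never run concurrently with itself, which is what the asset writer expects.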

Audio mixing on iPhone : Need advice

I'm using the Audio Queue Services to play audio in my app.
I need to play several audio files together. What I do now is create as many audio queues as I need to play the files, i.e. I create two audio queues for two audio files and start them at the same time to get a mixing effect.
So basically I would like to know whether this is an "elegant" way of doing it.
Please note that I'm aware of the Audio Unit services and the MixerHost example; please do not suggest that option, I need to do the sound mixing exclusively with the Audio Queue Services.
- (void) setUpAndAddAudioAtPath:(NSURL*)assetURL toComposition:(AVMutableComposition *)composition {
AVURLAsset *songAsset = [AVURLAsset URLAssetWithURL:assetURL options:nil];
AVMutableCompositionTrack *track = [composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
AVAssetTrack *sourceAudioTrack = [[songAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
NSError *error = nil;
BOOL ok = NO;
CMTime startTime = CMTimeMakeWithSeconds(0, 1);
CMTime trackDuration = songAsset.duration;
CMTime longestTime = CMTimeMake(848896, 44100); //(19.24 seconds)
CMTimeRange tRange = CMTimeRangeMake(startTime, trackDuration);
//Set Volume
AVMutableAudioMixInputParameters *trackMix = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
[trackMix setVolume:0.8f atTime:startTime];
[audioMixParams addObject:trackMix];
//Insert audio into track
ok = [track insertTimeRange:tRange ofTrack:sourceAudioTrack atTime:CMTimeMake(0, 44100) error:&error];
}
- (BOOL) exportAudio {
if (defaultSoundPath == nil || recordingSoundPath == nil) {
[actvityIdicatiorView stopAnimating];
[actvityIdicatiorView setHidden:YES];
UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:@"Select Sound" message:@"Both sounds must be selected" delegate:self cancelButtonTitle:@"Ok" otherButtonTitles:nil];
[alertView show];
return NO;
}
AVMutableComposition *composition = [AVMutableComposition composition];
if (audioMixParams) {
[audioMixParams release];
audioMixParams=nil;
}
audioMixParams = [[NSMutableArray alloc] initWithObjects:nil];
//Add Audio Tracks to Composition
NSString *sourceA = [[NSBundle mainBundle] pathForResource:@"Beach Soundscape" ofType:@"mp3"];
//NSString *URLPath1 = pathToYourAudioFile1;
NSURL *assetURL1 = [NSURL fileURLWithPath:sourceA];
[self setUpAndAddAudioAtPath:assetURL1 toComposition:composition];
NSString *sourceB = [[NSBundle mainBundle] pathForResource:@"DrumsMonoSTP" ofType:@"aif"];
// NSString *URLPath2 = pathToYourAudioFile2;
NSURL *assetURL2 = [NSURL fileURLWithPath:sourceB];
[self setUpAndAddAudioAtPath:assetURL2 toComposition:composition];
AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
audioMix.inputParameters = [NSArray arrayWithArray:audioMixParams];
//If you need to query what formats you can export to, here's a way to find out
NSLog (#"compatible presets for songAsset: %#",
[AVAssetExportSession exportPresetsCompatibleWithAsset:composition]);
AVAssetExportSession *exporter = [[AVAssetExportSession alloc]
initWithAsset: composition
presetName: AVAssetExportPresetAppleM4A];
exporter.audioMix = audioMix;
exporter.outputFileType = @"com.apple.m4a-audio";
// NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
//
// NSString *fileName = @"someFilename";
//NSString *exportFile = [[paths objectAtIndex:0] stringByAppendingFormat: @"/%@.m4a", fileName];
mixingSoundPath = [[self mixingSoundFolder] stringByAppendingFormat: @"/Mixing%@.m4a", [self dateString]];
[mixingSoundPath retain];
// set up export
//myDeleteFile(exportFile);
NSURL *exportURL = [NSURL fileURLWithPath:mixingSoundPath];
exporter.outputURL = exportURL;
static BOOL isComplete;
// do the export
[exporter exportAsynchronouslyWithCompletionHandler:^{
int exportStatus = exporter.status;
NSLog(#"exporter.......%i",exportStatus);
switch (exportStatus) {
case AVAssetExportSessionStatusFailed:
// NSError *exportError =exporter.error;
isComplete=NO;
NSLog (#"AVAssetExportSessionStatusFailed");
NSLog (#"Error == %#", exporter.error);
break;
case AVAssetExportSessionStatusCompleted:
[self mixingDidFinshing];
isComplete=YES;
break;
case AVAssetExportSessionStatusUnknown:
NSLog (#"AVAssetExportSessionStatusUnknown");
isComplete=NO;
break;
case AVAssetExportSessionStatusExporting:
isComplete=NO;
NSLog (#"AVAssetExportSessionStatusExporting");
break;
case AVAssetExportSessionStatusCancelled:
isComplete=NO;
NSLog (#"AVAssetExportSessionStatusCancelled");
break;
case AVAssetExportSessionStatusWaiting:
isComplete=NO;
NSLog (#"AVAssetExportSessionStatusWaiting");
break;
default:
NSLog (#"didn't get export status");
isComplete=NO;
break;
}
}];
return isComplete;
}
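Note that exportAsynchronouslyWithCompletionHandler: returns immediately, so the isComplete value returned at the end of exportAudio reflects, at best, a previous run. One way around that, sketched here with a hypothetical completion block parameter (the composition and exporter setup are the same as above and are elided):
// Sketch: report the export result through a block instead of a return value.
- (void)exportAudioWithCompletion:(void (^)(BOOL success, NSError *error))completion {
    // ... build the composition, audio mix and exporter exactly as in exportAudio ...
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        BOOL ok = (exporter.status == AVAssetExportSessionStatusCompleted);
        dispatch_async(dispatch_get_main_queue(), ^{
            if (ok) {
                [self mixingDidFinshing];
            }
            if (completion) {
                completion(ok, ok ? nil : exporter.error);
            }
        });
    }];
}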

NSOutputStream writing multiple times

I'm trying to use the NSStream classes to open a socket and then write to and read from it, but I have a problem:
I don't know how to write to the socket after I have opened it.
Here is what I have done.
1) First, opening the socket:
NSURL *website = [NSURL URLWithString:urlStr];
if (!website) {
NSLog(#"%# is not a valid URL");
return;
}
NSHost *host = [NSHost hostWithName:urlStr];
// iStream and oStream are instance variables
[NSStream getStreamsToHost:host port:6667 inputStream:&iStream
outputStream:&oStream];
[iStream retain];
[oStream retain];
[iStream setDelegate:self];
[oStream setDelegate:self];
[iStream scheduleInRunLoop:[NSRunLoop currentRunLoop]
forMode:NSDefaultRunLoopMode];
[oStream scheduleInRunLoop:[NSRunLoop currentRunLoop]
forMode:NSDefaultRunLoopMode];
[iStream open];
[oStream open];
2) Setting up the stream event handler:
- (void)stream:(NSStream *)theStream handleEvent:(NSStreamEvent)streamEvent
{
NSString *io;
if (theStream == iStream) io = @">>";
else io = @"<<";
NSLog(@"stream : %@", theStream);
NSString *event;
switch (streamEvent)
{
case NSStreamEventNone:
event = #"NSStreamEventNone";
break;
case NSStreamEventOpenCompleted:
event = #"NSStreamEventOpenCompleted";
break;
case NSStreamEventHasBytesAvailable:{
event = #"NSStreamEventHasBytesAvailables";
if (theStream == iStream)
{
if(!_data) {
_data = [[NSMutableData data] retain];
}
uint8_t buf[1024];
unsigned int len = 0;
len = [iStream read:buf maxLength:1024];
NSLog(#"Lenght data read : %d", len);
if(len) {
NSData * dataReceived = [[NSString stringWithFormat:@"%s\n", (char *)buf] dataUsingEncoding:NSUTF8StringEncoding];
NSString *s = [[NSString alloc] initWithData:dataReceived encoding:NSUTF8StringEncoding];
NSLog(@"Received _data: \"%@\"\n", s);
} else {
NSLog(#"nothing to read!");
}
}else {
NSLog(#"Not the good stream");
}
break;
}
case NSStreamEventHasSpaceAvailable:{
event = #"NSStreamEventHasSpaceAvailable";
if (theStream == oStream )
{
if(isConnexionCommandSent == NO){
[self sendCommand:#"My connection command"];
isConnexionCommandSent = YES;
}
}
break;
}
case NSStreamEventErrorOccurred:
event = #"NSStreamEventErrorOccurred";
NSError *theError = [theStream streamError];
UIAlertView *alert = [[UIAlertView alloc] initWithTitle:#"Error" message:[theError localizedDescription]
delegate:nil cancelButtonTitle:#"OK" otherButtonTitles: nil];
[alert show];
[alert release];
break;
case NSStreamEventEndEncountered:
event = #"NSStreamEventEndEncountered";
break;
default:
event = #"** Unknown";
}
NSLog(#"%# : %#", io, event);
}
3) Then I have a function that is called when I tap a button:
- (IBAction)join:(id)sender{
if([oStream hasSpaceAvailable]){
NSLog(#"iStream Status : %d",[iStream streamStatus]);
NSLog(#"oStream Status : %d",[oStream streamStatus]);
[self sendCommand:#"join"];
}else{
NSLog(#"Error command can't be sent");
}
}
-(void) sendCommand:(NSString *) command{
NSLog(#"space : %d",[oStream hasSpaceAvailable]);
if ([oStream hasSpaceAvailable])
{
NSLog(#"Command writen : %s\n",[command cStringUsingEncoding:NSASCIIStringEncoding]);
NSInteger i=[oStream write:(const uint8_t *)[command cStringUsingEncoding:NSASCIIStringEncoding] maxLength:(NSInteger)[command lengthOfBytesUsingEncoding:NSASCIIStringEncoding]];
if (i<0)
{
NSLog(#"erreur lors de l'envoi, status:%i, erreur:%#", [oStream streamStatus], [oStream streamError]);
}
isReadyToSend = NO;
}
else
{
NSLog(#"impossible d'envoyer, status:%i, erreur:%#", [oStream streamStatus], [oStream streamError]);
}
}
But the problem is that when the join function is called, everything seems to go fine, yet the server receives nothing...
On
NSInteger i = [oStream write:(const uint8_t *)[command cStringUsingEncoding:NSASCIIStringEncoding] maxLength:[command lengthOfBytesUsingEncoding:NSASCIIStringEncoding]];
i is > 0, so I assume the write went well, but on the server nothing is received... I don't know why.
Can you help me?
Hey @Ptitaw, see this post. I believe you might find your answer there, along with an easier way to connect and get access to all the events (reading, writing, etc.) automatically.
Hope I could help :)
A very, very late answer; however, it might be helpful to someone having a similar issue.
I think it may be due to the data encoding. Your server might be using UTF-8 encoding while you are sending your data with NSASCIIStringEncoding. Try this:
NSInteger i=[oStream write:(const uint8_t *)[command cStringUsingEncoding:NSUTF8StringEncoding] maxLength:(NSInteger)[command lengthOfBytesUsingEncoding:NSUTF8StringEncoding]];
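Also worth noting: -write:maxLength: can write fewer bytes than requested, so a robust sender loops until the whole buffer has been sent. A minimal sketch (it uses the same oStream instance variable as above; the helper name is mine):
// Sketch: keep writing until the whole command has been sent, since a single
// -write:maxLength: call may accept only part of the buffer.
- (BOOL)writeAll:(NSString *)command {
    NSData *payload = [command dataUsingEncoding:NSUTF8StringEncoding];
    const uint8_t *bytes = payload.bytes;
    NSUInteger remaining = payload.length;
    while (remaining > 0) {
        NSInteger written = [oStream write:bytes maxLength:remaining];
        if (written <= 0) {
            NSLog(@"write failed, status:%ld, error:%@",
                  (long)[oStream streamStatus], [oStream streamError]);
            return NO;
        }
        bytes += written;
        remaining -= written;
    }
    return YES;
}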