AQRecorder AAC file won't play in AVPlayer (Objective-C)

I have a problem with a recorded AAC audio file on iOS.
I need to record audio and play it back with AVPlayer. Recording works fine and the file is created; I can even play it on a Mac. But when I try to play the file in AVPlayer, there is no sound.
The format properties I use for recording:
mRecordFormat.mSampleRate = 8000;
mRecordFormat.mFormatID = kAudioFormatMPEG4AAC;
mRecordFormat.mFormatFlags = kAudioFormatMPEG4AAC_LD;
mRecordFormat.mFramesPerPacket = 0;
mRecordFormat.mChannelsPerFrame = 1;
mRecordFormat.mBitsPerChannel = 0;
mRecordFormat.mBytesPerPacket = 0;
I have tried different settings, but with no results. Please help.
P.S. Other files (MP3 etc.) play normally.
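For reference, when building an AudioStreamBasicDescription for a compressed format such as AAC, Core Audio can fill in the canonical packet layout instead of leaving those fields zeroed; note also that kAudioFormatMPEG4AAC_LD is itself a format ID, not a value for mFormatFlags. A minimal sketch:

#include <AudioToolbox/AudioToolbox.h>

AudioStreamBasicDescription format = {0};
format.mSampleRate = 8000;
format.mFormatID = kAudioFormatMPEG4AAC;
format.mChannelsPerFrame = 1;

// Ask Core Audio to fill in the remaining fields (mFramesPerPacket etc.) for this codec.
UInt32 size = sizeof(format);
OSStatus status = AudioFormatGetProperty(kAudioFormatProperty_FormatInfo,
                                         0, NULL, &size, &format);
// status should be noErr; format now describes a complete AAC record format.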

OK, I found a solution a couple of years ago.
I changed the recording logic to:
NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
[NSNumber numberWithFloat:16000], AVSampleRateKey,
[NSNumber numberWithInt:2], AVNumberOfChannelsKey,
[NSNumber numberWithInt:AVAudioQualityMedium], AVSampleRateConverterAudioQualityKey,
[NSNumber numberWithInt:64000], AVEncoderBitRateKey,
[NSNumber numberWithInt:8], AVEncoderBitDepthHintKey,
nil];
error = nil;
_recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];
The recorded file now plays on iOS, Android, and macOS/Windows. That's all there is to it.
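For completeness, a minimal sketch of driving that recorder, assuming the settings dictionary above; the audio session setup and the file URL are illustrative additions, not part of the original answer:

NSError *error = nil;
// Recording (and later playback) generally need an appropriate session category.
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
[[AVAudioSession sharedInstance] setActive:YES error:&error];

NSURL *url = [NSURL fileURLWithPath:
    [NSTemporaryDirectory() stringByAppendingPathComponent:@"recording.m4a"]];
AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];
if ([recorder prepareToRecord]) {
    [recorder record];
}
// Later: [recorder stop]; the resulting .m4a plays in AVPlayer.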

Related

Error while recording multiple screens using AVFoundation

I am trying to record the primary and secondary monitor screens into two separate files using the code below.
const uint32_t MAX_DISPLAY = 2;
CGDirectDisplayID displays[MAX_DISPLAY] = {0};
CGGetActiveDisplayList(MAX_DISPLAY, displays, &m_nDisplays);
NSString* dest_file[2] = {0};
NSURL* dest_path[2] = {0};
AVCaptureConnection *CaptureConnection[2] = {0};
NSDictionary *videoCleanApertureSettings = [NSDictionary dictionaryWithObjectsAndKeys:
@2048, AVVideoCleanApertureWidthKey,
@1152, AVVideoCleanApertureHeightKey,
@0, AVVideoCleanApertureHorizontalOffsetKey,
@0, AVVideoCleanApertureVerticalOffsetKey,
nil];
NSDictionary *videoAspectRatioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
@3, AVVideoPixelAspectRatioHorizontalSpacingKey,
@3, AVVideoPixelAspectRatioVerticalSpacingKey,
nil];
NSNumber* bitsPerSecond = [NSNumber numberWithDouble:1024*1000];
NSDictionary *codecSettings = [NSDictionary dictionaryWithObjectsAndKeys:
bitsPerSecond, AVVideoAverageBitRateKey,
videoCleanApertureSettings, AVVideoCleanApertureKey,
videoAspectRatioSettings, AVVideoPixelAspectRatioKey,
nil];
NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
AVVideoCodecH264, AVVideoCodecKey,
codecSettings,AVVideoCompressionPropertiesKey,
AVVideoScalingModeResize,AVVideoScalingModeKey,
@2048, AVVideoWidthKey,
@1152, AVVideoHeightKey,
nil];
for( int nIdx = 0; nIdx < m_nDisplays; ++nIdx )
{
m_session[nIdx] = [[AVCaptureSession alloc] init];
dest_file[nIdx] = [NSString stringWithFormat:@"%@_%d.MOV", destination_path, nIdx];
dest_path[nIdx] = [NSURL fileURLWithPath: dest_file[nIdx] ];
// Create a ScreenInput with the display and add it to the session
m_movie_file_input[nIdx] = [[[AVCaptureScreenInput alloc] initWithDisplayID:displays[nIdx]] autorelease];
[m_movie_file_input[nIdx] removesDuplicateFrames ];
if ([m_session[nIdx] canAddInput:m_movie_file_input[nIdx]])
{
[m_session[nIdx] addInput:m_movie_file_input[nIdx]];
}
// Create a MovieFileOutput and add it to the session
m_movie_file_output[nIdx] = [[[AVCaptureMovieFileOutput alloc] init] autorelease];
if ([m_session[nIdx] canAddOutput:m_movie_file_output[nIdx]])
{
[m_session[nIdx] addOutput:m_movie_file_output[nIdx]];
}
CaptureConnection[nIdx] = [m_movie_file_output[nIdx] connectionWithMediaType:AVMediaTypeVideo];
[m_movie_file_output[nIdx] setOutputSettings : videoSettings forConnection : CaptureConnection[nIdx]];
// Start running the session
[m_session[nIdx] startRunning];
[m_movie_file_output[nIdx] startRecordingToOutputFileURL:dest_path[nIdx] recordingDelegate:self];
}
Both screens are saved into two separate files. But when I call the startRecordingToOutputFileURL API for the secondary monitor, i.e. on the second pass of the loop, I get the error shown below:
VTCompressionSessionCreate signalled err=-8973 (err)
(VTVideoEncoderStartSession failed) at
/SourceCache/CoreMedia_frameworks/CoreMedia-1562.19/Sources/VideoToolbox/VTCompressionSession.c
line 897
Also, the compression parameters (bitrate) are not applied properly for the secondary monitor; it ends up with values other than the ones I specified in the program.
Can somebody please help me with this? Also, please let me know whether this is the proper way of doing it.
Thanks in Advance
George
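For readers hitting the same error: -8973 matches the old QuickTime codecOpenErr, which suggests the second compression session could not be opened. The recording delegate reports per-file failures and can help confirm that; a minimal sketch of the AVCaptureFileOutputRecordingDelegate callback (the logging is illustrative):

// Called once per output file when recording finishes or fails.
- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{
    if (error) {
        // For the failing display, this should carry the underlying encoder status.
        NSLog(@"Recording to %@ failed: %@", outputFileURL, error);
    } else {
        NSLog(@"Finished recording to %@", outputFileURL);
    }
}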

AVAssetWriterInput Settings Quality vs. Filesize

Here's a rather specific question that's left me stumped. I'm writing video recording software that captures video data from a webcam to an MP4. My software is going to replace a script that's already in place, which triggers QuickTime Player to do the same but outputs a MOV.
I'm using AVFoundation and have the capturing and saving in place but, after repeated tweaks and tests, I've found that the script that's already in place consistently creates MOV files with a higher video quality and lower file sizes than my software.
Here is a link to two samples, a MOV created by the on-site script and an MP4 created by my software: https://www.dropbox.com/sh/1qnn5afmrquwfcr/AADQvDMWkbYJwVNlio9_vbeNa?dl=0
The MOV was created by a colleague and has the quality and file size I'm trying to match with my software. The MP4 was recorded in my office, obviously a different lighting situation, but with the same camera as is used on-site.
Comparing the two videos, I can see that they have the same dimensions, duration, and video codec but differ in both file size and quality.
Here is the code where I set up my AVAssetWriter and AVAssetWriter input:
NSDictionary *settings = nil;
settings = [NSDictionary dictionaryWithObjectsAndKeys:
// Specify H264 (MPEG4) as the codec
AVVideoCodecH264, AVVideoCodecKey,
// Specify the video width and height
[NSNumber numberWithInt:self.recordSize.width], AVVideoWidthKey,
[NSNumber numberWithInt:self.recordSize.height], AVVideoHeightKey,
// Specify the video compression
[NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInteger:2500000], AVVideoAverageBitRateKey,
[NSNumber numberWithInt:1], AVVideoMaxKeyFrameIntervalKey,
//AVVideoProfileLevelH264Baseline30, AVVideoProfileLevelKey, Not available on 10.7
nil], AVVideoCompressionPropertiesKey,
// Specify the HD output color
[NSDictionary dictionaryWithObjectsAndKeys:
AVVideoColorPrimaries_ITU_R_709_2, AVVideoColorPrimariesKey,
AVVideoTransferFunction_ITU_R_709_2, AVVideoTransferFunctionKey,
AVVideoYCbCrMatrix_ITU_R_709_2, AVVideoYCbCrMatrixKey, nil], AVVideoColorPropertiesKey,
nil];
self.assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:settings];// sourceFormatHint:self.formatHint];
/*self.pixelBufferAdaptor = [[AVAssetWriterInputPixelBufferAdaptor alloc]
initWithAssetWriterInput:self.assetWriterInput
sourcePixelBufferAttributes:[NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange],//kCVPixelFormatType_32BGRA],
kCVPixelBufferPixelFormatTypeKey,
nil]];*/
NSError *error;
self.videoFile = [NSURL fileURLWithPath:file];
//[self.movieFileOutput startRecordingToOutputFileURL:self.videoFile recordingDelegate:self];
self.assetWriter = [[AVAssetWriter alloc]
initWithURL:self.videoFile
fileType:AVFileTypeMPEG4//AVFileTypeQuickTimeMovie//
error:&error];
if (self.assetWriter){
[self.assetWriter addInput:self.assetWriterInput];
self.assetWriterInput.expectsMediaDataInRealTime = YES;
And the code where I set up my AVCaptureVideoDataOutput and add it to my capture session:
AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
output.alwaysDiscardsLateVideoFrames = NO;
output.videoSettings = nil;
self.videoQueue = dispatch_queue_create("ca.blackboxsoftware.avcapturequeue", NULL);
[output setSampleBufferDelegate:self queue:self.videoQueue];
[self.session addOutput:output];
This quality issue is the big stumbling block of my software, and I desperately need your help with it. I'll be happy to post any other code you might need to see, and to test out any changes you feel would make the difference.
Thank you for your time.
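One observation about the settings above, offered as a hypothesis rather than a confirmed fix: AVVideoMaxKeyFrameIntervalKey set to 1 forces every frame to be a keyframe, which at a fixed average bitrate tends to produce exactly this combination of larger files and lower quality. A sketch with a more typical interval (the exact numbers are assumptions to tune against the on-site MOV):

// Space keyframes out so the encoder can spend bits on cheaper P-frames.
NSDictionary *compression = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInteger:2500000], AVVideoAverageBitRateKey,
    [NSNumber numberWithInt:30], AVVideoMaxKeyFrameIntervalKey, // assumed ~30 fps source
    nil];
NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
    AVVideoCodecH264, AVVideoCodecKey,
    [NSNumber numberWithInt:self.recordSize.width], AVVideoWidthKey,
    [NSNumber numberWithInt:self.recordSize.height], AVVideoHeightKey,
    compression, AVVideoCompressionPropertiesKey,
    nil];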

Creating a CVPixelBufferRef from a IDeckLinkVideoInputFrame

I'm using the BlackMagic DeckLink SDK to try capture frames from a BM device.
I'm trying to grab the pixel data from a IDeckLinkVideoInputFrame in the DeckLinkController::VideoInputFrameArrived callback and convert it to a CVPixelBufferRef to be able to write it to disk with AVFoundation's AVAssetWriterInputPixelBufferAdaptor and AVAssetWriter. The code I'm using seems to be working, apart from the fact that all frames written to disk are green. (BlackMagic's example code that generates a preview on screen does show an image, so the device and device settings should be OK).
The AVAssetWriter is set up as follows:
writer = [[AVAssetWriter assetWriterWithURL:destinationUrl
fileType:AVFileTypeAppleM4V
error:&error] retain];
if(error)
NSLog(#"ERROR: %#", [error localizedDescription]);
NSMutableDictionary * outputSettings = [[NSMutableDictionary alloc] init];
[outputSettings setObject: AVVideoCodecH264
forKey: AVVideoCodecKey];
[outputSettings setObject: [NSNumber numberWithInt:1920]
forKey: AVVideoWidthKey];
[outputSettings setObject: [NSNumber numberWithInt:1080]
forKey: AVVideoHeightKey];
NSMutableDictionary * compressionProperties = [[NSMutableDictionary alloc] init];
[compressionProperties setObject: [NSNumber numberWithInt: 1000000]
forKey: AVVideoAverageBitRateKey];
[compressionProperties setObject: [NSNumber numberWithInt: 16]
forKey: AVVideoMaxKeyFrameIntervalKey];
[compressionProperties setObject: AVVideoProfileLevelH264Main31
forKey: AVVideoProfileLevelKey];
[outputSettings setObject: compressionProperties
forKey: AVVideoCompressionPropertiesKey];
writerVideoInput = [[AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings] retain];
NSMutableDictionary * pixBufSettings = [[NSMutableDictionary alloc] init];
[pixBufSettings setObject: [NSNumber numberWithInt: kCVPixelFormatType_422YpCbCr8_yuvs]
forKey: (NSString *) kCVPixelBufferPixelFormatTypeKey];
[pixBufSettings setObject: [NSNumber numberWithInt: 1920]
forKey: (NSString *) kCVPixelBufferWidthKey];
[pixBufSettings setObject: [NSNumber numberWithInt: 1080]
forKey: (NSString *) kCVPixelBufferHeightKey];
writerVideoInput.expectsMediaDataInRealTime = YES;
writer.shouldOptimizeForNetworkUse = NO;
adaptor = [[AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerVideoInput
sourcePixelBufferAttributes:pixBufSettings] retain];
[writer addInput:writerVideoInput];
For reference, these output settings and compression options should be correct, but I have tried several different alternatives.
When a frame comes in from the device, I convert it to a CVPixelBufferRef as follows:
void *videoData;
int64_t frameTime;
int64_t frameDuration;
videoFrame->GetBytes(&videoData);
videoFrame->GetStreamTime(&frameTime, &frameDuration, 3000);
CMTime presentationTime = CMTimeMake(frameDuration, 3000);
CVPixelBufferRef buffer = NULL;
CVPixelBufferPoolCreatePixelBuffer(NULL, adaptor.pixelBufferPool, &buffer);
CVPixelBufferLockBaseAddress(buffer, 0);
void *rasterData = CVPixelBufferGetBaseAddress(buffer);
memcpy(rasterData, videoData, (videoFrame->GetRowBytes()*videoFrame->GetHeight()));
CVPixelBufferUnlockBaseAddress(buffer, 0);
if (buffer)
{
if(![adaptor appendPixelBuffer:buffer withPresentationTime:presentationTime]) {
NSLog(#"ERROR appending pixelbuffer: %#", writer.error);
[writerVideoInput markAsFinished];
if(![writer finishWriting])
NSLog(#"ERROR finishing writing: %#", [writer.error localizedDescription]);
}
else {
NSLog(#"SUCCESS");
if(buffer)
CVPixelBufferRelease(buffer);
}
}
This code is appending frames to the AVAssetWriterInputPixelBufferAdaptor, but all the frames are green.
Can anybody see what I'm doing wrong here, or does anybody have any experience using AVFoundation capturing and compressing frames using the BlackMagic Decklink SDK?
When you see green and are working in the YUV color space, you are seeing values of 0 in the buffer. Since the AVAssetWriter is writing frames, the odds are that 'buffer' contains values of 0. I see a couple of ways that could happen.
1) The buffer you are appending is most likely initialized to 0, so it is possible your copy is failing. In your code that could happen if (videoFrame->GetRowBytes()*videoFrame->GetHeight()) somehow evaluates to 0. It seems impossible, but I'd check it.
2) CVPixelBufferGetBaseAddress is returning the wrong pointer, or the pixel buffer itself is the wrong format or possibly invalid (yet didn't crash because of safeguards in the API).
3) 'videoData' is, for whatever reason, itself full of 0. The DeckLink capture delegate returns frames with nothing in them when it doesn't like the input format (usually because the BMDDisplayMode passed to EnableVideoInput doesn't match your video source):
int flags=videoFrame->GetFlags();
if (flags & bmdFrameHasNoInputSource)
{
//our input format doesn't match the source
}
Other than changing your source mode and trying again, a quick check would be to change the memcpy line to the following:
memset(rasterData, 0x3f, 1920*1080*2);
If you still see green frames, take a hard look at #2. If you see differently colored frames, then your problem is #1 or #3, and most likely the resolution of your video input doesn't match the BMDDisplayMode that you chose.
One other thing to note: I think the line where you create the presentation time is wrong. It should probably use frameTime instead of frameDuration:
CMTime presentationTime = CMTimeMake(frameTime, 3000);
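A related pitfall worth checking, as an addition to the answer above: the single memcpy assumes the CVPixelBuffer's row stride equals the DeckLink frame's GetRowBytes(), and Core Video often pads rows. A stride-safe copy sketch:

// Copy row by row, honoring each buffer's own stride.
size_t dstBytesPerRow = CVPixelBufferGetBytesPerRow(buffer);
long srcBytesPerRow = videoFrame->GetRowBytes();
long height = videoFrame->GetHeight();
uint8_t *src = (uint8_t *)videoData;
uint8_t *dst = (uint8_t *)CVPixelBufferGetBaseAddress(buffer);
for (long row = 0; row < height; ++row) {
    memcpy(dst + row * dstBytesPerRow,
           src + row * srcBytesPerRow,
           MIN((long)dstBytesPerRow, srcBytesPerRow));
}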

AVAudioPlayer not playing recording made with AVAudioRecorder

This may be a really simple problem, but I've made a recording using AVAudioRecorder and then stopped the recorder. After stopping the AVAudioRecorder, I press another button to play the recording, but it doesn't play. The file exists, I can play it on my computer, even the code knows it exists, there is no error, but it refuses to play. What could be the issue?
NSError *error = nil;
if ([[NSFileManager defaultManager] fileExistsAtPath:[self.recorder.url path]]) {
self.recordingPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:self.recorder.url error:&error];
self.recordingPlayer.delegate = self;
if (error) {
NSLog(#"error: %#", [error localizedDescription]);
} else {
[self.recordingPlayer play];
}
} else {
NSLog(#"Recording file doesn't exist");
}
EDIT: I just tried it on my device and it works fine; it plays the recording. It just doesn't work on the iOS simulator.
The problem was with my record settings: I had AVNumberOfChannelsKey set to 2. For some reason the iOS simulator didn't like this, but the iPhone was fine with it. Either way, my recording shouldn't have had 2 channels in the first place, so it's a good thing I spotted it.
NSDictionary *recordSettings = [NSDictionary dictionaryWithObjectsAndKeys:
[NSNumber numberWithInt:AVAudioQualityMax], AVEncoderAudioQualityKey,
[NSNumber numberWithInt:16], AVEncoderBitRateKey,
[NSNumber numberWithInt:1], AVNumberOfChannelsKey,
[NSNumber numberWithFloat:44100.0], AVSampleRateKey,
nil];
Try the code at the link below; with it you can record audio and retrieve it from the Documents folder.
Record audio file and save locally on iPhone
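As a small sketch of what that linked approach looks like, recording into the Documents folder with the recordSettings above (the file name is an assumption):

// Build a destination URL inside the app's Documents directory.
NSString *docs = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                      NSUserDomainMask, YES) objectAtIndex:0];
NSURL *recordingURL = [NSURL fileURLWithPath:
    [docs stringByAppendingPathComponent:@"recording.caf"]];
AVAudioRecorder *recorder = [[AVAudioRecorder alloc] initWithURL:recordingURL
                                                        settings:recordSettings
                                                           error:NULL];
[recorder prepareToRecord];
[recorder record];
// Later, play it back from the same URL with AVAudioPlayer.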

iPad iOS4 dictionary count returns 0

I am checking that the number of entries in my dictionary is not 0 using NSDictionary's count. It works and returns the correct number except on the iPad 4.3 Simulator and on iPads running iOS 4.3.
Is this a known bug in iOS 4 or am I seeing a side effect of something else I am doing which is iOS 4 incompatible?
Edit:
Thank you for your comments so far! I am happy to believe that it's my code; I'm new to this. Here is a greatly simplified version of my code.
-(NSDictionary *)dictionaryOfSets
{
if (!_dictionaryOfSets)
{
NSOrderedSet* set1 = [[NSOrderedSet alloc] initWithObjects:
[NSNumber numberWithInt:(1)],
[NSNumber numberWithInt:(2)],
[NSNumber numberWithInt:(3)],
[NSNumber numberWithInt:(4)],
nil];
NSOrderedSet* set2 = [[NSOrderedSet alloc] initWithObjects:
[NSNumber numberWithInt:(9)],
[NSNumber numberWithInt:(10)],
[NSNumber numberWithInt:(11)],
nil];
_dictionaryOfSets = [[NSDictionary alloc] initWithObjectsAndKeys:
set1, [NSNumber numberWithInt:(1)],
set2, [NSNumber numberWithInt:(2)],
nil];
[set1 release];
[set2 release];
}
return _dictionaryOfSets;
}
I would double-check that the dictionary isn't nil. Messaging nil in that case, [dict count], would do just that: return 0.
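To illustrate that behavior with a minimal snippet:

NSDictionary *dict = nil;
// Messaging nil returns 0 for methods returning integral types; no exception is thrown.
NSLog(@"count = %lu", (unsigned long)[dict count]);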
After a few weeks of searching and tests I have concluded that NSOrderedSet does not work under iOS 4.3. The same code works under iOS 5.0 and 5.1. I have replaced the NSOrderedSet with NSSet and it now works under iOS 4.3. I am surprised that I haven't found documentation covering that.
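For what it's worth, this is documented: NSOrderedSet is available starting with iOS 5.0, so the class simply doesn't exist at runtime on iOS 4.3. A runtime guard in the style of the era, as a sketch:

// NSClassFromString returns nil when the class is absent (e.g. on iOS 4.x).
if (NSClassFromString(@"NSOrderedSet") != nil) {
    // Safe to build the NSOrderedSet-backed dictionary here.
} else {
    // Fall back to NSSet/NSArray, as described above.
}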