Recording only AUDIO using AVCaptureSession - objective-c

I would like to record audio using AVCaptureSession, and audio only (without video). I followed the RosyWriter sample (removing all the code concerning video), but when I create my AVAssetWriter for audio I get an error on this line:
assetWriter = [[AVAssetWriter alloc] initWithURL:captureURL fileType:(NSString *)kUTTypeAudio error:&error];
The error is: Invalid file type UTI...
captureURL is created as follows:
NSURL *captureURL = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@%@", NSTemporaryDirectory(), @"captureTest.mp3"]];
Do you know how to fix it? I tried kUTTypeMP3 and kUTTypeMPEG4Audio, but nothing changes.
I'm capturing only audio because I need to run an AVCaptureSession and separate the audio from the video so I can send them to a server separately.
I'm interested in any sample code that can help.
Thanks for your help.

kUTTypeAudio is too vague; try something concrete like AVFileTypeWAVE or AVFileTypeAppleM4A.
P.S. iOS doesn't have an MP3 encoder (it does have a decoder), so an .mp3 output file won't work anyway.
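For reference, here's a minimal sketch of an audio-only AVAssetWriter set up with AVFileTypeAppleM4A and AAC output settings (the .m4a file name, bit rate, and channel count below are illustrative, not from the original code):
NSString *outputPath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"captureTest.m4a"];
NSURL *captureURL = [NSURL fileURLWithPath:outputPath];
NSError *error = nil;
AVAssetWriter *assetWriter = [[AVAssetWriter alloc] initWithURL:captureURL fileType:AVFileTypeAppleM4A error:&error];
// Mono AAC at 64 kbps; adjust to taste.
NSDictionary *audioSettings = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:kAudioFormatMPEG4AAC], AVFormatIDKey,
    [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
    [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
    [NSNumber numberWithInt:64000], AVEncoderBitRateKey,
    nil];
AVAssetWriterInput *audioInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio outputSettings:audioSettings];
audioInput.expectsMediaDataInRealTime = YES; // required when feeding from a capture session
if ([assetWriter canAddInput:audioInput]) {
    [assetWriter addInput:audioInput];
}
// After -startWriting and -startSessionAtSourceTime:, append the CMSampleBuffers from the
// AVCaptureAudioDataOutput delegate callback to audioInput with -appendSampleBuffer:.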

How to stream a video from an https url in objective-c

I'm trying to make a screensaver for Mac that streams a simple mp4 video from a backend server. I'm not used to coding for Mac or in Objective-C, so I'm pretty lost. So far I've figured out how to load a video locally, but when I try to swap the file URL for an HTTPS URL it doesn't seem to work. This is how I'm loading the local mp4 file right now:
NSURL *url = [[NSBundle bundleForClass:MainScript.class] URLForResource: @"video" withExtension: @"mp4"];
AVAsset *asset = [AVAsset assetWithURL: url];
NSArray *assetKeys = @[@"playable", @"hasProtectedContent"];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset: asset automaticallyLoadedAssetKeys:assetKeys];
NSKeyValueObservingOptions options = NSKeyValueObservingOptionOld | NSKeyValueObservingOptionNew;
VideoPlayer *player = [VideoPlayer playerWithPlayerItem: playerItem];
[playerItem addObserver: player forKeyPath: @"status" options: options context: PlayerItemContext];
player.volume = 0;
As I said before, I'm not used to coding in Objective-C, so I'm not 100 percent certain that this is the code that loads the local video, but I'm pretty sure it is.
What I want to do from here is replace the file URL @"video" with an HTTPS URL like @"https://example.com/video". Everything is working on the backend server, so my problem is only how to load the video in a Mac screensaver.
As I understand it, I'm using AVPlayerItem to load the video right now. I don't know if there is a better way to do it, but if there is, please tell me :)
Try using VLCKit; I'm pretty sure it can do HTTPS out of the box.
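For what it's worth, AVFoundation itself can also open an HTTPS URL. A minimal sketch of the swap described in the question (the URL below is a placeholder, not the asker's real backend):
NSURL *url = [NSURL URLWithString:@"https://example.com/video.mp4"];
AVAsset *asset = [AVAsset assetWithURL:url];
NSArray *assetKeys = @[@"playable", @"hasProtectedContent"];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:asset automaticallyLoadedAssetKeys:assetKeys];
AVPlayer *player = [AVPlayer playerWithPlayerItem:playerItem];
[player play]; // the observer setup from the question stays the same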

How to load the next file while current one is playing through HTTP Live Streaming?

I'm working on an app in which I need to stream a large collection of audio files ranging from 5 to 15 seconds each.
I would like to reduce the load time between the files as much as possible.
Main Question:
Is there a way to start buffering the next file (through HLS) while the current one is playing the last segment?
Is AVQueuePlayer an appropriate solution for this on the iOS side?
A detailed explanation will be much appreciated, since I am new to both HTTP Live Streaming and AVFoundation.
Related Question:
How do radio apps stream their audio with no lag between the songs?
Yes, AVQueuePlayer is an appropriate solution for playing a sequence of audio files streamed from the internet over HTTP.
I've been using AVQueuePlayer for quite a while now with excellent results and no lag between songs. Here is a simple example of how to use AVQueuePlayer:
NSURL *url1 = [NSURL URLWithString:[urlsToPlay objectAtIndex:0]];
NSURL *url2 = [NSURL URLWithString:[urlsToPlay objectAtIndex:1]];
NSURL *url3 = [NSURL URLWithString:[urlsToPlay objectAtIndex:2]];
self.item1 = [[AVPlayerItem alloc] initWithURL:url1];
self.item2 = [[AVPlayerItem alloc] initWithURL:url2];
self.item3 = [[AVPlayerItem alloc] initWithURL:url3];
self.radioPlayerURLs = [[NSArray alloc] initWithObjects:self.item1,self.item2, self.item3, nil];
self.onDemandPlayer = [AVQueuePlayer queuePlayerWithItems:self.radioPlayerURLs];
[self.onDemandPlayer play];
For more details, please consult the Apple documentation:
AVFoundation
AVQueuePlayer
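AVQueuePlayer generally starts loading the next enqueued item before the current one finishes, which is what gives the gapless feel. If the playlist is longer than a few items, one approach is to top the queue up as items finish. A rough sketch (the nextTrackURL helper is hypothetical):
[[NSNotificationCenter defaultCenter] addObserverForName:AVPlayerItemDidPlayToEndTimeNotification
                                                  object:nil
                                                   queue:[NSOperationQueue mainQueue]
                                              usingBlock:^(NSNotification *note) {
    NSURL *nextURL = [self nextTrackURL]; // hypothetical: returns the next URL to enqueue, or nil
    if (nextURL) {
        AVPlayerItem *nextItem = [AVPlayerItem playerItemWithURL:nextURL];
        // Passing nil for afterItem appends the item to the end of the queue.
        if ([self.onDemandPlayer canInsertItem:nextItem afterItem:nil]) {
            [self.onDemandPlayer insertItem:nextItem afterItem:nil];
        }
    }
}];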

How to deal with AVAudioPlayer correctly in objective-c

I'm working on a little game, and I've got a problem with the background music. I use AVAudioPlayer to play looping music. It looks like this:
NSString *path = [[NSBundle mainBundle] pathForResource:@"background" ofType:@"ogg"];
NSError *error;
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL URLWithString:path] error:&error];
NSLog(@"%@", [error description]);
[player setNumberOfLoops:-1];
player.delegate = self;
[player play];
In Supporting Files I've got background.ogg, background.mp3, and background.wav, and none of them plays. What is wrong with it?
And when I use NSLog to print the error description I get:
Error Domain=NSOSStatusErrorDomain Code=1954115647 "The operation couldn’t be completed. (OSStatus error 1954115647.)"
Please help.
Sometimes it's just because your audio file's properties don't match what's needed, such as the sampling rate. Please check whether your audio's sampling rate is lower than 24 kHz.
I have tested some audio files, and I found that when the sampling rate is 44 kHz or higher it works great, but when it is between 22 kHz and 44 kHz, AudioToolbox can't play the sounds at first while AVAudioPlayer sometimes works. And when it is lower than 22 kHz, neither of them can play the files. (The file formats I tested were only m4a, mp3, and wav.)
By the way, if you want to use iTunes to convert the low-sampling-rate audio to a 44 kHz m4a file, that doesn't work either. Don't waste your time on it.
That was easy. There is one more thing: just as you must invalidate an NSTimer somewhere when you use one, when you use an AVAudioPlayer you have to stop it somewhere:
[player stop];
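For completeness, a sketch that avoids two other common causes of that error (OSStatus 1954115647 is the FourCC 'typ?', i.e. kAudioFileUnsupportedFileTypeError): Ogg Vorbis is not a format iOS decodes natively, and a local path should be wrapped with fileURLWithPath: rather than URLWithString:. The background.m4a name is just illustrative:
NSString *path = [[NSBundle mainBundle] pathForResource:@"background" ofType:@"m4a"];
NSError *error = nil;
AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:&error];
if (!player) {
    NSLog(@"%@", error);
} else {
    player.numberOfLoops = -1; // loop indefinitely
    player.delegate = self;
    [player prepareToPlay];
    [player play];
}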

AVAsset has no tracks or duration when created from an ALAsset URL

I'm pulling all of the video assets from ALAssetsLibrary (Basically everything that's being recorded from the native camera app). I am then running an enumeration on each video asset that does this to each video:
// The end of the enumeration is signaled by asset == nil.
if (alAsset) {
    // Get the URL location of the video
    ALAssetRepresentation *representation = [alAsset defaultRepresentation];
    NSURL *url = [representation url];
    // Create an AVAsset from the given URL
    NSDictionary *asset_options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES] forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
    AVAsset *avAsset = [[AVURLAsset alloc] initWithURL:url options:asset_options];//[AVURLAsset URLAssetWithURL:url options:asset_options];
    // Here is the problem
    NSLog([NSString stringWithFormat:@"%i", [avAsset.tracks count]]);
    NSLog([NSString stringWithFormat:@"%f", CMTimeGetSeconds(avAsset.duration)]);
}
NSLog is reporting that the AVAsset that I've gotten from my ALAsset has 0 tracks, and has a duration of 0.0 seconds. I checked the url, and it's "assets-library://asset/asset.MOV?id=9F482CF8-B4F6-40C2-A687-0D05F5F25529&ext=MOV" which seems correct. I know alAsset is actually a video, and the correct video, because I've displayed alAsset.thumbnail, and it's shown the correct thumbnail for the video.
All this leads me to believe there's something going wrong in the initialization for avAsset, but for the life of me, I can't figure out what's going wrong. Can anyone help me?
Update:
I think I've confirmed that the url being given to me by ALAssetRepresentation is faulty, which is weird because it gives me the correct thumbnail. I added this code:
NSLog([NSString stringWithFormat:@"%i", [url checkResourceIsReachableAndReturnError:&error]]);
NSLog([NSString stringWithFormat:@"%@", error]);
It gives me this:
0
Error Domain=NSCocoaErrorDomain Code=4 "The operation couldn’t be completed. (Cocoa error 4.)" UserInfo=0x19df60 {}
I'm still not sure what would cause that. The only thing I'm noticing is that the url, which is "assets-library://asset/asset.MOV?id=9F482CF8-B4F6-40C2-A687-0D05F5F25529&ext=MOV", is different from what I've seen as I've been searching around for this. The ones I've seen elsewhere look more like "assets-library://asset/asset.MOV?id=1000000394&ext=MOV", with a number instead of an alphanumeric, dash-separated name.
If it helps, I'm using XCode 4.2 Beta, and iOS5. Please let me know if you can think of anything. Thanks.
Okay, it looks like it was a bug in the iOS 5 beta 1. I upgraded to the newest beta and it worked. Thanks to those who took a look at my question.
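Unrelated to the beta bug, but worth noting for anyone who hits similar symptoms: the documented pattern is to load the keys you need asynchronously before reading them, rather than touching tracks and duration immediately after creating the asset. A sketch:
NSArray *keys = [NSArray arrayWithObjects:@"tracks", @"duration", nil];
[avAsset loadValuesAsynchronouslyForKeys:keys completionHandler:^{
    NSError *loadError = nil;
    if ([avAsset statusOfValueForKey:@"tracks" error:&loadError] == AVKeyValueStatusLoaded) {
        NSLog(@"%lu tracks, %f seconds", (unsigned long)[avAsset.tracks count], CMTimeGetSeconds(avAsset.duration));
    } else {
        NSLog(@"tracks failed to load: %@", loadError);
    }
}];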

How can I handle separate tracks of an AVURLAsset independently?

Here's my goal: I would like to load a .3gp movie file into an AVURLAsset. I would then like to take the video track and pump the output frames into an OpenGL ES texture. This will be the video playback. I would then like to continue leveraging AVFoundation to play back the audio. The framework is pretty vast, so I'm hoping for some veteran assistance on this one.
I actually have both parts working separately, but something always goes wrong when I try to do both at the same time. Here's my current attempt, in a nutshell (All error handling is omitted for brevity):
I load the .3gp file into the AVURLAsset and load the tracks:
NSURL* fileURL = [[NSBundle mainBundle] URLForResource:someName withExtension:someExtension];
AVURLAsset* asset = [AVURLAsset URLAssetWithURL:fileURL options:nil];
[asset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:@"tracks"] completionHandler:^ {/* More Code */}];
In the completion handler, I get a reference to the audio and video track:
// Tracks loaded, grab the audio and video tracks.
AVAssetTrack* videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
AVAssetTrack* audioTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];
Next, I create separate AVMutableCompositions that contain just the audio track and just the video track. I'm not sure if this is completely necessary, but it seems like a good idea and it does also seem to work:
// Make a composition with the video track.
AVMutableComposition* videoComposition = [AVMutableComposition composition];
AVMutableCompositionTrack* videoCompositionTrack = [videoComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[videoCompositionTrack insertTimeRange:[videoTrack timeRange] ofTrack:videoTrack atTime:CMTimeMake(0, 1) error:nil];
// Make a composition with the audio track.
AVMutableComposition* audioComposition = [AVMutableComposition composition];
AVMutableCompositionTrack* audioCompositionTrack = [audioComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
[audioCompositionTrack insertTimeRange:[audioTrack timeRange] ofTrack:audioTrack atTime:CMTimeMake(0, 1) error:nil];
Now I get into the specifics of how to handle each track. I'm fairly confident that I have the one-and-only way of handling the video track, which is to create an AVAssetReader for the video composition and add an AVAssetReaderTrackOutput created with the video composition track. By keeping a reference to that track output, I can call its -copyNextSampleBuffer method to get the info I need to pump the video output into an OpenGL ES texture. This works well enough by itself:
// Create Asset Reader and Output for the video track.
NSDictionary* settings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];
_assetReader = [[AVAssetReader assetReaderWithAsset:vComposition error:nil] retain];
_videoTrackOutput = [[AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:vCompositionTrack outputSettings:settings] retain];
[_assetReader addOutput:_videoTrackOutput];
[_assetReader startReading];
What seems to spoil the whole thing is attempting to play back the audio in any way. I'm not really sure which approach to take for the remaining audio track. Just sticking to the realm of AVFoundation, I see two possible approaches. The first is to use an AVPlayer to play the audio composition:
// Create a player for the audio.
AVPlayerItem* audioPlayerItem = [AVPlayerItem playerItemWithAsset:aComposition];
AVPlayer* audioPlayer = [[AVPlayer playerWithPlayerItem:audioPlayerItem] retain];
[audioPlayer play];
This works, inasmuch as I can hear the desired audio. Unfortunately creating this player guarantees that the AVAssetReaderTrackOutput for the video composition fails with a cryptic error when calling -copyNextSampleBuffer:
AVAssetReaderStatusFailed
Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed"
UserInfo=0x456e50 {NSLocalizedFailureReason=An unknown error occurred (-12785),
NSUnderlyingError=0x486570 "The operation couldn’t be completed. (OSStatus error -12785.)",
NSLocalizedDescription=The operation could not be completed}
I'm confused about how they might be interfering with each other, but regardless, that approach seems to be a dead end.
The other option I considered for the audio playback was the AVAudioPlayer class, but I could not get it to work with an AVAsset as a starting point. I attempted to use its -initWithData:error: method with an NSData built by aggregating the contents of CMSampleBufferRefs taken with an approach identical to the one I use on the video track, but it does not appear to be formatted correctly.
At this point, I feel like I'm flailing around blindly, and would love it so very much if someone could tell me if this approach is even feasible. If it's not I would, of course, appreciate a feasible one.
Creating AVMutableCompositions (basically new AVAssets) for each track seems roundabout to me; I'd simply use an AVAssetReader on the audio track. Also, your videoComposition doesn't seem to be used anywhere, so why create it?
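A rough sketch of that suggestion: one AVAssetReader on the original asset, with a separate track output per track (variable names reuse the ones from the question; the linear-PCM setting for the audio output is an assumption):
AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:nil];
NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_32BGRA] forKey:(NSString *)kCVPixelBufferPixelFormatTypeKey];
AVAssetReaderTrackOutput *videoOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:videoSettings];
NSDictionary *audioSettings = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey];
AVAssetReaderTrackOutput *audioOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:audioTrack outputSettings:audioSettings];
if ([reader canAddOutput:videoOutput]) [reader addOutput:videoOutput];
if ([reader canAddOutput:audioOutput]) [reader addOutput:audioOutput];
[reader startReading];
// Pull CMSampleBufferRefs from each output with -copyNextSampleBuffer; the audio buffers
// can then be fed to an Audio Queue or Audio Unit for playback.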
In any case, to get either solution to work, set your audio session category to kAudioSessionCategory_MediaPlayback and enable kAudioSessionProperty_OverrideCategoryMixWithOthers.
I've never found any documentation that explains why this is necessary.
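A minimal sketch of that setting, using the C Audio Session API that those constants belong to (this API has since been superseded by AVAudioSession, but it matches the constants named above):
#import <AudioToolbox/AudioToolbox.h>

AudioSessionInitialize(NULL, NULL, NULL, NULL);

UInt32 category = kAudioSessionCategory_MediaPlayback;
AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(category), &category);

// Override the MediaPlayback category so this app's audio mixes with other audio.
UInt32 mixWithOthers = 1;
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryMixWithOthers, sizeof(mixWithOthers), &mixWithOthers);

AudioSessionSetActive(true);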