This application currently plays audio from online stations.
It basically has two main features:
Play:
Tapping a station name or the play button starts the FM station playing.
Record:
Tapping the record link starts the recorder. Tapping it again stops recording and replays the recorded audio.
The problem
The streamed audio is not being recorded clearly.
The current recorder uses AVAudioRecorder; although it does record the audio, the result is noisy.
Could this be because the streaming audio is played through an AudioQueue, while the recorder is an AVAudioRecorder, which records from the microphone? We want to record only the streamed content.
Note: AVAudioRecorder is clear when used for voice recording, but not good at capturing streamed audio content.
For playing streaming audio, I used code from Matt Gallagher; the link is
here
Can you please suggest a better way to record streaming audio?
Is there an existing API like AVAudioRecorder for this, or am I doing something wrong?
Please look at this framework: StreamingKit. It provides access to the audio data while streaming, which you can use for recording.
Play an MP3 over HTTP
STKAudioPlayer *audioPlayer = [[STKAudioPlayer alloc] init];
[audioPlayer play:@"http://www.abstractpath.com/files/audiosamples/sample.mp3"];
You can then append its data to an NSMutableData, to play it back offline later, by using this frame filter.
Intercept PCM data just before it's played:
[audioPlayer appendFrameFilterWithName:@"MyCustomFilter" block:^(UInt32 channelsPerFrame, UInt32 bytesPerFrame, UInt32 frameCount, void* frames)
{
    ...
}];
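For example, here is a minimal sketch of collecting the intercepted PCM into an NSMutableData so that only the streamed content is captured. It assumes the STKAudioPlayer API shown above; the class, property, and filter names are illustrative, not from the question.

#import <StreamingKit/STKAudioPlayer.h>

@interface StreamRecorder : NSObject
@property (nonatomic, strong) STKAudioPlayer *audioPlayer;
@property (nonatomic, strong) NSMutableData *recordedPCM;
@end

@implementation StreamRecorder

- (void)startRecordingStream:(NSString *)urlString
{
    self.audioPlayer = [[STKAudioPlayer alloc] init];
    self.recordedPCM = [NSMutableData data];

    __weak typeof(self) weakSelf = self;

    // The filter receives interleaved PCM just before it is played, so only the
    // streamed content is captured; the microphone is never involved.
    [self.audioPlayer appendFrameFilterWithName:@"RecordFilter"
                                          block:^(UInt32 channelsPerFrame,
                                                  UInt32 bytesPerFrame,
                                                  UInt32 frameCount,
                                                  void *frames)
    {
        // Note: this block runs on the audio thread, so keep the work minimal.
        [weakSelf.recordedPCM appendBytes:frames
                                   length:frameCount * bytesPerFrame];
    }];

    [self.audioPlayer play:urlString];
}

@end

The collected buffer is raw PCM, so to export or replay it later you would still need to wrap it in a container (for example with ExtAudioFile or AVAssetWriter).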
Related
I'm building on top of the Kickflip.io code base (see the code base, which is on GitHub), but after about 10+ seconds the video gets ahead of the audio. I was able to verify that the audio plays at the correct rate, so it's definitely the video.
I've tried lowering the video's frame rate to 10 fps, and this does nothing. Based on all the other audio-out-of-sync-with-video reports out there for FFmpeg, I'm starting to wonder if there's something in the FFmpeg CocoaPod.
My guess is that the issue is in:
https://github.com/Kickflip/kickflip-ios-sdk/blob/master/Kickflip/Outputs/Muxers/HLS/KFHLSWriter.m
I want to be able to monitor audio on headphones before and during the capture of video.
I have an AVCaptureSession set up to capture video and audio.
My idea is to hook an AVCaptureAudioDataOutput instance up to the AVCaptureSession for this and process the CMSampleBufferRefs with a class implementing the AVCaptureAudioDataOutputSampleBufferDelegate protocol.
But I am not sure how to route the audio to the headphones from there.
What would be the most straightforward way to do this (highest-level frameworks, general approach)?
I ended up implementing this with an Audio Unit: the Remote I/O audio unit, to be precise.
Apple's aurioTouch sample code provides a clear example of how to do this.
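For reference, here is a minimal pass-through sketch of the Remote I/O unit, modelled loosely on what aurioTouch does: input from bus 1 is rendered straight into the output callback on bus 0, so whatever is captured is heard in the headphones. Error handling and the matching stream-format setup are omitted, and the names are illustrative.

#import <AudioUnit/AudioUnit.h>

static AudioUnit gRemoteIOUnit;

// Output render callback (bus 0): pull the freshly captured input (bus 1)
// straight into the output buffers, so the input is heard in the headphones.
static OSStatus RenderCallback(void *inRefCon,
                               AudioUnitRenderActionFlags *ioActionFlags,
                               const AudioTimeStamp *inTimeStamp,
                               UInt32 inBusNumber,
                               UInt32 inNumberFrames,
                               AudioBufferList *ioData)
{
    return AudioUnitRender(gRemoteIOUnit, ioActionFlags, inTimeStamp,
                           1, inNumberFrames, ioData);
}

static void SetUpMonitoring(void)
{
    AudioComponentDescription desc = {0};
    desc.componentType = kAudioUnitType_Output;
    desc.componentSubType = kAudioUnitSubType_RemoteIO;
    desc.componentManufacturer = kAudioUnitManufacturer_Apple;

    AudioComponent component = AudioComponentFindNext(NULL, &desc);
    AudioComponentInstanceNew(component, &gRemoteIOUnit);

    // Enable input on bus 1 (it is off by default; output on bus 0 is already on).
    UInt32 enable = 1;
    AudioUnitSetProperty(gRemoteIOUnit, kAudioOutputUnitProperty_EnableIO,
                         kAudioUnitScope_Input, 1, &enable, sizeof(enable));

    // Feed the output from the render callback above. In a real app you would
    // also set a matching AudioStreamBasicDescription on both buses.
    AURenderCallbackStruct callback = { RenderCallback, NULL };
    AudioUnitSetProperty(gRemoteIOUnit, kAudioUnitProperty_SetRenderCallback,
                         kAudioUnitScope_Input, 0, &callback, sizeof(callback));

    AudioUnitInitialize(gRemoteIOUnit);
    AudioOutputUnitStart(gRemoteIOUnit);
}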
In my iPad app, I create an MPMoviePlayerController that plays an MP4 off of a website. In the MP4 there are people talking. In the app, I have noticed that the audio and video are out of sync by perhaps a quarter- or half-second. (I can tell this because when I view the video in a web browser there is no lag.)
The only clue that I have to this is that when the MPMoviePlayerController first loads up, the audio starts playing, but the video doesn't--then the video starts playing and it seems to skip a couple of frames to "catch up" to the audio...but it doesn't quite sync.
Seeing as how this class is a "black box" per the Apple Documentation, and none of the existing methods or properties come across as helpful to this problem, I'm a bit stumped. I may have to leave it how it is with the slight lag, rather than face weird workarounds. I wanted to see if anyone has experienced this before or could suggest a solution.
I'm running on Mountain Lion, latest XCode, iPad 2 with iOS 6.
The code I use to generate the controller is:
-(void)startVideoPlaying {
    if (!self.theMoviePlayer) {
        self.theMoviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:movieURL];
        [self.bgView addSubview:self.theMoviePlayer.view];
        [self.theMoviePlayer.view setFrame:movieContainer.frame]; // player's frame must match parent's
    } else {
        [self.theMoviePlayer setContentURL:movieURL];
    }
    self.theMoviePlayer.shouldAutoplay = YES;
    [self.theMoviePlayer play];
}
Thanks for any help.
After a variety of tests, I must conclude that there is something wrong with the encoding on the MP4s that I have been playing. I am not a video codec guru, but I made the following tests, which tell me this:
1) Downloading the MP4, placing it into the app and loading it into the MPMoviePlayerController via a file URL. Audio still out of sync, so not a connectivity issue.
2) Finding another MP4 on the web (something off Vimeo) and streaming it into the player. Audio synced properly, so there is potentially something wrong with the MP4s I was attempting to play.
3) Downloading the MP4, and using Handbrake to convert it into M4V with a variety of different settings (including the iPad preset). The Audio was synced fine.
Based on this, it seems to me like there's something wrong with the file I was attempting to play rather than the player (or the player can't handle it). Unfortunately, the files I am attempting to play cannot simply be converted, they are part of a large website system and many hundreds of files would have to change, and so on. So, while I have answered my own question, I haven't solved the problem.
Choosing the audio format in which AVFoundation audio samples are captured.
I am familiar with processing video frames coming from the iPhone camera. There, the AVCaptureVideoDataOutput's videoSettings property can be used to specify the format in which the video frames should be received.
For audio, the similar class AVCaptureAudioDataOutput does not have such a property.
However, the AVAudioSettings.h file clearly shows that there exist several audio formats.
How can I choose a format for audio data? I'm basically interested in raw PCM samples with a certain specific bit rate.
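As a point of reference (a sketch, not part of the question): whatever format the output delivers, you can inspect it in the delegate callback from the sample buffer's format description, which describes the PCM you actually receive.

#import <AVFoundation/AVFoundation.h>
#import <CoreMedia/CoreMedia.h>

// In the class adopting AVCaptureAudioDataOutputSampleBufferDelegate:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Read the AudioStreamBasicDescription of the buffers being delivered.
    CMFormatDescriptionRef formatDescription =
        CMSampleBufferGetFormatDescription(sampleBuffer);
    const AudioStreamBasicDescription *asbd =
        CMAudioFormatDescriptionGetStreamBasicDescription(formatDescription);

    NSLog(@"sample rate %.0f Hz, %u channels, %u bits per channel",
          asbd->mSampleRate,
          (unsigned)asbd->mChannelsPerFrame,
          (unsigned)asbd->mBitsPerChannel);
}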
You can try OpenAL. Here is the Documentation. Best regards ;)
I have developed a media player using AVFoundation for iOS. I am using AVPlayer to play audio-video files (e.g. an mp4 file). It seems quite simple to play a file with AVPlayer by directly calling the play and pause APIs.
Now I want to separate the audio and video and play them as individual entities simultaneously. I want to do this because, I may do some editing to the audio or video track, and then play the file.
I can separate the two using AVAssetTracks, but I don't know how to play the tracks. I would also like to play the two tracks simultaneously, so that no AV sync problem occurs.
Please guide me how to achieve this target, i.e. audio and video rendering with no AVSync problem.
Thanks..
The easiest way to achieve this would be to have multiple player items. I would create a player item with all the tracks in their original form (i.e. the mp4 file). Then create another asset using AVMutableComposition (a subclass of AVAsset). This allows you to put only certain tracks into the composition (e.g. only the audio track). When you want to play only the audio, switch the player to the item (via AVPlayer's replaceCurrentItemWithPlayerItem:) whose mutable composition contains only the audio track. When you want to play only the video, switch to the item whose composition contains only the video track. When you want to play both in sync, play the item built from the original asset.
I'm assuming you want to play the edited versions in sync. For that you will have to create another AVMutableComposition, add all of the edited tracks, and then switch to a player item created from that newly built composition.
If all you are trying to do is edit the different tracks, and you never have to play them by themselves, you can do this with a single AVMutableComposition. Just add an audio and a video AVMutableCompositionTrack and edit to your heart's content. They will always play in sync no matter how much you edit them separately (assuming your editing logic is correct). Just make sure you don't try to edit while playing; for that, you must create a copy and edit the copy.
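A rough sketch of the audio-only composition described above, assuming asset is the original AVURLAsset for the mp4 and player is the existing AVPlayer (error handling omitted):

#import <AVFoundation/AVFoundation.h>

AVMutableComposition *audioOnly = [AVMutableComposition composition];
AVMutableCompositionTrack *audioTrack =
    [audioOnly addMutableTrackWithMediaType:AVMediaTypeAudio
                           preferredTrackID:kCMPersistentTrackID_Invalid];

// Copy the source asset's audio track into the composition.
AVAssetTrack *sourceAudio =
    [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                    ofTrack:sourceAudio
                     atTime:kCMTimeZero
                      error:nil];

// When only the audio should play, swap the player over to the audio-only item.
AVPlayerItem *audioItem = [AVPlayerItem playerItemWithAsset:audioOnly];
[player replaceCurrentItemWithPlayerItem:audioItem];

The video-only composition is built the same way with AVMediaTypeVideo, and the player item created from the original asset covers the play-both-in-sync case.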