Playback buffer of audio values - iOS - Objective-C

I'm trying to playback an NSArray of sample audio values that I have created in Objective-C. I've generated the values with no problem, but how do I go about playing them back through an iPhone?
Thanks

I would suggest converting the NSArray to an NSData object, then using the AVAudioPlayer method initWithData:error: (see here) to load it as playable audio. AVAudioPlayer has the advantage of being extremely simple to use compared with the Audio Queue and Audio Unit APIs.
How you convert the NSArray to NSData depends on the type of your samples (this SO post might give an idea of how it could be done, although the NSKeyedArchiver archiving process may alter your samples). If your sample-generation process allows, I would suggest writing your samples directly into an NSData object and skipping the conversion altogether.

Copy the values out of the NSArray into a C array of appropriately scaled PCM samples. You can then convert that array to a WAV file by prepending a RIFF header and play the file with an AVAudioPlayer, or you can feed the PCM samples directly into the C array buffers of an Audio Queue or the RemoteIO Audio Unit from their audio callbacks.
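The "prepend a RIFF header" step is mechanical enough to sketch in C. This is a minimal writer, assuming 16-bit mono little-endian PCM (iOS devices are little-endian, so writing the header fields byte-for-byte as below is safe there); `write_wav` is an illustrative name, not an Apple API:

```c
#include <stdint.h>
#include <stdio.h>

/* Write a minimal 44-byte RIFF/WAVE header for 16-bit mono PCM,
   followed by the raw samples. Assumes a little-endian host. */
static void write_wav(FILE *f, const int16_t *samples, uint32_t count,
                      uint32_t sample_rate)
{
    uint32_t data_bytes   = count * (uint32_t)sizeof(int16_t);
    uint32_t chunk_size   = 36 + data_bytes;   /* RIFF chunk size */
    uint32_t fmt_size     = 16;                /* PCM "fmt " chunk */
    uint16_t audio_format = 1;                 /* 1 = uncompressed PCM */
    uint16_t channels     = 1;
    uint16_t bits         = 16;
    uint32_t byte_rate    = sample_rate * channels * bits / 8;
    uint16_t block_align  = channels * bits / 8;

    fwrite("RIFF", 1, 4, f);
    fwrite(&chunk_size, 4, 1, f);
    fwrite("WAVE", 1, 4, f);
    fwrite("fmt ", 1, 4, f);
    fwrite(&fmt_size, 4, 1, f);
    fwrite(&audio_format, 2, 1, f);
    fwrite(&channels, 2, 1, f);
    fwrite(&sample_rate, 4, 1, f);
    fwrite(&byte_rate, 4, 1, f);
    fwrite(&block_align, 2, 1, f);
    fwrite(&bits, 2, 1, f);
    fwrite("data", 1, 4, f);
    fwrite(&data_bytes, 4, 1, f);
    fwrite(samples, sizeof(int16_t), count, f);
}
```

Write the result to a file in your app's Documents or tmp directory and hand its URL to AVAudioPlayer's initWithContentsOfURL:error:.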

Related

How to Get Audio sample data from mp3 using NAudio

I want to read an MP3 file into one large array of audio samples.
I want the audio samples to be floats.
NAudio.Wave.WaveStream pcm=NAudio.Wave.WaveFormatConversionStream.CreatePcmStream(new NAudio.Wave.Mp3FileReader(OFD.FileName));
So far I get the PCM stream and can play it back fine, but I don't know how to read the raw sample data out of the stream.
Use AudioFileReader. This implements ISampleProvider so the Read method allows you to read directly into a float array of samples.
Alternatively use the ToSampleProvider method after your Mp3FileReader. You don't need to use WaveFormatConversionStream, since Mp3FileReader (and AudioFileReader) already decompress the MP3 frames.

Looping AVAudioPlayer w/o Gap

I'm recording a sound using AVAudioRecorder and then attempting to play it back using AVAudioPlayer. I'm trying to get the sound to loop indefinitely, but there is a short gap between loops. I've tried having AVAudioRecorder record to every available file type, yet I can't find one that allows seamless looping. Thanks.
This is a great post that helped me eliminate the gaps in my AVAudioPlayer loop: http://forums.macrumors.com/showthread.php?t=640862
The gist of the post is that compressed audio formats pad the sound with silence so its length is an even multiple of 1024 samples. Using uncompressed audio, or audio specifically exported for the purpose of looping, will eliminate the glitch.

Convert from AIFF to AAC using Apple API only

I am creating a movie file using QTMovie from QTKit and everything's working nicely. The only problem is that the audio in the resulting MOV file is raw AIFF, so the file is larger than I'd like. I've seen plenty about third-party libraries capable of encoding to AAC, but are there any Apple APIs I can call to do this job? I don't mind converting the AIFF to AAC before adding it to my QTMovie, or having the encoding done as part of writing the QTMovie to disk.
This was actually easy to achieve using QTKit. I just needed to set QTMovieExportType to 'mpg4' and QTMovieExport to YES in the attributes passed to writeToFile:withAttributes:.

Objective-C play sound

I know how to play MP3 files and whatnot on iOS. But how do I play a certain frequency, like if I just wanted to emit a C# note for 25 seconds? (The synth isn't as important to me as just the pitch of the note.)
You need to generate the PCM audio waveform that corresponds to the note you want to play and store that into a sample buffer in memory. Then you send that buffer to the audio hardware.
Here is a tutorial on generating waveforms of several types. The article goes into some detail on the many aspects of a note you need to consider: frequency, volume, waveform shape, sampling rate, and so on. The article comes with Flash source code, but you should have no problem adapting the concepts to iOS.
If you also need a library that you can use to play the generated buffers on iOS, then I recommend the open source Finch.
I hope this helps!
You can synthesize waveforms of your desired frequency and feed them to the callbacks of either the Audio Queue or the RemoteIO Audio Unit API.
Here is a short tutorial on some of the code needed to create sine wave tones for iOS in C.

Solving a producer-consumer problem with NSData (for audio streaming)

I am using AVAssetReader to copy PCM data from an iPod track to a buffer, which is then played with a RemoteIO audio unit. I am trying to create a separate thread for loading sound data, so that I can access and play data from the buffer while it is being loaded into.
I currently have a large NSMutableData object that eventually holds the entire song's data. Currently, I load audio data in a separate thread using NSOperation like so:
AVAssetReaderOutput copies, at most, 8192 bytes at a time to a CMBlockBuffer
Copy these bytes to a NSData object
Append this NSData object to a larger NSMutableData object (which eventually holds the entire song)
When finished, play the song by accessing each packet in the NSMutableData object
I'm trying to be able to play the song WHILE copying these bytes. I am unsure of a good way to write to and read from the same buffer at the same time.
A short idea I had:
Create and fill 3 NSData objects, each 8192 bytes in length, as buffers.
Start playing. When I have finished playing the first buffer, load new data into the first buffer.
When I have finished playing the second buffer, load new data into the second; same for the third.
Start playing from the first buffer again, fill the third. And so on.
Or, create one NSData object that holds 3 * 8192 PCM units, and somehow write to and read from it at the same time with two different threads.
I have my code working on two different threads right now. I append data to the array until I press play, at which point loading stops (probably because the thread is blocked, but I don't know right now), and playback continues until it reaches the end of whatever I loaded, then crashes with EXC_BAD_ACCESS.
In short, I want to find the right way to play PCM data while it is being copied, say, 8192 bytes at a time. I will probably have to do so with another thread (I am using NSOperation right now), but am unclear on how to write to and read from a buffer at the same time, preferably using some higher level Objective-C methods.
I'm doing this exact thing. You will definitely need to play your audio on a different thread (I am doing this with RemoteIO). You will also need to use a circular buffer. You probably want to look up this data structure if you aren't familiar with it as you will be using it a lot for this type of operation. My general setup is as follows:
LoadTrackThread starts up and starts loading data from AVAssetReader and storing it in a file as PCM.
LoadPCMThread starts up once enough data is loaded into my PCM file and essentially loads that file into local memory for my RemoteIO thread on demand. It does this by feeding this data into a circular buffer whenever my RemoteIO thread gets even remotely close to running out of samples.
RemoteIO playback callback thread consumes the circular buffer frames and feeds them to the RemoteIO interface. It also informs LoadPCMThread to wake up when it needs to start loading more samples.
This should be about all you need as far as threads go. You will need some sort of mutex or semaphore between the two threads to ensure you aren't trying to read the file while you are writing into it (that's a race and will cause crashes). I just have both my threads set a boolean and sleep for a while until it is unset. There is probably a more sophisticated way of doing this, but it works for my purposes.
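The circular buffer at the heart of this setup can be sketched as a lock-free single-producer/single-consumer ring in C11 — one thread (the loader) only ever advances `tail`, and the other (the RemoteIO callback) only ever advances `head`, so no mutex is needed for the buffer itself. The names and the 8192-sample capacity below are illustrative, and the boolean-plus-sleep approach from the answer works too:

```c
#include <stdatomic.h>
#include <stddef.h>
#include <stdint.h>

#define RING_CAPACITY 8192u            /* must be a power of two */
#define RING_MASK (RING_CAPACITY - 1)

/* SPSC ring of PCM samples. head/tail are free-running counters;
   indices are reduced modulo the capacity with RING_MASK. */
typedef struct {
    int16_t data[RING_CAPACITY];
    atomic_size_t head;                /* total samples consumed so far */
    atomic_size_t tail;                /* total samples produced so far */
} Ring;

/* Producer (loader thread): copy up to n samples in; returns count written. */
static size_t ring_write(Ring *r, const int16_t *src, size_t n)
{
    size_t head = atomic_load_explicit(&r->head, memory_order_acquire);
    size_t tail = atomic_load_explicit(&r->tail, memory_order_relaxed);
    size_t free_slots = RING_CAPACITY - (tail - head);
    if (n > free_slots) n = free_slots;
    for (size_t i = 0; i < n; i++)
        r->data[(tail + i) & RING_MASK] = src[i];
    atomic_store_explicit(&r->tail, tail + n, memory_order_release);
    return n;
}

/* Consumer (audio callback): copy up to n samples out; returns count read. */
static size_t ring_read(Ring *r, int16_t *dst, size_t n)
{
    size_t tail = atomic_load_explicit(&r->tail, memory_order_acquire);
    size_t head = atomic_load_explicit(&r->head, memory_order_relaxed);
    size_t avail = tail - head;
    if (n > avail) n = avail;
    for (size_t i = 0; i < n; i++)
        dst[i] = r->data[(head + i) & RING_MASK];
    atomic_store_explicit(&r->head, head + n, memory_order_release);
    return n;
}
```

Because `ring_read` never blocks — it just returns fewer samples than asked for when the buffer runs dry — it is safe to call from the real-time RemoteIO callback, where taking a lock or sleeping would cause audible dropouts; the callback can output silence for the shortfall and signal the loader thread to refill.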
Hope that helps!