record live streaming audio - objective-c

I'm making an app that has to play and record streaming audio from the internet on an iPad. The streaming part of the audio is done; I'll be getting to the recording part very soon, and I don't have any idea how to proceed.
Could you give me a hint or an idea? It will have to play while simultaneously recording into AAC or MP3.
Thanks.

You'll need to use the lower-level AudioQueue API for recording, along with the AudioSession API to set up the audio session.
Then you'll need to fill out an AudioStreamBasicDescription struct and create a new input queue with AudioQueueNewInput() and include your callback function for handling input buffers.
And then you'll need to create 3 buffers using AudioQueueAllocateBuffer() and AudioQueueEnqueueBuffer(). And only then will you be ready to call AudioQueueStart(). You should also handle audio session interruptions, and handle stopping the audio queue.
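For illustration, a minimal sketch of that setup (error checking, session configuration, and interruption handling omitted; the sample rate, mono channel count, and buffer size are assumptions):

#import <AudioToolbox/AudioToolbox.h>

static const int kNumberBuffers = 3;

// Called by the queue each time a buffer of recorded PCM is ready.
static void HandleInputBuffer(void *inUserData,
                              AudioQueueRef inAQ,
                              AudioQueueBufferRef inBuffer,
                              const AudioTimeStamp *inStartTime,
                              UInt32 inNumPackets,
                              const AudioStreamPacketDescription *inPacketDesc)
{
    // inBuffer->mAudioData holds uncompressed 16-bit PCM; write it to a
    // file or hand it to an encoder, then re-enqueue the buffer so the
    // queue can keep filling it.
    AudioQueueEnqueueBuffer(inAQ, inBuffer, 0, NULL);
}

void StartRecording(void)
{
    AudioStreamBasicDescription format = {0};
    format.mSampleRate       = 44100.0;              // assumed rate
    format.mFormatID         = kAudioFormatLinearPCM;
    format.mFormatFlags      = kLinearPCMFormatFlagIsSignedInteger
                             | kLinearPCMFormatFlagIsPacked;
    format.mChannelsPerFrame = 1;                    // mono, for simplicity
    format.mBitsPerChannel   = 16;
    format.mBytesPerFrame    = 2;
    format.mFramesPerPacket  = 1;
    format.mBytesPerPacket   = 2;

    AudioQueueRef queue;
    AudioQueueNewInput(&format, HandleInputBuffer, NULL, NULL, NULL, 0, &queue);

    for (int i = 0; i < kNumberBuffers; i++) {
        AudioQueueBufferRef buffer;
        AudioQueueAllocateBuffer(queue, 4096, &buffer);  // arbitrary size
        AudioQueueEnqueueBuffer(queue, buffer, 0, NULL);
    }

    AudioQueueStart(queue, NULL);
}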
That will just get you a stream of buffers containing uncompressed 16-bit integer PCM audio data. You still need to compress the data, which is another can of worms involving the AudioConverter API; I haven't used it on iPhone OS, so I don't know what will work there.
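If you do try the AudioConverter route, here is a heavily hedged sketch of creating a PCM-to-AAC converter (whether AAC encoding is actually available depends on the device and OS version, as noted above; driving the converter with AudioConverterFillComplexBuffer and its input callback is omitted):

AudioConverterRef CreateAACConverter(const AudioStreamBasicDescription *pcmFormat)
{
    AudioStreamBasicDescription aacFormat = {0};
    aacFormat.mSampleRate       = pcmFormat->mSampleRate;
    aacFormat.mFormatID         = kAudioFormatMPEG4AAC;
    aacFormat.mChannelsPerFrame = pcmFormat->mChannelsPerFrame;
    // Leave the remaining fields zero; the converter fills in the
    // AAC-specific values itself.

    AudioConverterRef converter = NULL;
    if (AudioConverterNew(pcmFormat, &aacFormat, &converter) != noErr) {
        return NULL;
    }
    return converter;
}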

Please take a look at this framework: StreamingKit. It provides access to the audio data while it is streaming, which you can use for recording.
Play an MP3 over HTTP
STKAudioPlayer* audioPlayer = [[STKAudioPlayer alloc] init];
[audioPlayer play:@"http://www.abstractpath.com/files/audiosamples/sample.mp3"];
You can then append its data to an NSMutableData object for offline playback, using this frame filter:
Intercept PCM data just before it's played
[audioPlayer appendFrameFilterWithName:@"MyCustomFilter" block:^(UInt32 channelsPerFrame, UInt32 bytesPerFrame, UInt32 frameCount, void* frames)
{
...
}];
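For example, a hedged sketch of capturing the intercepted PCM into an NSMutableData for later offline use, building on the audioPlayer created above (the filter name is arbitrary and recordedData is a hypothetical variable):

NSMutableData *recordedData = [NSMutableData data];
[audioPlayer appendFrameFilterWithName:@"RecordFilter"
                                 block:^(UInt32 channelsPerFrame, UInt32 bytesPerFrame,
                                         UInt32 frameCount, void *frames)
{
    // Copy the raw PCM frames as they pass through the player.
    [recordedData appendBytes:frames length:frameCount * bytesPerFrame];
}];

Note this captures raw PCM; to end up with an AAC or MP3 file you would still need to run it through an encoder.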

Related

Can isochronous stream and control signal methods be used simultaneously over USB-OTG without any data corruption or delay in the video stream?

Here, the data transfer is for controlling video pause and video record. We are using an iMX8Mini eval board to stream video to an Android mobile device via USB-OTG. We would like to know whether the video stream is affected by any command sent over the same USB-OTG connection.

iOS: stream to rtmp server from GPUImage

Is it possible to stream video and audio to a rtmp://-server with GPUImage?
I'm using the GPUImageVideoCamera and would love to stream (video + audio) directly to a rtmp-server.
I tried VideoCore, which streams perfectly to e.g. YouTube, but whenever I try to overlay the video with different images I get performance problems.
It seems as if GPUImage is doing a really great job there, but I don't know how to stream with it. I found issues on VideoCore talking about feeding VideoCore from GPUImage, but I don't have a starting point for how that's implemented...

Demux HLS TS stream Split Audio AAC From Video

Trying to split the audio from the video in an HLS TS stream; the audio is in AAC format.
The goal is to have some sort of AVAsset that I can later manipulate and then mux back into the video.
After searching for a while I can't find a solid lead. Can someone give me an educated direction to take on this issue?
You can use the ffmpeg/libav library for demuxing the TS. To load the audio back as an AVAsset, it might be necessary to load it from a URL, either by writing it temporarily to disk or serving it with a local HTTP server within your program.
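For illustration, assuming the ffmpeg command-line tool rather than the libraries, and hypothetical file names, extracting the AAC track without re-encoding could look like:

ffmpeg -i input.ts -vn -c:a copy audio.aac

Here -vn drops the video and -c:a copy copies the audio stream as-is, so the AAC data is never decoded.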
I think you might run into some trouble in manipulating the audio stream, assuming you want to manipulate the raw audio data. That will require decoding the AAC, modifying it, re-encoding, and re-muxing with the video. That's all possible with ffmpeg/libav, but it's not really that easy.

iOS - Streaming large files for upload (application crashes when allocating overly large files using NSData)

Introduction
I'm currently doing some bug fixes in an application in the style of Vimeo: the user can record videos on their phone or iPad and then upload them for other users to see. The current problem has to do with uploading large files, due to the way we handle uploads right now.
The Problem
So the problem is essentially that when uploading files to the server, we first allocate all the bytes that have to be uploaded in an NSData object. This string of bytes then gets attached to a standard HTTP POST message, and a receiving API handles it. The problem is that with large file sizes (which videos quickly reach), the app simply crashes because the NSData object takes up too much memory on the iDevice.
This is how the process works:
Byte *buffer = malloc(content.defaultRepresentation.size); // a buffer for the ENTIRE file
NSUInteger buffered = [content.defaultRepresentation getBytes:buffer fromOffset:0 length:content.defaultRepresentation.size error:nil];
NSData *data = [NSData dataWithBytesNoCopy:buffer length:buffered];
NSData *movieData = [NSData dataWithData:data]; // copies all the bytes a second time
I simply get the bytes of a video saved in the iPhone or iPad's standard camera roll and put them into an NSData object. The string of bytes from this NSData object is then attached to a simple HTTP POST message and sent to an API.
The Question
The question, then, and the problem I see, is that the entire byte string has to be sent in a single HTTP POST message. So is there any way to load the movie file in chunks and append them to the POST message, so you don't take up too much memory at a time? Or how could you go about doing this?
Thank you for your time :)
I would suggest using the ASIHTTPRequest library. It can stream files for upload directly from disk instead of loading them into memory first. The logic here is to upload the movie file by breaking it into parts (multipart form data) and uploading them in a queued fashion.
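A hedged sketch of what that might look like with ASIFormDataRequest (the upload URL, the moviePath variable, and the form key are placeholders):

#import "ASIFormDataRequest.h"

NSURL *url = [NSURL URLWithString:@"http://example.com/upload"]; // placeholder endpoint
ASIFormDataRequest *request = [ASIFormDataRequest requestWithURL:url];
// Stream the POST body from disk rather than building it all in memory.
[request setShouldStreamPostDataFromDisk:YES];
[request setFile:moviePath forKey:@"movie"]; // moviePath: hypothetical path to the video on disk
[request startAsynchronous];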

Symbian/S60 audio playback rate

I would like to control the playback rate of a song while it is playing. Basically, I want to make it play a little faster or slower when I tell it to.
Also, is it possible to play back two different tracks at the same time? Imagine a recording with the instruments in one track and the vocals in a different track. One of these tracks should then be able to change its playback rate in "realtime".
Is this possible on Symbian/S60?
It's possible, but you would have to:
Convert the audio data into PCM, if it is not already in this format
Process this PCM stream in the application, in order to change its playback rate
Render the audio via CMdaAudioOutputStream or CMMFDevSound (or QAudioOutput, if you are using Qt)
In other words, the platform itself does not provide any APIs for changing the audio playback rate - your application would need to process the audio stream directly.
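As a rough illustration of step 2, a naive linear-interpolation resampler in plain C changes the playback rate (rates above 1.0 play faster); note that this simple approach also shifts the pitch, and a pitch-preserving speed change needs a proper time-stretching algorithm:

#include <stdint.h>

// Returns the number of output frames written to out (at most maxOut).
int ResamplePCM(const int16_t *in, int inFrames,
                int16_t *out, int maxOut, double rate)
{
    int n = 0;
    for (double pos = 0.0; (int)pos + 1 < inFrames && n < maxOut; pos += rate) {
        int i = (int)pos;
        double frac = pos - i;
        // Linear interpolation between neighbouring input samples.
        out[n++] = (int16_t)((1.0 - frac) * in[i] + frac * in[i + 1]);
    }
    return n;
}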
As for playing multiple tracks together, depending on the device, the audio subsystem may let you play two or more streams simultaneously using either of the above APIs. The problem you may have, however, is that they are unlikely to be synchronised. Your app would therefore probably have to mix all of the individual tracks into one stream before rendering, as in the sketch below.
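For example, a minimal C sketch of mixing two 16-bit PCM tracks into a single stream before rendering, saturating the sum so loud passages clip rather than wrap around:

#include <stdint.h>

void MixTracks(const int16_t *a, const int16_t *b, int16_t *out, int frames)
{
    for (int i = 0; i < frames; i++) {
        int32_t sum = (int32_t)a[i] + (int32_t)b[i];   // widen to avoid overflow
        if (sum >  32767) sum =  32767;                // saturate instead of wrapping
        if (sum < -32768) sum = -32768;
        out[i] = (int16_t)sum;
    }
}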