AVAudioPCMBuffer for music files - objective-c

I've been trying to play music in my SpriteKit game and used the AVAudioPlayerNode class to do so via AVAudioPCMBuffers. Every time I exported my OS X project, it would crash and give me an error regarding audio playback. After banging my head against the wall for the last 24 hours I decided to re-watch WWDC session 501 (see 54:17). My solution to this problem was what the presenter used, which is to break the frames of the buffer into smaller pieces to break up the audio file being read.
NSError *error = nil;
NSURL *someFileURL = ...
AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:someFileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error];
const AVAudioFrameCount kBufferFrameCapacity = 128 * 1024L;
AVAudioFramePosition fileLength = audioFile.length;
AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFile.processingFormat frameCapacity:kBufferFrameCapacity];
while (audioFile.framePosition < fileLength) {
    AVAudioFramePosition readPosition = audioFile.framePosition;
    if (![audioFile readIntoBuffer:readBuffer error:&error])
        return NO;
    if (readBuffer.frameLength == 0) //end of file reached
        break;
}
My current problem is that the player only plays the last frame read into the buffer. The music that I'm playing is only 2 minutes long. Apparently, this is too long to just read into the buffer outright. Is the buffer being overwritten every time the readIntoBuffer: method is called inside the loop? I'm such a noob at this stuff...how can I get the entire file played?
If I can't get this to work, what is a good way to play music (2 different files) across multiple SKScenes?
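For what it's worth: yes, readIntoBuffer: reuses whatever buffer you pass it, so each pass through the loop overwrites the previous chunk. The usual fix is to allocate a fresh buffer per chunk and schedule each one on the player node as you go. A minimal sketch, assuming playerNode is an AVAudioPlayerNode already attached to a running AVAudioEngine and connected to the mixer with audioFile.processingFormat:
const AVAudioFrameCount kBufferFrameCapacity = 128 * 1024L;
AVAudioFramePosition fileLength = audioFile.length;
while (audioFile.framePosition < fileLength) {
    //a new buffer per chunk, so previously scheduled audio is not overwritten
    AVAudioPCMBuffer *chunk = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFile.processingFormat frameCapacity:kBufferFrameCapacity];
    if (![audioFile readIntoBuffer:chunk error:&error] || chunk.frameLength == 0)
        break;
    [playerNode scheduleBuffer:chunk completionHandler:nil]; //chunks play back-to-back in the order scheduled
}
[playerNode play];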

This is the solution that I came up with. It's still not perfect, but hopefully it will help someone who is in the same predicament that I've found myself in. I created a singleton class to handle this job. One improvement that can be made in the future is to only load sound effects and music files needed for a particular SKScene at the time they are needed. I had so many issues with this code that I don't want to mess with it now. Currently, I don't have too many sounds, so it's not using an excessive amount of memory.
Overview
My strategy was the following:
Store the audio file names for the game in a plist
Read from that plist and create two dictionaries (one for music and one for short sound effects)
The sound effect dictionary is composed of an AVAudioPCMBuffer and an AVAudioPlayerNode for each of the sounds
The music dictionary is composed of an array of AVAudioPCMBuffers, an array of timestamps for when those buffers should be played in the queue, an AVAudioPlayerNode and the sample rate of the original audio file
The sample rate is necessary for figuring out the time at which each buffer should be played (you'll see the calculations done in code)
Create an AVAudioEngine and get the main mixer from the engine and attach all AVAudioPlayerNodes to the mixer (as per usual)
Play sound effects or music using their various methods
sound effect playing is straightforward...call the method -(void) playSfxFile:(NSString*)file; and it plays a sound
for music, I just couldn't find a good solution without invoking the help of the scene trying to play the music. The scene will call -(void) playMusicFile:(NSString*)file; and it will schedule the buffers to play in the order they were created. I couldn't find a good way to get the music to repeat once completed within my AudioEngine class, so I decided to have the scene check in its update: method whether or not the music was playing for a particular file and, if not, play it again (not a very slick solution, but it works; see the sketch after this list)
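For illustration, the scene-side repeat check could look roughly like this (a sketch; "menuscenemusic" is just the file name this scene owns, matching the names used further down):
-(void)update:(CFTimeInterval)currentTime {
    //restart the track whenever the AudioEngine reports it has finished
    if (![[AudioEngine sharedData] isPlayingMusic:@"menuscenemusic"]) {
        [[AudioEngine sharedData] playMusicFile:@"menuscenemusic"];
    }
}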
AudioEngine.h
#import <Foundation/Foundation.h>
@interface AudioEngine : NSObject
+(instancetype)sharedData;
-(void) playSfxFile:(NSString*)file;
-(void) playMusicFile:(NSString*)file;
-(void) pauseMusic:(NSString*)file;
-(void) unpauseMusic:(NSString*)file;
-(void) stopMusicFile:(NSString*)file;
-(void) setVolumePercentages;
-(bool) isPlayingMusic:(NSString*)file;
@end
AudioEngine.m
#import "AudioEngine.h"
#import <AVFoundation/AVFoundation.h>
#import "GameData.h" //this is a class that I use to store game data (in this case it is being used to get the user preference for volume amount)
@interface AudioEngine()
@property AVAudioEngine *engine;
@property AVAudioMixerNode *mixer;
@property NSMutableDictionary *musicDict;
@property NSMutableDictionary *sfxDict;
@property NSString *audioInfoPList;
@property float musicVolumePercent;
@property float sfxVolumePercent;
@property float fadeVolume;
@property float timerCount;
@end
@implementation AudioEngine
int const FADE_ITERATIONS = 10;
static NSString * const MUSIC_PLAYER = @"player";
static NSString * const MUSIC_BUFFERS = @"buffers";
static NSString * const MUSIC_FRAME_POSITIONS = @"framePositions";
static NSString * const MUSIC_SAMPLE_RATE = @"sampleRate";
static NSString * const SFX_BUFFER = @"buffer";
static NSString * const SFX_PLAYER = @"player";
+(instancetype) sharedData {
static AudioEngine *sharedInstance = nil;
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
sharedInstance = [[self alloc] init];
[sharedInstance startEngine];
});
return sharedInstance;
}
-(instancetype) init {
if (self = [super init]) {
_engine = [[AVAudioEngine alloc] init];
_mixer = [_engine mainMixerNode];
_audioInfoPList = [[NSBundle mainBundle] pathForResource:@"AudioInfo" ofType:@"plist"]; //open a plist called AudioInfo.plist
[self setVolumePercentages]; //this is created to set the user's preference in terms of how loud sound fx and music should be played
[self initMusic];
[self initSfx];
}
return self;
}
//opens all music files, creates multiple buffers depending on the length of the file and a player
-(void) initMusic {
_musicDict = [NSMutableDictionary dictionary];
_audioInfoPList = [[NSBundle mainBundle] pathForResource: @"AudioInfo" ofType: @"plist"];
NSDictionary *audioInfoData = [NSDictionary dictionaryWithContentsOfFile:_audioInfoPList];
for (NSString *musicFileName in audioInfoData[@"music"]) {
[self loadMusicIntoBuffer:musicFileName];
AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
[_engine attachNode:player];
AVAudioPCMBuffer *buffer = [[_musicDict[musicFileName] objectForKey:MUSIC_BUFFERS] objectAtIndex:0];
[_engine connect:player to:_mixer format:buffer.format];
[_musicDict[musicFileName] setObject:player forKey:MUSIC_PLAYER];
}
}
//opens a music file and creates an array of buffers
-(void) loadMusicIntoBuffer:(NSString *)filename
{
NSURL *audioFileURL = [[NSBundle mainBundle] URLForResource:filename withExtension:@"aif"];
//NSURL *audioFileURL = [NSURL URLWithString:[[NSBundle mainBundle] pathForResource:filename ofType:@"aif"]];
NSAssert(audioFileURL, @"Error creating URL to audio file");
NSError *error = nil;
AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:audioFileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error];
NSAssert(audioFile != nil, @"Error creating audioFile, %@", error.localizedDescription);
AVAudioFramePosition fileLength = audioFile.length; //frame length of the audio file
float sampleRate = audioFile.fileFormat.sampleRate; //sample rate (in Hz) of the audio file
[_musicDict setObject:[NSMutableDictionary dictionary] forKey:filename];
[_musicDict[filename] setObject:[NSNumber numberWithDouble:sampleRate] forKey:MUSIC_SAMPLE_RATE];
NSMutableArray *buffers = [NSMutableArray array];
NSMutableArray *framePositions = [NSMutableArray array];
const AVAudioFrameCount kBufferFrameCapacity = 1024 * 1024L; //the size of my buffer...can be made bigger or smaller 512 * 1024L would be half the size
while (audioFile.framePosition < fileLength) { //each iteration reads in kBufferFrameCapacity frames of the audio file and stores it in a buffer
[framePositions addObject:[NSNumber numberWithLongLong:audioFile.framePosition]];
AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFile.processingFormat frameCapacity:kBufferFrameCapacity];
if (![audioFile readIntoBuffer:readBuffer error:&error]) {
NSLog(#"failed to read audio file: %#", error);
return;
}
if (readBuffer.frameLength == 0) { //if we've come to the end of the file, end the loop
break;
}
[buffers addObject:readBuffer];
}
[_musicDict[filename] setObject:buffers forKey:MUSIC_BUFFERS];
[_musicDict[filename] setObject:framePositions forKey:MUSIC_FRAME_POSITIONS];
}
-(void) initSfx {
_sfxDict = [NSMutableDictionary dictionary];
NSDictionary *audioInfoData = [NSDictionary dictionaryWithContentsOfFile:_audioInfoPList];
for (NSString *sfxFileName in audioInfoData[@"sfx"]) {
AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
[_engine attachNode:player];
[self loadSoundIntoBuffer:sfxFileName];
AVAudioPCMBuffer *buffer = [_sfxDict[sfxFileName] objectForKey:SFX_BUFFER];
[_engine connect:player to:_mixer format:buffer.format];
[_sfxDict[sfxFileName] setObject:player forKey:SFX_PLAYER];
}
}
//WARNING: make sure that the sound fx file is small (roughly under 30 sec) otherwise the archived version of the app will crash because the buffer ran out of space
-(void) loadSoundIntoBuffer:(NSString *)filename
{
NSURL *audioFileURL = [NSURL URLWithString:[[NSBundle mainBundle] pathForResource:filename ofType:@"mp3"]];
NSAssert(audioFileURL, @"Error creating URL to audio file");
NSError *error = nil;
AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:audioFileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error];
NSAssert(audioFile != nil, @"Error creating audioFile, %@", error.localizedDescription);
AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFile.processingFormat frameCapacity:(AVAudioFrameCount)audioFile.length];
[audioFile readIntoBuffer:readBuffer error:&error];
[_sfxDict setObject:[NSMutableDictionary dictionary] forKey:filename];
[_sfxDict[filename] setObject:readBuffer forKey:SFX_BUFFER];
}
-(void)startEngine {
[_engine startAndReturnError:nil];
}
-(void) playSfxFile:(NSString*)file {
AVAudioPlayerNode *player = [_sfxDict[file] objectForKey:SFX_PLAYER];
AVAudioPCMBuffer *buffer = [_sfxDict[file] objectForKey:SFX_BUFFER];
[player scheduleBuffer:buffer atTime:nil options:AVAudioPlayerNodeBufferInterrupts completionHandler:nil];
[player setVolume:1.0];
[player setVolume:_sfxVolumePercent];
[player play];
}
-(void) playMusicFile:(NSString*)file {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
if ([player isPlaying] == NO) {
NSArray *buffers = [_musicDict[file] objectForKey:MUSIC_BUFFERS];
double sampleRate = [[_musicDict[file] objectForKey:MUSIC_SAMPLE_RATE] doubleValue];
for (int i = 0; i < [buffers count]; i++) {
long long framePosition = [[[_musicDict[file] objectForKey:MUSIC_FRAME_POSITIONS] objectAtIndex:i] longLongValue];
AVAudioTime *time = [AVAudioTime timeWithSampleTime:framePosition atRate:sampleRate];
AVAudioPCMBuffer *buffer = [buffers objectAtIndex:i];
[player scheduleBuffer:buffer atTime:time options:AVAudioPlayerNodeBufferInterrupts completionHandler:^{
if (i == [buffers count] - 1) {
[player stop];
}
}];
[player setVolume:_musicVolumePercent];
[player play];
}
}
}
-(void) stopOtherMusicPlayersNotNamed:(NSString*)file {
if ([file isEqualToString:#"menuscenemusic"]) {
AVAudioPlayerNode *player = [_musicDict[#"levelscenemusic"] objectForKey:MUSIC_PLAYER];
[player stop];
}
else {
AVAudioPlayerNode *player = [_musicDict[#"menuscenemusic"] objectForKey:MUSIC_PLAYER];
[player stop];
}
}
//stops the player for a particular sound
-(void) stopMusicFile:(NSString*)file {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
if ([player isPlaying]) {
_timerCount = FADE_ITERATIONS;
_fadeVolume = _musicVolumePercent;
[self fadeOutMusicForPlayer:player]; //fade out the music
}
}
//helper method for stopMusicFile:
-(void) fadeOutMusicForPlayer:(AVAudioPlayerNode*)player {
[NSTimer scheduledTimerWithTimeInterval:0.1 target:self selector:@selector(handleTimer:) userInfo:player repeats:YES];
}
//helper method for stopMusicFile:
-(void) handleTimer:(NSTimer*)timer {
AVAudioPlayerNode *player = (AVAudioPlayerNode*)timer.userInfo;
if (_timerCount > 0) {
_timerCount--;
_fadeVolume = _musicVolumePercent * (_timerCount / FADE_ITERATIONS);
[player setVolume:_fadeVolume];
}
else {
[player stop];
[player setVolume:_musicVolumePercent];
[timer invalidate];
}
}
-(void) pauseMusic:(NSString*)file {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
if ([player isPlaying]) {
[player pause];
}
}
-(void) unpauseMusic:(NSString*)file {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
[player play];
}
//sets the volume of the player based on user preferences in GameData class
-(void) setVolumePercentages {
NSString *musicVolumeString = [[GameData sharedGameData].settings objectForKey:@"musicVolume"];
_musicVolumePercent = [[[musicVolumeString componentsSeparatedByCharactersInSet:
[[NSCharacterSet decimalDigitCharacterSet] invertedSet]]
componentsJoinedByString:@""] floatValue] / 100;
NSString *sfxVolumeString = [[GameData sharedGameData].settings objectForKey:@"sfxVolume"];
_sfxVolumePercent = [[[sfxVolumeString componentsSeparatedByCharactersInSet:
[[NSCharacterSet decimalDigitCharacterSet] invertedSet]]
componentsJoinedByString:@""] floatValue] / 100;
//immediately sets music to new volume
for (NSString *file in [_musicDict allKeys]) {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
[player setVolume:_musicVolumePercent];
}
}
-(bool) isPlayingMusic:(NSString *)file {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
if ([player isPlaying])
return YES;
return NO;
}
@end
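For reference, initMusic and initSfx above expect AudioInfo.plist to contain two arrays keyed "music" and "sfx" that hold the base file names (without extensions). A hypothetical layout, with placeholder sound effect names, might be:
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>music</key>
    <array>
        <string>menuscenemusic</string>
        <string>levelscenemusic</string>
    </array>
    <key>sfx</key>
    <array>
        <string>jumpSound</string>
        <string>coinSound</string>
    </array>
</dict>
</plist>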

Related

Getting frames through a stream and displaying them on screen

I have a requirement of streaming from a server and displaying the streamed content on the screen... Streaming is working fine using NSStream (NSInputStream and NSOutputStream). How can I display it on the screen?
The stream URL looks like @"http://191.168.143.41:1212/".
if(stream == inputStream) {
uint8_t buf[1024];
unsigned int len = 0;
len = [inputStream read:buf maxLength:1024];
if(len > 0) {
NSMutableData* datas=[[NSMutableData alloc] initWithLength:0];
[datas appendBytes: (const void *)buf length:len];
NSString *s = [[NSString alloc] initWithData:datas encoding:NSASCIIStringEncoding];
[self readIn:s];
NSLog(@"ss%@",s);
[self loadMovie:s]; //method for movie player
}
}
I tried to display this in a movie player as below.
-(void)loadMovie:(NSString*)moviePrefix
{
NSString *path = [NSString stringWithFormat:@"%@.mjpg", moviePrefix];
NSURL *url = [NSURL fileURLWithPath:path];
if (url) {
_moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:url];
_moviePlayer.view.frame = CGRectMake(0, 70, 600, 450);
_moviePlayer.controlStyle = MPMovieControlStyleNone;
_moviePlayer.scalingMode = MPMovieScalingModeNone;
[dic setObject:_moviePlayer forKey:path];
}
}
[_moviePlayer prepareToPlay];
[self.view addSubview: _moviePlayer.view];
[self.view bringSubviewToFront:_moviePlayer.view];
[self.view addSubview: _moviePlayer.view];
[_moviePlayer play];
}
Is NSString *path = [NSString stringWithFormat:@"%@.mjpg", moviePrefix]; the correct way?
This displays a black screen. What is wrong?
If this way is not correct, is there any other way I can display those frames?
Can anyone help me solve this?
MJPEG is just JPEGs sent one after the other.
I worked on this a few years ago.
On one version of iOS (iOS 5?) it could easily be read with a UIWebView, but an iOS update broke all of this (and all of my work at the time along with it).
Maybe a UIWebView could do the trick again today.
Anyway, since it's just a bunch of JPEGs, you can read the stream, detect the start/end of each JPEG, create the image and show it in a UIImageView.
A workaround (not tested), but you should get the whole idea:
//Properties
@property (nonatomic, strong) NSMutableData *data;
@property (nonatomic, weak) IBOutlet UIImageView *streamImageView;
//Initialize somewhere
_data = [[NSMutableData alloc] init];
//In the stream delegate method:
//Start of JPEG: FFD8 - End of JPEG: FFD9
UInt8 startJPEGBytes[2];
startJPEGBytes[0] = 0xFF;
startJPEGBytes[1] = 0xD8;
NSData *startData = [NSData dataWithBytes:&startJPEGBytes length:2];
UInt8 endJPEGBytes[2];
endJPEGBytes[0] = 0xFF;
endJPEGBytes[1] = 0xD9;
NSData *endData = [NSData dataWithBytes:&endJPEGBytes length:2];
[_data appendBytes: (const void *)buf length:len];
NSRange startRange = [_data rangeOfData:startData options:0 range:NSMakeRange(0, [_data length])];
if (startRange.location != NSNotFound) //We found the start of a JPEG
{
NSRange endRange = [_data rangeOfData:endData options:0 range:NSMakeRange(startRange.location, [_data length]-startRange.location)];
if (endRange.location != NSNotFound) //We found the end of a JPEG
{
NSRange imageRange = NSMakeRange(startRange.location, endRange.location+endRange.length-startRange.location);
NSData *imageData = [_data subdataWithRange:imageRange];
UIImage *streamImage = [UIImage imageWithData:imageData];
[_streamImageView setImage:streamImage];
[_data replaceBytesInRange:NSMakeRange(0, imageRange.location + imageRange.length) withBytes:NULL length:0]; //We remove everything up to the end of the JPEG frame. Start at 0, since there could be garbage at the start.
}
}
You are not adding moviePrefix to the string
NSString *path = [NSString stringWithFormat:@".mjpg", moviePrefix, @"movie"];
Change it to
NSString *path = [NSString stringWithFormat:@"%@.mjpg", moviePrefix, @"movie"];
https://github.com/horsson/mjpeg-iphone/tree/55251a85e2c2489014036ddf5a491783f9b1962d
Used this to get the stream and display. It works.

iOS 7 NSURLSession Download multiple files in Background

I want to download a list of files using NSURLSession.
I have a variable for counting the successful downloads: @property (nonatomic) int downloadsSuccessfulCounter;. While the files are being downloaded I disable the Download button. When the counter is equal to the download list size, I enable the button again and set the counter to 0. I do this in this method:
-(void)URLSession:(NSURLSession *)session downloadTask:(NSURLSessionDownloadTask *)downloadTask didFinishDownloadingToURL:(NSURL *)location {
...
[[NSOperationQueue mainQueue] addOperationWithBlock:^ {
downloadsSuccessfulCounter++;
if(downloadsSuccessfulCounter == self.downloadList.count) {
NSLog(#"All downloads finished");
[self.syncButton setEnabled:YES];
downloadsSuccessfulCounter = 0;
}
}];
}
Everything is working fine, but when I open the ViewController again I get the message A background URLSession with identifier com.myApp already exists!. The counter is not set to 0 and the UI elements (UIButtons, UILabels) do not respond.
I guess the problem is that the NSURLSession is still open, but I'm not really sure how it works.
I have tried all the tutorials, but 99% of them are only for downloading 1 file, not more than 1...
Any ideas?
Here is my code:
...
@property (nonatomic, strong) NSURLSession *session;
...
- (void)viewDidLoad {
[super viewDidLoad];
appDelegate = (AppDelegate *)[[UIApplication sharedApplication] delegate];
self.downloadList = [[NSMutableArray alloc] init];
NSURLSessionConfiguration *sessionConfiguration = [NSURLSessionConfiguration backgroundSessionConfiguration:@"com.myApp"];
sessionConfiguration.HTTPMaximumConnectionsPerHost = 5;
self.session = [NSURLSession sessionWithConfiguration:sessionConfiguration delegate:self delegateQueue:nil];
}
When I press the Download button I call this method (I have a Downloadable object which contains a NSURLSessionDownloadTask):
-(void)startDownload {
for (int i=0; i<[self.downloadList count]; i++) {
Downloadable *d = [self.downloadList objectAtIndex:i];
if (!d.isDownloading) {
if (d.taskIdentifier == -1) {
d.downloadTask = [self.session downloadTaskWithURL:[NSURL URLWithString:d.downloadSource]];
}else {
d.downloadTask = [self.session downloadTaskWithResumeData:d.taskResumeData];
}
d.taskIdentifier = d.downloadTask.taskIdentifier;
[d.downloadTask resume];
d.isDownloading = YES;
}
}
}
When the app is in Background:
-(void)URLSessionDidFinishEventsForBackgroundURLSession:(NSURLSession *)session{
AppDelegate *appDelegate = [UIApplication sharedApplication].delegate;
[self.session getTasksWithCompletionHandler:^(NSArray *dataTasks, NSArray *uploadTasks, NSArray *downloadTasks) {
if ([downloadTasks count] == 0) {
if (appDelegate.backgroundTransferCompletionHandler != nil) {
void(^completionHandler)() = appDelegate.backgroundTransferCompletionHandler;
appDelegate.backgroundTransferCompletionHandler = nil;
[[NSOperationQueue mainQueue] addOperationWithBlock:^{
completionHandler();
UILocalNotification *localNotification = [[UILocalNotification alloc] init];
localNotification.alertBody = @"All files downloaded";
[[UIApplication sharedApplication] presentLocalNotificationNow:localNotification];
}];
}
}
}];
}
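For what it's worth, that warning usually means a background configuration with the same identifier is being recreated every time viewDidLoad runs. One common pattern (a sketch only; the answer below takes a different approach) is to build the background session exactly once and hand back the same instance afterwards:
//Sketch: create the background session once and return the same instance on every call
- (NSURLSession *)backgroundSession {
    static NSURLSession *session = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        NSURLSessionConfiguration *configuration = [NSURLSessionConfiguration backgroundSessionConfiguration:@"com.myApp"];
        configuration.HTTPMaximumConnectionsPerHost = 5;
        session = [NSURLSession sessionWithConfiguration:configuration delegate:self delegateQueue:nil];
    });
    return session;
}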
So, as I mentioned in my comments, the issue is that each file requires a unique NSURLSession, and each NSURLSession requires an NSURLSessionConfiguration with a unique identifier.
I think that you were close - and probably more proper than me in certain aspects...
You just need to create a structure to pass unique IDs into unique Configurations, to populate unique Sessions (say that 10x fast).
Here's what I did:
/*
* Retrieves the List of Files to Download
* Also uses the size of that list to instantiate items
* In my case, I load a newline-delimited text file with the names of the files that I want to download
*/
- (void) getMediaList {
NSString *list = #"http://myserver/media_list.txt";
NSURLSession *session = [NSURLSession sharedSession]; // <-- BASIC session
[[session dataTaskWithURL:[NSURL URLWithString:list]
completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
NSString *stringFromData = [[NSString alloc] initWithData: data encoding: NSUTF8StringEncoding];
// Populate Arrays
REMOTE_MEDIA_FILE_PATHS = [stringFromData componentsSeparatedByString:@"\n"];
[self instantiateURLSessions:[REMOTE_MEDIA_FILE_PATHS count]];
// Start First File
[self getFile:[REMOTE_MEDIA_FILE_PATHS objectAtIndex:downloadCounter]:downloadCounter]; // this variable is 0 at the start
}]
resume];
}
/*
* This sets Arrays of Configurations and Sessions to the proper size
* It also gives a unique ID to each one
*/
- (void) instantiateURLSessions : (int) size {
NSMutableArray *configurations = [NSMutableArray array];
NSMutableArray *sessions = [NSMutableArray array];
for (int i = 0; i < size; i++) {
NSString *index = [NSString stringWithFormat:#"%i", i];
NSString *UniqueIdentifier = #"MyAppBackgroundSessionIdentifier_";
UniqueIdentifier = [UniqueIdentifier stringByAppendingString:index];
[configurations addObject: [NSURLSessionConfiguration backgroundSessionConfigurationWithIdentifier:UniqueIdentifier]];
[sessions addObject:[NSURLSession sessionWithConfiguration: [configurations objectAtIndex:i] delegate: self delegateQueue: [NSOperationQueue mainQueue]]];
}
NSURL_BACKGROUND_CONFIGURATIONS = [NSArray arrayWithArray:configurations];
NSURL_BACKGROUND_SESSIONS = [NSArray arrayWithArray:sessions];
}
/*
* This sets up the Download task for each file, based off of the index of the array
* It also concatenates the path to the actual file
*/
- (void) getFile : (NSString*) file :(int) index {
NSString *fullPathToFile = REMOTE_MEDIA_PATH; // Path To Server With Files
fullPathToFile = [fullPathToFile stringByAppendingString:file];
NSURL *url = [NSURL URLWithString:fullPathToFile];
NSURLSessionDownloadTask *downloadTask = [[NSURL_BACKGROUND_SESSIONS objectAtIndex:index ] downloadTaskWithURL: url];
[downloadTask resume];
}
/*
* Finally, in my delegate method, upon the completion of the download (after the file is moved from the temp data), I check if I am done and if not call the getFiles method again with the updated counter for the index
*/
-(void)URLSession:(NSURLSession *)session downloadTask:(NSURLSessionDownloadTask *)downloadTask didFinishDownloadingToURL:(NSURL *)location
{
// Get the documents directory URL
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *dataPath = [documentsDirectory stringByAppendingPathComponent:LOCAL_MEDIA_PATH];
NSURL *customDirectory = [NSURL fileURLWithPath:dataPath];
// Get the file name and create a destination URL
NSString *sendingFileName = [downloadTask.originalRequest.URL lastPathComponent];
NSURL *destinationUrl = [customDirectory URLByAppendingPathComponent:sendingFileName];
// Move the file
NSError *error = nil;
NSFileManager *fileManager = [NSFileManager defaultManager];
if ([fileManager moveItemAtURL:location toURL:destinationUrl error: &error]) {
// List
[self listCustomDirectory];
if(downloadCounter < [REMOTE_MEDIA_FILE_PATHS count] -1) {
// Increment Counter
downloadCounter++;
// Start Next File
[self getFile:[REMOTE_MEDIA_FILE_PATHS objectAtIndex:downloadCounter]:downloadCounter];
}
else {
// FINISH YOUR OPERATION / NOTIFY USER / ETC
}
}
else {
NSLog(#"Damn. Error %#", error);
// Do Something Intelligent Here
}
}

NSManagedObject fails to save its attributes, but saves when adding related objects

I'm developing an iOS app using Core Data. I have a Log entity with one-to-many relationships to Audio and Photo entities, and a one-to-one relationship with a Status entity. The log also has text, longitude and latitude properties. I can create the log, change its properties and add a status entity, and these changes display correctly until I quit the app. Then all the changes disappear; looking at the SQLite database, they were never persisted. In the database the status object does get created, but it is not linked to the log object.
But if I add an audio or photo object to log.audioSet or log.photoSet, the changes I made to the log, including the changes to text or status, suddenly get saved into the database.
So it seems the changes are only maintained in the NSManagedObjectContext until a related one-to-many entity is added, at which point [[LTLogStore sharedStore] saveChanges] suddenly starts to work.
I am using a singleton to manage the NSManagedObjectContext. Any ideas?
I would post some code if it's relevant. Thanks.
UPDATE: I'm not sure this code is enough, but basically everything works and displays; it just doesn't save to the database. I'm using mogenerator to set the text and latitude, but since everything is in the context, I am not sure this is the code you might need.
CODE:
@interface LTLogStore : NSObject{
}
+ (LTLogStore *)sharedStore;
- (void)removeItem:(Log *)p;
- (Log *)createItem;
- (BOOL)saveChanges;
@property(nonatomic, strong) NSFetchedResultsController *resultsController;
@property(nonatomic, strong) NSManagedObjectModel *model;
@property(nonatomic, strong) NSManagedObjectContext *context;
@end
@implementation LTLogStore
@synthesize resultsController;
@synthesize context, model;
+ (LTLogStore *)sharedStore
{
static LTLogStore *sharedStore = nil;
if(!sharedStore){
sharedStore = [[super allocWithZone:nil] init];
}
return sharedStore;
}
+ (id)allocWithZone:(NSZone *)zone
{
return [self sharedStore];
}
- (id)init
{
self = [super init];
if(self) {
model = [NSManagedObjectModel mergedModelFromBundles:nil];
NSPersistentStoreCoordinator *psc =
[[NSPersistentStoreCoordinator alloc] initWithManagedObjectModel:model];
// Where does the SQLite file go?
NSString *path = [self itemArchivePath];
NSURL *storeURL = [NSURL fileURLWithPath:path];
NSError *error = nil;
if (![psc addPersistentStoreWithType:NSSQLiteStoreType
configuration:nil
URL:storeURL
options:nil
error:&error]) {
[NSException raise:#"Open failed"
format:#"Reason: %#", [error localizedDescription]];
}
// Create the managed object context
context = [[NSManagedObjectContext alloc] init];
[context setPersistentStoreCoordinator:psc];
// The managed object context can manage undo, but we don't need it
[context setUndoManager:nil];
}
return self;
}
- (NSFetchedResultsController *)resultsController {
if (resultsController !=nil) {
return resultsController;
}
NSFetchRequest *request = [[NSFetchRequest alloc] init];
NSEntityDescription *e = [[model entitiesByName] objectForKey:@"Log"];
[request setEntity:e];
NSSortDescriptor *sd = [NSSortDescriptor
sortDescriptorWithKey:#"created_at"
ascending:NO];
[request setSortDescriptors:[NSArray arrayWithObject:sd]];
[request setReturnsObjectsAsFaults:NO];
NSFetchedResultsController *fetchedResultsController = [[NSFetchedResultsController alloc]
initWithFetchRequest:request
managedObjectContext:context
sectionNameKeyPath:nil cacheName:#"Root"];
NSError *error;
BOOL success = [fetchedResultsController performFetch:&error];
if (!success) {
//handle the error
}
return fetchedResultsController;
}
- (NSString *)itemArchivePath
{
NSArray *documentDirectories =
NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
NSUserDomainMask, YES);
// Get one and only document directory from that list
NSString *documentDirectory = [documentDirectories objectAtIndex:0];
NSString *storePath = [documentDirectory stringByAppendingPathComponent:@"store.data"];
return storePath;
}
- (BOOL)saveChanges
{
NSError *err = nil;
BOOL successful = [context save:&err];
NSLog(#"Saving changes to the database");
if (!successful) {
NSLog(#"Error saving: %#", [err localizedDescription]);
}
return successful;
}
- (void)removeItem:(Log *)l
{
[context deleteObject:l];
[self saveChanges];
}
- (Log *)createItem
{
Log *p = [NSEntityDescription insertNewObjectForEntityForName:#"Log"
inManagedObjectContext:context];
[self saveChanges];
return p;
}
@end
@interface Log : _Log {
}
//these two are custom convenience methods for the location attributes; they set the longitude and latitude values on the log object, but calling [[LTLogStore sharedStore] saveChanges] still won't save them into the database.
-(CLLocation*)location;
-(void)setLocation:(CLLocation*)location;
//this all works
-(Audio*)newAudio;
-(Audio*)newAudioWithPath:(NSString*)audioPath;
//after calling this method, even the log.text changes will be saved to the database.
-(void)addAudioWithPath:(NSString*)audioPath;
-(void)removeAudio:(Audio*)audio;
@end
#import "Log.h"
#import "Audio.h"
#import "LTLogStore.h"
@implementation Log
-(CLLocation*)location{
if (!self.longitude || !self.latitude) {
return nil;
}
CLLocation *l = [[CLLocation alloc] initWithLatitude:[self.latitude doubleValue] longitude:[self.longitude doubleValue]];
return l;
}
-(void)setLocation:(CLLocation*)location{
if (location==nil) {
self.latitude = nil;
self.longitude = nil;
}
self.latitude = [NSNumber numberWithDouble: location.coordinate.latitude];
self.longitude = [NSNumber numberWithDouble:location.coordinate.longitude];
[[LTLogStore sharedStore] saveChanges];
}
-(Audio*)newAudio{
Audio *a = [Audio new];
a.log = self;
return a;
}
-(Audio*)newAudioWithPath:(NSString*)audioPath{
Audio *new = [self newAudio];
[new setKey:audioPath];
return new;
}
-(void)addAudioWithPath:(NSString*)audioPath{
Audio *new = [self newAudio];
[new setKey:audioPath];
[[LTLogStore sharedStore] saveChanges];
}
-(void)removeAudio:(Audio*)audio{
[self.audiosSet removeObject:audio];
[[[LTLogStore sharedStore] context] deleteObject:audio];
[[LTLogStore sharedStore] saveChanges];
}
@end
UPDATE:
Problem solved, see answer.
UPDATE QUESTION: Why does my override cause the problem? Can someone explain what Core Data (or the KVO machinery behind the scenes) is doing here?
Problem solved: I had overridden the willChangeValueForKey: method in the Log class, which caused the problem. I thought the code was irrelevant, but it IS:
- (void)willChangeValueForKey:(NSString *)key{
//I added the following line to fix my problem
[super willChangeValueForKey:key];
//this is the original line, I want to have this
//because I want to have a isBlank property
//so I can see if the user modified the log
_isBlank = false;
//I tried to also add the following line to be safe.
//turns out this line is not needed, and it will make the problem occur again
//[super didChangeValueForKey:key];
}

iOS: can I use AVAudioPlayer in the app delegate?

I have a TabBarController with two tabs and I want to play music on both tabs. Right now I have my code in the main app delegate:
NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle]
pathForResource:#"My Song"
ofType:#"m4a"]]; // My Song.m4a
NSError *error;
self.audioPlayer = [[AVAudioPlayer alloc]
initWithContentsOfURL:url
error:&error];
if (error)
{
NSLog(#"Error in audioPlayer: %#",
[error localizedDescription]);
} else {
//audioPlayer.delegate = self;
[audioPlayer prepareToPlay];
}
but I'm getting the error Program received signal: "SIGABRT" on UIApplicationMain
Is there a better way to accomplish what I'm trying to do? If this is how I should do it, where do I start checking for problems?
Yes, you can use AVAudioPlayer in the app delegate.
What you need to do is:
In AppDelegate.h:
#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>
AVAudioPlayer *_backgroundMusicPlayer;
BOOL _backgroundMusicPlaying;
BOOL _backgroundMusicInterrupted;
UInt32 _otherMusicIsPlaying;
Make a backgroundMusicPlayer property and synthesize it.
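That declaration might look like this (a sketch; strong is an assumption, and the synthesize line ties the property to the _backgroundMusicPlayer ivar declared above):
//AppDelegate.h
@property (nonatomic, strong) AVAudioPlayer *backgroundMusicPlayer;
//AppDelegate.m
@synthesize backgroundMusicPlayer = _backgroundMusicPlayer;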
In AppDelegate.m:
Add these lines to the application:didFinishLaunchingWithOptions: method
NSError *setCategoryError = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient error:&setCategoryError];
// Create audio player with background music
NSString *backgroundMusicPath = [[NSBundle mainBundle] pathForResource:@"SplashScreen" ofType:@"wav"];
NSURL *backgroundMusicURL = [NSURL fileURLWithPath:backgroundMusicPath];
NSError *error;
_backgroundMusicPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:backgroundMusicURL error:&error];
[_backgroundMusicPlayer setDelegate:self]; // We need this so we can restart after interruptions
[_backgroundMusicPlayer setNumberOfLoops:-1]; // Negative number means loop forever
Now implement delegate methods
#pragma mark -
#pragma mark AVAudioPlayer delegate methods
- (void) audioPlayerBeginInterruption: (AVAudioPlayer *) player {
_backgroundMusicInterrupted = YES;
_backgroundMusicPlaying = NO;
}
- (void) audioPlayerEndInterruption: (AVAudioPlayer *) player {
if (_backgroundMusicInterrupted) {
[self tryPlayMusic];
_backgroundMusicInterrupted = NO;
}
}
- (void)tryPlayMusic {
// Check to see if iPod music is already playing
UInt32 propertySize = sizeof(_otherMusicIsPlaying);
AudioSessionGetProperty(kAudioSessionProperty_OtherAudioIsPlaying, &propertySize, &_otherMusicIsPlaying);
// Play the music if no other music is playing and we aren't playing already
if (_otherMusicIsPlaying != 1 && !_backgroundMusicPlaying) {
[_backgroundMusicPlayer prepareToPlay];
if (soundsEnabled==YES) {
[_backgroundMusicPlayer play];
_backgroundMusicPlaying = YES;
}
}
}
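Presumably you would then start playback by calling tryPlayMusic once setup is done, for example at the end of application:didFinishLaunchingWithOptions: (soundsEnabled inside tryPlayMusic is assumed to be your own settings flag):
//at the end of application:didFinishLaunchingWithOptions:
[self tryPlayMusic];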

iOS 4 bug? Image files saved to folder are overwritten after a random number of images saved

I have an app (SDK 4.3) which saves images as attachments for a waypoint on a Google map.
The file save is pretty standard (afaik) UIImagePickerController code.
Rather than saving to the camera roll, I am saving the image and then the thumbnail to a subfolder. I need that.
At seemingly random points, with no errors being trapped or logged at all, the images will not save to the folder but instead overwrite previously saved image files!
It looks for all the world like a FIFO pop going on.
It is seriously odd, and I have even built a small test app and fired it up as soon as the spookiness appeared, saving a series of camera images to the same folders, and I see the same effect. The images get overwritten once the random magic file number is reached!
Random in the sense that after 7 saved images the overwriting begins, even after a reboot of the phone to ensure memory leaks are not the issue. Wipe the app and try again...
This time it will happen after 16 or 23 image files saved.
I have gone to all sorts of extremes and cannot find the source of the issue.
In the small test app, in the same method, I save out to the camera roll as well. It will save there but overwrite in the folder. The file names are 10-character randomly generated alphanumeric strings.
I am now leaning towards understanding this as a bug. I can always reproduce the error, but not predictably; it arises randomly.
I would appreciate help as I am tearing my hair out.
Here is the code...
//tester.h
#import <UIKit/UIKit.h>
@interface tester : UIViewController <UINavigationControllerDelegate, UIImagePickerControllerDelegate>
{
UIImagePickerController *imgPicker;
IBOutlet UIButton *pressit;
IBOutlet UIButton *seeya;
UIActivityIndicatorView *activity;
}
@property (retain) UIImagePickerController *imgPicker;
@property (nonatomic,retain)IBOutlet UIButton *pressit;
@property (nonatomic,retain)IBOutlet UIButton *seeya;
@property (nonatomic,retain)UIActivityIndicatorView *activity;
-(NSString *) genRandStringLength:(int) len ;
-(void)saveImagesFromPickerInTheBackgroundUsingImage:(UIImage *)img;
-(NSArray *)buildFilePaths;
- (IBAction)snapShots:(UIButton *)button;
-(IBAction)byebye:(id)sender;
@end
//=====================
//tester.m
#import "tester.h"
#import "MultiMediaUtilities.h"
@implementation tester
@synthesize imgPicker;
@synthesize pressit,seeya,activity;
//Image size constants
#define MAX_THUMBNAIL_RES_SIZE 103
#define MAX_IMAGE_RES_SIZE 640
- (IBAction)snapShots:(UIButton *)button
{
if (!imgPicker) imgPicker = [[UIImagePickerController alloc]init];
imgPicker.sourceType = UIImagePickerControllerSourceTypeCamera;
imgPicker.delegate = self;
[self presentModalViewController:imgPicker animated:YES];
}
- (void) imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
UIImage *memoImage = [[MultiMediaUtilities scaleAndRotateImage:[info objectForKey:@"UIImagePickerControllerOriginalImage"] toResolution:MAX_IMAGE_RES_SIZE ]retain];
UIImageWriteToSavedPhotosAlbum(memoImage, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
[self saveImagesFromPickerInTheBackgroundUsingImage:memoImage];
// Dismiss the camera
[self dismissModalViewControllerAnimated:YES];
}
//builds paths to files in system with components
-(NSArray *)buildFilePaths
{
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];
NSString *docsPath = [documentsDirectory stringByAppendingPathComponent:@"imagesfolder"];
NSString *fullDocsPath = [docsPath stringByAppendingPathComponent:@"assets"];
NSString *fullThumbsPath = [fullDocsPath stringByAppendingPathComponent:@"thumbs"];
NSArray * retArray = [NSArray arrayWithObjects:fullDocsPath,fullThumbsPath,nil];
return retArray;
}
-(void)saveImagesFromPickerInTheBackgroundUsingImage:(UIImage *)img
{
@try
{
NSFileManager *NSFm = [NSFileManager defaultManager];
NSArray *pathsArray = [NSArray arrayWithArray:[self buildFilePaths]];
NSString *fullDocsPath = [NSString stringWithFormat:@"%@", (NSString *)[pathsArray objectAtIndex:0]];
NSString *fullThumbsPath = [NSString stringWithFormat:@"%@", (NSString *)[pathsArray objectAtIndex:1]];
//Ensure Folders exist
BOOL isDir=YES;
NSError *error;
if(![NSFm fileExistsAtPath:fullDocsPath isDirectory:&isDir])
if(![NSFm createDirectoryAtPath:fullDocsPath withIntermediateDirectories:YES attributes:nil error:&error])
NSLog(#"Error: Create Images folder failed");
//create thumbs folder too
if(![NSFm fileExistsAtPath:fullThumbsPath isDirectory:&isDir])
if(![NSFm createDirectoryAtPath:fullThumbsPath withIntermediateDirectories:YES attributes:nil error:&error])
NSLog(#"Error: Create Thumbs folder failed");
//build the filenames & paths
NSString *newImageName= [NSString stringWithFormat:@"%@.png", [self genRandStringLength:10]];
NSString *imagePath = [[fullDocsPath stringByAppendingPathComponent:newImageName]retain];
NSLog(@"SavingImage imagePath = %@",imagePath);
NSString *thumbPath = [[fullThumbsPath stringByAppendingPathComponent:newImageName]retain];
NSLog(@"SavingImage thumbPath = %@",thumbPath);
//Write the files out
NSData *imgData = UIImagePNGRepresentation(img);
[imgData writeToFile:imagePath options:NSDataWritingAtomic error:&error];
if (!error) {
NSLog(#"Error writing image %#",error.description);
}
NSData *thumbData = UIImagePNGRepresentation(img);
[thumbData writeToFile:thumbPath options:NSDataWritingAtomic error:&error];
if (!error) {
NSLog(#"Error writing thumb %#",error.description);
}
}
@catch (NSException * e)
{
NSLog(@"Exception: %@", e);
}
}
-(NSString *) genRandStringLength:(int) len
{
NSString *letters = #"abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ0123456789";
NSMutableString *randomString = [NSMutableString stringWithCapacity: len];
for (int i=0; i<len; i++)
{
[randomString appendFormat: #"%c", [letters characterAtIndex: rand()%[letters length]]];
}
return randomString;
}
- (void)image:(UIImage*)image didFinishSavingWithError:(NSError *)error contextInfo:(NSDictionary*)info {
NSString *message;
NSString *title;
if (!error)
{
title = #"Camera...";
message = #"Image saved!...Just as well.";
}
else
{
title = #"Error";
message = [error description];
}
UIAlertView *alert = [[UIAlertView alloc]
initWithTitle:title
message:message
delegate:self
cancelButtonTitle:#"OK"
otherButtonTitles:nil];
[alert show];
[alert release];
if (image !=NULL){
[image release];
image=nil;
}
if(info !=NULL)
{
[info release];
info=nil;
}
}
- (void)viewDidUnload
{
[super viewDidUnload];
// Release any retained subviews of the main view.
// e.g. self.myOutlet = nil;
}
-(void)dealloc
{
[imgPicker release];
[pressit release];
[seeya release];
[activity release];
[super dealloc];
}
@end
Even if seeded, this is an inappropriate use of random numbers for generating unique filenames.
Three approaches:
Use an incremented sequence number (1, 2, 3, etc.)
Use a UUID from [[NSProcessInfo processInfo] globallyUniqueString] (see the sketch below)
Use a filename constructed from the date and time.
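For example, the UUID option is essentially a one-liner (a sketch; the .png extension matches the save code above):
//globally unique, so two saves can never collide on the filename
NSString *newImageName = [[[NSProcessInfo processInfo] globallyUniqueString] stringByAppendingPathExtension:@"png"];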
As Mats said, if you don't initialize your random number generator with srand, rand() will produce the same sequence of values on every launch, so don't expect it to give you unique numbers. This can cause the duplicate filenames you are seeing.
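And if you do stick with rand(), seed it once per launch so the sequence at least differs between runs:
//somewhere that runs once, e.g. application:didFinishLaunchingWithOptions:
srand((unsigned int)time(NULL));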