Using Same AudioPlayer Again and Again - just-audio

I want to use the same AudioPlayer instance for each play operation to avoid reloading assets each time. How can I achieve this?
Currently, I've created an AudioPlayer instance like below.
AudioPlayer _player = AudioPlayer();
await _player.setAsset("sounds/hit.mp3");
_player.play();
But after _player.play() runs once, the sound no longer plays on subsequent invocations.

Try invoking _player.stop() before each play to reset the position.
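A minimal sketch of that suggestion, assuming just_audio's AudioPlayer API (the playHit wrapper and initSound are illustrative names, not from the question): after a clip finishes, the player is parked at the end of the stream, so rewind it before replaying.

```dart
import 'package:just_audio/just_audio.dart';

final _player = AudioPlayer();

// Load the asset once, e.g. during app/widget initialization.
Future<void> initSound() async {
  await _player.setAsset("sounds/hit.mp3");
}

// Reusable play routine: stop the player and seek back to the start
// so each call plays the clip from the beginning without reloading it.
Future<void> playHit() async {
  await _player.stop();
  await _player.seek(Duration.zero);
  _player.play();
}
```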

Related

AVPlayer long song buffering issue

I've got an issue with long songs while using AVPlayer.
I have tested it on a 64 min song.
The issue is: when the AVPlayer buffer is full, it stops playback (rate = 0.0f) and stops downloading new timeRanges. When I resume playback manually it plays for some seconds and stops again. I think it continues to download new content into the buffer, but this process is very slow and not suitable for gapless playback.
Is it possible to control this situation to achieve gapless playback?
Am I allowed to modify loaded time ranges (clean the buffer) during playback?
Am I allowed to increase buffer size?
Are you running it on the main thread? Try to do something like this:
#include <dispatch/dispatch.h>

dispatch_queue_t playQueue = dispatch_queue_create("com.example.playqueue", NULL);
AVAudioPlayer *player = ...
dispatch_async(playQueue, ^{
    [player play];
});
If that doesn't work I'd suggest giving OpenAL a try.

multi track mp3 playback for iOS application

I am doing an application that involves playing back a song in a multi track format (drums, vocals, guitar, piano, etc...). I don't need to do any fancy audio processing to each track, all I need to be able to do is play, pause, and mute/unmute each track.
I had been using multiple instances of AVAudioPlayer, but when performing device testing I noticed that the tracks play very slightly out of sync when they are first played. Furthermore, when I pause and play the tracks, they drift even further out of sync. After a bit of research I've realized that AVAudioPlayer just has too much latency and won't work for my application.
In my application I basically had an NSArray of AVAudioPlayers that I would loop through and play each one or pause/stop each one, I'm sure this is what caused it to get out of sync on the device.
It seemed like Apple's audio mixer would work well for me, but when I try implementing it I get an EXC_BAD_ACCESS error that I can't figure out.
I know the answer is to use OpenAL or Audio Units, but it just seems unnecessary to spend weeks learning about these when all I need to do is play around 5 .mp3 tracks at the same time. Does anyone have any suggestions on how to accomplish this? Thanks
Thanks to admsyn's suggestion I was able to come up with a solution.
AVAudioPlayer has a currentTime property that returns the current time of a track and can also be set.
So I implemented the startSynchronizedPlayback as stated by admsyn and then added the following when I stopped the tracks:
-(void) stopAll
{
    int count = [tracksArr count];
    for (int i = 0; i < count; i++)
    {
        trackModel = [tracksArr objectAtIndex:i];
        // grab the current time from the first track...
        if (i == 0)
        {
            currentTime = [trackModel currentTime];
        }
        [trackModel stop];
        // ...and rewind every track to that same position
        [trackModel setCurrentTime:currentTime];
    }
}
This code basically loops through my array of tracks (each of which holds its own AVAudioPlayer), grabs the current time from the first track, then sets all of the following tracks to that time. Now when I use the startSynchronizedPlayback method they all play in sync, and pausing/unpausing keeps them in sync as well. Hope this is helpful to someone else trying to keep tracks in sync.
If you're issuing individual play messages to each AVAudioPlayer, it is entirely likely that the messages are arriving at different times, or that the AVAudioPlayers finish their warm up phase out of sync with each other. You should be using playAtTime: and the deviceCurrentTime property to achieve proper synchronization. Note the description of deviceCurrentTime:
Use this property to indicate “now” when calling the playAtTime: instance method. By configuring multiple audio players to play at a specified offset from deviceCurrentTime, you can perform precise synchronization—as described in the discussion for that method.
Also note the example code in the playAtTime: discussion:
// Before calling this method, instantiate two AVAudioPlayer objects and
// assign each of them a sound.
- (void) startSynchronizedPlayback {
    NSTimeInterval shortStartDelay = 0.01; // seconds
    NSTimeInterval now = player.deviceCurrentTime;
    [player playAtTime: now + shortStartDelay];
    [secondPlayer playAtTime: now + shortStartDelay];
    // Here, update state and user interface for each player, as appropriate
}
If you are able to decode the files to disk, then audio units are probably the solution which would provide the best latency. If you decide to use such an architecture, you should also check out Novocaine:
https://github.com/alexbw/novocaine
That framework takes a lot of the headache out of dealing with audio units.

NSUserNotification with custom soundName

Did anyone manage to make a NSUserNotification soundName to work with a custom sound?
I tried with aif and caf formats, 44100 Hz, 16-bit, 2 seconds in duration. The notification is displayed at the proper time, with the right title and text, but the default sound gets played instead of my custom sound.
The sound files are correctly copied in the application bundle.
If I try this the sounds work ok:
NSSound *sound = [NSSound soundNamed:@"morse.aif"];
[sound play];
But when I use the same sound in my notification, the default notification sound gets played:
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification
{
    // Insert code here to initialize your application
    NSUserNotification *notification = [[NSUserNotification alloc] init];
    notification.title = @"Titolo";
    notification.deliveryDate = [NSDate dateWithTimeIntervalSinceNow:10];
    notification.soundName = @"morse.aif";
    [[NSUserNotificationCenter defaultUserNotificationCenter] scheduleNotification:notification];
}
I tried with and without extension, but with no success.
notification.soundName = @"morse.aif";
notification.soundName = @"morse2.caf";
notification.soundName = @"morse";
none of these work.
My application is not signed and not sandboxed, but I don't think that's necessary for user notifications, and apart from the sound problem the notifications work great.
It seems to me this issue is case sensitivity. Using notification.soundName = @"Morse"; works perfectly for me. As you can see in the NSSound documentation for soundNamed:, the search folders for sounds are, in order:
~/Library/Sounds
/Library/Sounds
/Network/Library/Sounds
/System/Library/Sounds
If you look in the last one, which is probably where you're trying to pull from since those are the sounds in System Preferences, you can see their names.
So keep the case of the file, omit the extension, and it should work as expected.
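For instance, a sketch assuming one of the built-in system sounds (note the capitalized name and the missing extension):

```objc
NSUserNotification *notification = [[NSUserNotification alloc] init];
notification.title = @"Title";
// Matches /System/Library/Sounds/Morse.aiff: keep the file's case, drop the extension.
notification.soundName = @"Morse";
[[NSUserNotificationCenter defaultUserNotificationCenter] deliverNotification:notification];
```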
If you have multiple versions of your app binary, Notification Center may be searching the wrong binary for your sound files.
If you make sure to delete any old copies of the binary it should fix the issue.
From the Apple Dev Forums: https://devforums.apple.com/message/708511
This can happen if you have multiple versions of your app binary floating around. NotificationCenter only finds one of them. If it is the one without the sound then it will not work.

iOS: Multi-Threading issues when loading sounds in the background

I have inherited some code that is in need of a bit of a tidy up. The app has a handful of core sounds, and currently there are lots of AVAudioPlayer instances attached to various ViewControllers, all playing the same few sounds.
As part of the refactoring exercise, I have decided to implement a singleton class called SoundController. This class will contain one AVAudioPlayer for each sound that needs playing, and instead of each ViewController instantiating their own, they can easily make use of just the one:
[[SoundController controller].majorFunctionSound playAtTime:0];
Another important thing is to ensure that prepareToPlay has been called on all the sounds prior to them being used, to minimise any delays. Since there are only a handful of sounds and they are all likely to be used during any user session, it makes sense to preload all the sounds (on a background thread) when SoundController is first instantiated. In the init method I have something like this:
dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
    NSURL *soundURL = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/MAJOR FUNCTION.m4a",
                                              [[NSBundle mainBundle] resourcePath]]];
    _majorAudioSound = [[AVAudioPlayer alloc] initWithContentsOfURL:soundURL error:nil];
    [_majorAudioSound prepareToPlay];
});
majorAudioSound is (readonly, strong). @synthesize majorAudioSound = _majorAudioSound.
My concerns are around how poorly (or well) this is going to work in terms of concurrency and what I can do to improve the code. Specifically, if I do this:
[[SoundController controller].majorFunctionSound playAtTime:0];
There is clearly a chance that majorFunctionSound won't be initialized properly, depending on whether the background initialisation block has completed yet. Is the worst that can happen that the property returns nil and the sound simply doesn't play?
What other issues might there be? Is there a way to always ensure that the AVAudioPlayer has been properly set up?
First of all, I'd ask you to reconsider whether your class has to be a singleton just because you intend to have only one instance of it. It is of course one way to do it, but I think your initialization issues stem from the decision to use a singleton class.
Let's say you have a class called SoundManager and you have made it a singleton class.
When you ask for the instance of SoundManager anywhere in your app, you will want to assume that the returned instance is ready to be used immediately. If SoundManager has an init method that does asynchronous work, then you have a design issue, since you should never have to know, when you ask for a singleton, whether this is the first time it has been initialized.
Since SoundManager requires initialization, I would let my app hold an instance of SoundManager in some kind of base class that takes care of the application's flow, instead of making it a singleton. Either you could let your AppDelegate instantiate the one and only SoundManager, or you could have a class called ApplicationController or something similar, where you load all the stuff the app needs when it initializes. You can then reach the SoundManager instance via this controller class by passing a reference, or by letting your ApplicationController be a singleton. Of course this also works if SoundManager is a singleton, as long as you make sure you initialize it on startup, but I prefer to have as few singleton classes as possible.
Now to your question about knowing if your sounds are loaded or not.
I would recommend that you load all sounds before you let the user start using the app. In the meanwhile you could show something to the user, like a loading screen or a progress bar, and play sounds/music if you wish. Here is an example of a structure:
Create a class called SoundManager with a property "loaded" that is false from start
Create a class called ApplicationController that instantiates the SoundManager, and other useful classes you might have, like TextureManager or LocationManager etc.
When the app starts, instantiate ApplicationController, which in turn instantiates SoundManager.
Show a loading screen
Let SoundManager load the "loading sound" first and once it is loaded, start playing it
When the loading of sounds is completed, set the "loaded" property to true
When ApplicationController has loaded everything, let the user start using the app by fading out the loading screen.
If you need the user to start using the app even before the sounds are loaded, then you can still use the same approach by having a property called "loaded". Remember to keep the handling of this property synchronized.
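A rough sketch of that "loaded" flag, assuming a SoundManager shaped like the steps above (the class, property, and completion-block names are illustrative, not the poster's actual code): load on a background queue, then flip the flag and notify back on the main thread so readers see a consistent value.

```objc
@interface SoundManager : NSObject
// Atomic so readers on other threads always see a consistent value.
@property (atomic, assign, getter=isLoaded) BOOL loaded;
- (void)loadSoundsWithCompletion:(void (^)(void))completion;
@end

@implementation SoundManager

- (void)loadSoundsWithCompletion:(void (^)(void))completion
{
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // ... create the AVAudioPlayers and call prepareToPlay on each ...
        dispatch_async(dispatch_get_main_queue(), ^{
            self.loaded = YES;            // flip the flag back on the main thread
            if (completion) completion(); // e.g. fade out the loading screen
        });
    });
}

@end
```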

Best way to wait app Launch

I have an app that requests some data in XML from a server, and that works fine.
But this action seems to slow down the app a little on launch.
TBXML *tbxml = nil;
tbxml = [[TBXML tbxmlWithURL:[NSURL URLWithString:@"http://www.someplace.com/test/test.xml"]] retain];
So what I figured is to make the app finish launching completely before doing this action. I've searched for it and found two candidates:
applicationDidFinishLaunching;
awakeFromNib;
I don't know if either is the correct way to do this, so I'm open to suggestions.
Thanks!
First, I would suggest putting that XML loading code on a separate thread using NSOperationQueue or NSThread, so that it won't block the main thread.
applicationDidFinishLaunching, IMHO, should be used to initialize your RootViewController, handle incoming push notifications, local notifications, etc. Use viewDidLoad in RootViewController for your purpose.
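Something along these lines, reusing the TBXML call from the question (the URL is the poster's placeholder) from the root view controller's viewDidLoad, so the fetch runs off the main thread after launch:

```objc
- (void)viewDidLoad
{
    [super viewDidLoad];
    // In real code, keep a strong reference to the queue (e.g. an ivar).
    NSOperationQueue *queue = [[NSOperationQueue alloc] init];
    [queue addOperationWithBlock:^{
        // Network fetch and parse happen off the main thread.
        TBXML *tbxml = [TBXML tbxmlWithURL:
            [NSURL URLWithString:@"http://www.someplace.com/test/test.xml"]];
        [[NSOperationQueue mainQueue] addOperationWithBlock:^{
            // Back on the main thread: update the UI with the parsed data.
        }];
    }];
}
```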