Does anyone know how to play the system sound effects for (drag) copy, paste and delete operations on OS X? I mean the sounds that Finder uses when moving files. I didn't find any API to do that. Does anyone know where those sound files are located? I only found the system alert sounds (Frog, Submarine, etc.).
The system sounds are available in:
/System/Library/Components/CoreAudio.component/Contents/Resources/SystemSounds/
NSSound *systemSound = [[NSSound alloc] initWithContentsOfFile:@"/System/Library/Components/CoreAudio.component/Contents/Resources/SystemSounds/finder/move to trash.aif" byReference:YES];
if (systemSound) {
    [systemSound play];
}
They can also be found in:
/System/Library/Components/CoreAudio.component/Contents/Resources/CoreAudioAUUI.bundle/Contents/Resources
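A minimal sketch that enumerates that directory and plays the first AIFF it finds; the exact file names vary between OS X versions, so none are hard-coded here:

#import <Cocoa/Cocoa.h>

NSString *dir = @"/System/Library/Components/CoreAudio.component/Contents/Resources/CoreAudioAUUI.bundle/Contents/Resources";
NSArray *files = [[NSFileManager defaultManager] contentsOfDirectoryAtPath:dir error:NULL];
for (NSString *file in files) {
    NSString *ext = [[file pathExtension] lowercaseString];
    if ([ext isEqualToString:@"aif"] || [ext isEqualToString:@"aiff"]) {
        NSLog(@"Found sound: %@", file);
        NSSound *sound = [[NSSound alloc] initWithContentsOfFile:[dir stringByAppendingPathComponent:file]
                                                     byReference:YES];
        [sound play];
        break; // play just the first one as a demonstration
    }
}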
Did anyone manage to get an NSUserNotification's soundName to work with a custom sound?
I tried both AIFF and CAF formats, 44100 Hz, 16-bit, 2 seconds long. The notification is displayed at the proper time, with the right title and text, but the default sound gets played instead of my custom sound.
The sound files are correctly copied into the application bundle.
If I try this, the sound plays fine:
NSSound *sound = [NSSound soundNamed:@"morse.aif"];
[sound play];
But when I use the same sound in my notification, the default notification sound gets played:
- (void)applicationDidFinishLaunching:(NSNotification *)aNotification
{
    // Insert code here to initialize your application
    NSUserNotification *notification = [[NSUserNotification alloc] init];
    notification.title = @"Titolo";
    notification.deliveryDate = [NSDate dateWithTimeIntervalSinceNow:10];
    notification.soundName = @"morse.aif";
    [[NSUserNotificationCenter defaultUserNotificationCenter] scheduleNotification:notification];
}
I tried with and without extension, but with no success.
notification.soundName = @"morse.aif";
notification.soundName = @"morse2.caf";
notification.soundName = @"morse";
None of these work.
My application is not signed and not sandboxed, but I don't think that's necessary for user notifications, and apart from the sound problem the notifications work great.
It seems to me like this issue is case sensitivity. Using notification.soundName = @"Morse"; works perfectly for me. As you can see in the NSSound documentation for soundNamed:, the search folders for sounds are, in order:
~/Library/Sounds
/Library/Sounds
/Network/Library/Sounds
/System/Library/Sounds
If you look in the last one, which is probably where you're trying to pull from since they're the sounds in System Preferences, you can see their exact file names.
So keep the case of the file name, omit the extension, and it should work as expected.
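For example, with the built-in sound /System/Library/Sounds/Morse.aiff, or a hypothetical custom file MySound.caf copied into the app bundle:

// Built-in system sound: match the case of Morse.aiff and drop the extension.
notification.soundName = @"Morse";

// Custom sound in the app bundle (hypothetical file MySound.caf): same rule.
notification.soundName = @"MySound";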
If you have multiple versions of your app binary, Notification Center may be searching the wrong one for your sound files.
Deleting any old copies of the binary should fix the issue.
From the Apple Dev Forums: https://devforums.apple.com/message/708511
This can happen if you have multiple versions of your app binary floating around. Notification Center only finds one of them. If it is the one without the sound then it will not work.
I'm working on a Mac OS X application using Cocoa in Xcode. One feature involves simultaneous audio playback:
I want to build a kind of audio stack: an audio file merged at runtime from a set of different source files. The set of source files differs on each run. Each audio file has exactly the same length.
After creating the audio stack (or merged file) I want to play [and store] it.
I'm new to audio frameworks in Cocoa. Is there a high-level API that provides an appropriate functionality? Do I have to look inside the CoreAudio, Audio Unit or QTKit framework? Do you have an implementation idea (or sample implementation)?
If you just want to play a bunch of audio files simultaneously, this is easy; unlike Windows, OS X acts as if it has an infinite-channel mixer built in, and all the higher-level functions just grab a new channel as needed for each sound. For example:
NSURL *u1 = [[NSBundle mainBundle] URLForResource:@"1" withExtension:@"mp3"];
NSURL *u2 = [[NSBundle mainBundle] URLForResource:@"2" withExtension:@"mp3"];
NSSound *s1 = [[NSSound alloc] initWithContentsOfURL:u1 byReference:YES];
NSSound *s2 = [[NSSound alloc] initWithContentsOfURL:u2 byReference:YES];
[s1 play];
[s2 play];
This will start playing MyApp.app/Contents/Resources/1.mp3 and MyApp.app/Contents/Resources/2.mp3 at (very close to) the same time.
If you need a (real or virtual) file with a bunch of audio tracks in them, QTKit is probably the easiest way; create a movie, open all of your audio files, copy their tracks into the movie (with the same start date), and now you can play the movie (or do whatever else you want).
If you want to actually merge the audio into one stereo track (e.g., so you can save a normal audio file), instead of just playing the tracks together at runtime, you could use QTKit as above and create an export session (much like using the "Export…" feature in QuickTime Player), or use CoreAudio, or a variety of different cross-platform open source libraries.
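A rough sketch of the QTKit route, assuming the source files ship in the app bundle; the file names drums/bass/keys and the output path are hypothetical, and error handling is omitted:

#import <QTKit/QTKit.h>

NSError *error = nil;

// Start with an empty, writable movie to act as the "audio stack".
QTMovie *mix = [[QTMovie alloc] initToWritableData:[NSMutableData data] error:&error];

// Hypothetical source files bundled with the app; all the same length.
for (NSString *name in @[@"drums", @"bass", @"keys"]) {
    NSString *path = [[NSBundle mainBundle] pathForResource:name ofType:@"aif"];
    QTMovie *source = [QTMovie movieWithFile:path error:&error];
    // Inserting each whole file at time zero adds its track(s) in parallel,
    // so all the audio starts at the same moment.
    [mix insertSegmentOfMovie:source
                    timeRange:QTMakeTimeRange(QTZeroTime, [source duration])
                       atTime:QTZeroTime];
}

// Play the merged result...
[mix play];

// ...or flatten it out to a single self-contained file.
[mix writeToFile:@"/tmp/stack.mov" withAttributes:@{QTMovieFlatten: @YES}];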
Here is some background information, otherwise skip ahead to the question in bold. I am building an app and I would like it to have access to the remote control/lock screen events. The tricky part is that this app does not play audio itself; it controls the audio of another device nearby. The communication between devices is not a problem when the app is in the foreground. As I just found out, an app does not assume control of the remote controls until it has played audio with a playback audio session, and was the last to do so. This presents a problem because, like I said, the app controls ANOTHER device's audio and has no need to play its own.
My first inclination is to have the app play a silent clip every time it is opened in order to assume control of the remote controls. The fact that I have to do this makes me wonder if I am even going to be allowed to do it by Apple or if there is another way to achieve this without fooling the system with fake audio clips.
QUESTION(S): Will Apple approve an app that plays a silent audio clip in order to assume control of the remote/lock screen controls for the purpose of controlling another device's audio? Is there any way of assuming control of the remote controls without an audio session?
P.S. I would prefer to have this functionality on iOS 4.0 and up.
P.P.S I have seen this similar question and it has gotten me brainstorming but the answer provided is not specific to what I need to know.
NOTE: As of iOS 7.1, you should be using MPRemoteCommandCenter instead of the answer below.
The system-provided MPRemoteCommand objects (and subclasses such as MPSkipIntervalCommand) are exposed as properties of [MPRemoteCommandCenter sharedCommandCenter]; you attach handlers to the commands you want to support.
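A rough sketch of that approach, with a hypothetical controller class and placeholder selector names:

#import <MediaPlayer/MediaPlayer.h>

// Hypothetical controller that owns the remote-control handling.
@interface MyPlaybackController : NSObject
@end

@implementation MyPlaybackController

- (void)setUpRemoteCommands {
    MPRemoteCommandCenter *center = [MPRemoteCommandCenter sharedCommandCenter];
    [center.playCommand addTarget:self action:@selector(remotePlay:)];
    [center.pauseCommand addTarget:self action:@selector(remotePause:)];
}

- (MPRemoteCommandHandlerStatus)remotePlay:(MPRemoteCommandEvent *)event {
    // Forward "play" to whatever actually controls the audio.
    return MPRemoteCommandHandlerStatusSuccess;
}

- (MPRemoteCommandHandlerStatus)remotePause:(MPRemoteCommandEvent *)event {
    // Forward "pause" likewise.
    return MPRemoteCommandHandlerStatusSuccess;
}

@end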
I'm keeping the rest of this around for historical reference, but the following is not guaranteed to work on recent iOS versions. In fact, it just might not.
You definitely do need an audio player but not necessarily an explicit session to take control of the remote control events. (AVAudioSession is implicit to any app that plays audio.) I spent a decent amount of time playing with this to confirm it.
I've seen a lot of confusion on the internet about where to set up the remoteControlReceivedWithEvent: method and various approaches to the responder chain. I know this method works on iOS 6 and iOS 7. Other approaches have not. Don't waste your time handling remote control events in the app delegate (where they used to work) or in a view controller which may go away during the lifecycle of your app.
I made a demo project to show how to do this.
Here's a quick rundown of what has to happen:
You need to create a subclass of UIApplication. When the documentation says UIResponder, it means UIApplication, since your application class is a subclass of UIResponder. In this subclass, you're going to implement the remoteControlReceivedWithEvent: and canBecomeFirstResponder methods. You want to return YES from canBecomeFirstResponder. In the remote control method, you'll probably want to notify your audio player that something's changed.
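A sketch of what that subclass might look like, using the RCApplication name referenced in the later steps; the notification name used to inform the audio player is just a placeholder:

#import <UIKit/UIKit.h>

@interface RCApplication : UIApplication
@end

@implementation RCApplication

- (BOOL)canBecomeFirstResponder {
    return YES;
}

- (void)remoteControlReceivedWithEvent:(UIEvent *)event {
    if (event.type == UIEventTypeRemoteControl) {
        // Let the rest of the app (e.g. the audio player) know something changed.
        [[NSNotificationCenter defaultCenter] postNotificationName:@"RCRemoteControlEventReceived"
                                                            object:event];
    }
}

@end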
You need to tell iOS to use your custom class to run the app, instead of the default UIApplication. To do so, open main.m and change this:
return UIApplicationMain(argc, argv, nil, NSStringFromClass([RCAppDelegate class]));
to look like this:
return UIApplicationMain(argc, argv, NSStringFromClass([RCApplication class]), NSStringFromClass([RCAppDelegate class]));
In my case RCApplication is the name of my custom class. Use the name of your subclass instead. Don't forget to #import the appropriate header.
OPTIONAL: You should configure an audio session. It's not required, but if you don't, audio won't play if the phone is muted. I do this in the demo app's delegate, but do so where appropriate.
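A minimal version of that setup might look like this, assuming AVFoundation; where you call it is up to you:

#import <AVFoundation/AVFoundation.h>

// Minimal audio session setup so playback is not silenced by the ring/silent switch.
NSError *sessionError = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];
[session setCategory:AVAudioSessionCategoryPlayback error:&sessionError];
[session setActive:YES error:&sessionError];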
Play something. Until you do, the remote controls will ignore your app. I just took an AVPlayer and gave it the URL of a streaming site that I expect to be up. If you find that it fails, put your own URL in there and play with it to your heart's content.
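Something along these lines; the URL is a placeholder, so substitute a stream you know is up:

#import <AVFoundation/AVFoundation.h>

// Placeholder URL; substitute your own stream.
NSURL *streamURL = [NSURL URLWithString:@"http://example.com/stream.mp3"];
AVPlayer *player = [AVPlayer playerWithURL:streamURL];
[player play];
// Keep a strong reference to the player (e.g. in a property), or it will be deallocated.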
This example has a little bit more code in there to log out remote events, but it's not all that complicated. I just define and pass around some string constants.
I bet that a silent looping MP3 file would help work towards your goal.
Moshe's solution worked great for me! However, one issue I noticed is that when you pause the audio, the media controls go away and you can't resume without going back into the app. If you set the media info on the lock screen when you play the audio, this won't happen:
NSDictionary *mediaInfo = @{MPMediaItemPropertyTitle: @"My Title",
                            MPMediaItemPropertyAlbumTitle: @"My Album Name",
                            MPMediaItemPropertyPlaybackDuration: [NSNumber numberWithFloat:0.30f]};
[[MPNowPlayingInfoCenter defaultCenter] setNowPlayingInfo:mediaInfo];
I am using the iPad Settings app to change some button sounds and a background image. It all works well and the settings are maintained from one app launch to another in the simulator. Now I have implemented a toggle switch to turn sets of sounds off or on. Whatever state the switch is in when the app launches works; e.g. if the "Alert Sounds" switch is OFF the alert sounds are silent, and if I change it to ON the sounds start working. However, if I then turn the switch back OFF the sounds keep playing. Likewise, if the state is ON when the app launches, the sounds work but are not silenced when the switch is set to OFF.
Note that this is different than the settings not taking effect until a second round of settings. That was a previous problem I solved (thanks to stack overflow) by using:
- (void)applicationDidBecomeActive:(UIApplication *)application
{
    [[NSUserDefaults standardUserDefaults] synchronize];
}
I have methods named:
- (void)defaultsChanged:(NSNotification *)NSUserDefaultsDidChangeNotification
(which is called when the notification is sent)
and
-(void)setValuesFromPreferences
(which is called in ViewDidLoad)
The logic looks like this in both:
// Set alert sounds from preferences
NSString *alertSoundPreference = [userDefaults stringForKey:kAlertSound];
BOOL alertSoundEnabled = [userDefaults boolForKey:kAlertSoundEnabled];
if (alertSoundEnabled)
{
// Create the URLs for the alert audio files
// Store the alert sound URLs as a CFURLRef instances
// Create system sound objects representing the alert sound files
}
I do not have an else, because I assume that no sound resources will be specified if alertSoundEnabled is NO.
I have searched for explanations and tutorials that mention this problem but have not found any yet, so I'm asking here. Thanks for any suggestions.
viewDidLoad is not necessarily called when the app becomes active again (nor are viewWillAppear/viewDidAppear, IIRC), as the whole point of iOS 4+ multitasking is to avoid this kind of unloading and recreation of objects on app switching.
If I had to guess, the sounds are already allocated when the user had the switch ON at original launch/viewDidLoad; however, if your code does nothing to explicitly disassociate them when it loads back up, they would continue playing, as they are all already set up.
As such, I'd try adding an else clause that (upon alertSoundEnabled == NO) destroys your system sound objects.
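A sketch of what that could look like, assuming the sounds are created with AudioServicesCreateSystemSoundID and kept in a hypothetical instance variable declared as SystemSoundID _alertSoundID:

#import <AudioToolbox/AudioToolbox.h>

BOOL alertSoundEnabled = [userDefaults boolForKey:kAlertSoundEnabled];
if (alertSoundEnabled)
{
    // Create the system sound object as before.
    NSURL *alertURL = [[NSBundle mainBundle] URLForResource:alertSoundPreference
                                              withExtension:@"caf"];
    AudioServicesCreateSystemSoundID((__bridge CFURLRef)alertURL, &_alertSoundID);
}
else if (_alertSoundID != 0)
{
    // Tear the sound down so the OFF setting actually silences it.
    AudioServicesDisposeSystemSoundID(_alertSoundID);
    _alertSoundID = 0;
}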
I have a custom video player set up with custom controls, and I utilize MPVolumeView to provide an airplay button. When a user chooses to use AirPlay, they interact with that Apple UI and there is no event (that I can find) that says "hey, the video is now playing over AirPlay".
The problem is that, if I close the player and reopen it, it loads the movie (load state changes to MPMovieLoadStatePlayable), I play it, and I immediately get a playback did finish notification with reason being MPMovieFinishReasonPlaybackEnded, and the video continues to try to play through AirPlay. I'm certain the movie stops and is deallocated whenever I close the player.
If anyone has any advice on how to handle this, knows some events to listen for, or has any ideas about this whatsoever, please let me know. Thanks!
The answer here turns out to be that, at least up to 4.3, there is no way to get an answer to this through code.
The problem in this case is how you dispose of the MPMoviePlayerController when you're finished with it. Even if the video plays through, before you finally release it, you have to call pause and then stop. Like this:
MPMoviePlayerController *mp = [[MPMoviePlayerController alloc] init];
// use the player. then when done with it:
[mp pause];
[mp stop];
[mp release];
If you don't do this, then the next time you create an MPMoviePlayerController, certain properties are somehow ghosted in the framework. Playing a video progressively caused audio from the previous mp to play while the new mp did its initial buffering. Also, if the previous video was playing over AirPlay, the next video would get a notification that the video finished right after it started, along with some other weirdness.
Long story short, dispose of your video players with the above sequence to avoid issues with later movie players.