I was wondering if there is a way to retrieve a user's ringtones or sounds. I am not interested in their music playlist, but rather in 30-second clips of audio that could work for UILocalNotifications. Does anyone know how to do this? Thanks!
See this answer.
how to get the system ringtones programmatically in ios?
If you still want to fire an audio notification, this article should get you started.
It seems you can assign a filename to the soundName property, or pass UILocalNotificationDefaultSoundName for (what I would guess is) the user's selected notification sound.
- (void)scheduleNotificationWithItem:(ToDoItem *)item interval:(int)minutesBefore {
    UILocalNotification *localNotif = [[UILocalNotification alloc] init];
    localNotif.soundName = UILocalNotificationDefaultSoundName;
    ...
    [localNotif release];
}
I've been trying to work out this problem for a good 48 hours now and haven't come up with anything. I have two AVPlayer objects playing different HTTP live streams. Obviously, I don't want them both playing audio at the same time, so I need a way to mute one of the videos.
Apple suggests this for muting an audio track playing in AVPlayer...
NSMutableArray *allAudioParams = [NSMutableArray array];
for (AVPlayerItemTrack *track in [_playerItem tracks]) {
    if ([track.assetTrack.mediaType isEqualToString:AVMediaTypeAudio]) {
        AVMutableAudioMixInputParameters *audioInputParams = [AVMutableAudioMixInputParameters audioMixInputParameters];
        [audioInputParams setVolume:0.0 atTime:CMTimeMakeWithSeconds(0, 1)];
        [audioInputParams setTrackID:[track.assetTrack trackID]];
        [allAudioParams addObject:audioInputParams];

        // Added to what Apple suggested
        [track setEnabled:NO];
    }
}

AVMutableAudioMix *audioZeroMix = [AVMutableAudioMix audioMix];
[audioZeroMix setInputParameters:allAudioParams];
[_playerItem setAudioMix:audioZeroMix];
When this didn't work (after many iterations), I found the enabled property of AVPlayerItemTrack and tried setting that to NO. Also nothing. This doesn't even register as doing anything, because when I try NSLog(@"%x", track.enabled), it still shows up as 1.
I'm at a loss and I can't think of another piece of documentation I can read and re-read to get a good answer. If anyone out there can help, that would be fantastic.
*Update: I got hold of Apple and, according to the AVFoundation team, it is impossible to mute or disable a track of an HLS video. I personally feel this is a bug, so I submitted a bug report (you should do the same to tell Apple that this is a problem). You can also try to submit a feature enhancement request via their feedback page.
New iOS 7 answer: AVPlayer now has two new properties, volume and muted. Use those!
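For example, a minimal sketch of that (assuming self.player holds one of your two AVPlayer instances):

self.player.muted = YES;    // silence this player entirely (iOS 7+)
self.player.volume = 0.0f;  // or just turn its volume down without muting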
And here is the original answer for life before iOS 7:
I've been dealing with the same thing. We created muted streams and streams with audio. To mute or unmute you call [player replaceCurrentItemWithPlayerItem:muteStream].
I also submitted a bug report. It looks like AVPlayer has this functionality on Mac OS X 10.7, but it hasn't made it to iOS yet.
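For illustration, a rough sketch of that item-swapping approach (audioStreamURL and muteStreamURL are hypothetical URLs pointing at the normal rendition and a silent rendition of the same stream):

AVPlayerItem *audioItem = [AVPlayerItem playerItemWithURL:audioStreamURL];
AVPlayerItem *mutedItem = [AVPlayerItem playerItemWithURL:muteStreamURL];

// "Mute" by swapping in the silent rendition...
[player replaceCurrentItemWithPlayerItem:mutedItem];

// ...and unmute by swapping the normal rendition back in.
[player replaceCurrentItemWithPlayerItem:audioItem];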
AVAudioMix is documented not to work on URL assets here
Of course I tried it anyway, and like you I found it really doesn't work.
The best solution for this would be to actually embed the stream url feed with two audio tracks! One would be with the normal audio and the other audio track would be the muted audio.
It makes more sense to do it this way rather than the way ComPuff suggested, as his way you're actually creating two separate URL streams, which is not required.
Here is the code that you could use to switch the audio tracks:
float volume = 0.0f;
AVPlayerItem *currentItem = self.player.currentItem;
NSArray *audioTracks = self.player.currentItem.tracks;
DLog(@"%@", currentItem.tracks);

NSMutableArray *allAudioParams = [NSMutableArray array];
for (AVPlayerItemTrack *track in audioTracks)
{
    if ([track.assetTrack.mediaType isEqual:AVMediaTypeAudio])
    {
        AVMutableAudioMixInputParameters *audioInputParams = [AVMutableAudioMixInputParameters audioMixInputParameters];
        [audioInputParams setVolume:volume atTime:kCMTimeZero];
        [audioInputParams setTrackID:[track.assetTrack trackID]];
        [allAudioParams addObject:audioInputParams];
    }
}

if ([allAudioParams count] > 0) {
    AVMutableAudioMix *audioMix = [AVMutableAudioMix audioMix];
    [audioMix setInputParameters:allAudioParams];
    [currentItem setAudioMix:audioMix];
}
The only problem is that my stream URL is only showing two tracks (one for video and one for audio) when it should actually have three tracks (two audio tracks). I can't work out if this is a problem with the stream URL or my code! Can anyone spot any mistakes in the code?
When the application is in a minimized state and notifications come in at the same time, badges should be shown on the app icon.
If a notification arrives and your application is not in the foreground, the OS handles the notification.
Your notification can have a badge field that will make the OS update the badge. However, this means that the server that sends the notification must have a way of knowing which number the badge should be.
The notification body would look like this:
{
"aps" : {
"badge" : 9
}
}
Here's my solution. I needed to achieve the same thing in an app that I did. I had a queue of downloads and wanted to use a badge to show how many downloads were left, and to keep updating it even while in the background. Basically, my solution was: every time one of the downloads completed, I set up a UILocalNotification with a silent sound and no message text, and simply set the badge, as seen below.
- (void)queRequestFinished:(ASIHTTPRequest *)request {
    self.inResourceCount -= 1; // Deduct 1 from the total count of downloads in the queue.

    Class cls = NSClassFromString(@"UILocalNotification");
    if (cls != nil) {
        UILocalNotification *notif = [[cls alloc] init];
        notif.fireDate = [NSDate date]; // Schedule notification for now.
        notif.timeZone = [NSTimeZone defaultTimeZone];
        notif.soundName = @"silence.caf";
        notif.applicationIconBadgeNumber = inResourceCount; // Number you want displayed as the badge.

        // This is where the magic happens, and actually changes your badge.
        [[UIApplication sharedApplication] scheduleLocalNotification:notif];
        [notif release];
    }
}
I'd like to point out that my scenario may be different from yours. I was using the ASIHTTPRequest library, which has support for continuing downloads while backgrounded, and the method above, queRequestFinished:, is called even while in the background. Hope this helps; if it does, mark it as the answer :).
I am trying to set custom local notification sound like this:
UILocalNotification *notification = [UILocalNotification new];
notification.timeZone = [NSTimeZone systemTimeZone];
notification.fireDate = date;
notification.alertAction = @"123";
notification.alertBody = @"123";
//!!!
notification.soundName = @"Glass.aiff";

alarmID = [NSString stringWithFormat:@"%i", arrayAlarms.count];
NSDictionary *infoDict = [NSDictionary dictionaryWithObject:alarmID forKey:@"id"];
notification.userInfo = infoDict;
notification.repeatInterval = NSWeekCalendarUnit;

[[UIApplication sharedApplication] scheduleLocalNotification:notification];
But I hear only the default sound. I am testing the app on iOS 5. Many thanks for your help in advance, and sorry for my English.
Custom notification sounds are restricted to less than 30 seconds. If your sound is longer than this, iOS will play the default sound instead. Could this be your problem?
For more info see here. https://developer.apple.com/library/ios/documentation/NetworkingInternet/Conceptual/RemoteNotificationsPG/Chapters/IPhoneOSClientImp.html
"Custom sounds must be under 30 seconds when played. If a custom sound is over that limit, the default system sound is played instead."
Check your syntax, it's wrong. Check with this:
notification.soundName = "Glass.aiff";
One can still use UILocalNotificationDefaultSoundName for default local notification sound.
notification.soundName = #"Glass.aiff";
That should work. Make sure the sound (Glass.aiff) is actually in your app's bundle and is under 30 seconds. We can't set a custom sound that is not present in the app's bundle, i.e. from the Documents folder, etc.
Are there any delegate methods in the AVPlayer class? I need to handle interruptions such as phone calls, the way AVAudioPlayer supports. If AVPlayer doesn't support this, how can I stream audio with AVAudioPlayer?
AVPlayer doesn't have the methods you want, but you can use an AVAudioSession object instead:
1) Select AVAudioSession object (for example [AVAudioSession sharedInstance])
2) Set it active by calling setActive:error: method
3) Set its delegate (class implementing AVAudioSessionDelegate protocol)
4) Implement delegate's methods such as
-(void)beginInterruption;
-(void)endInterruptionWithFlags:(NSUInteger)flags;
-(void)endInterruption;
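Put together, a rough sketch of steps 1–4 (self.player is assumed to be your AVPlayer, and the class is assumed to adopt AVAudioSessionDelegate; note that this delegate API was later deprecated in favour of interruption notifications):

// e.g. in viewDidLoad
- (void)viewDidLoad {
    [super viewDidLoad];
    AVAudioSession *session = [AVAudioSession sharedInstance];
    [session setActive:YES error:nil];
    session.delegate = self; // self adopts AVAudioSessionDelegate
}

#pragma mark - AVAudioSessionDelegate

- (void)beginInterruption {
    // A phone call (or similar) has started: pause playback and remember any state you need.
    [self.player pause];
}

- (void)endInterruptionWithFlags:(NSUInteger)flags {
    if (flags & AVAudioSessionInterruptionFlags_ShouldResume) {
        [self.player play]; // safe to resume playback
    }
}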
EDIT
I don't see any delegates available in the AVPlayer class.
As for how to stream audio with AVAudioPlayer: since we don't know how you need to stream it, and, most importantly, from where, here is some inspiration.
See related questions:
stopping an AVAudioPlayer
Reusing an AVAudioPlayer for a different sound
avaudioplayer playingsong
Streaming with an AVAudioplayer
http://blog.guvenergokce.com/avaudioplayer-on-iphone-simulator/57/
http://www.iphonedevsdk.com/forum/iphone-sdk-development/15991-sample-code-avaudioplayer.html
and tutorial
http://mobileorchard.com/easy-audio-playback-with-avaudioplayer/
AVAudioPlayerDelegate Protocol Reference http://developer.apple.com/library/ios/#documentation/AVFoundation/Reference/AVAudioPlayerDelegateProtocolReference/Reference/Reference.html#//apple_ref/doc/uid/TP40008068
Responding to Sound Playback Completion
– audioPlayerDidFinishPlaying:successfully:
Responding to an Audio Decoding Error
– audioPlayerDecodeErrorDidOccur:error:
Handling Audio Interruptions
– audioPlayerBeginInterruption:
– audioPlayerEndInterruption:
– audioPlayerEndInterruption:withFlags:
I don't think AVPlayer will get you there. Take a look at AVAudioPlayerDelegate; audioPlayerBeginInterruption: would be the delegate method you are looking for.
Here's a sample of code I use for AVAudioPlayer (I'm assuming you already know how to build your url):
// Instantiate the AVAudioPlayer object, initializing it with the sound
NSError *errAV = nil;
AVAudioPlayer *newPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:mUrl error:&errAV];
if (newPlayer == nil) {
    NSString *msg = [[NSString alloc] initWithFormat:@"An internal error has occurred: %@", [errAV localizedDescription]];
    UIAlertView *uiav = [[UIAlertView alloc] initWithTitle:@"Play Sound"
                                                   message:msg
                                                  delegate:nil
                                         cancelButtonTitle:@"OK"
                                         otherButtonTitles:nil];
    [uiav show];
    [uiav release];
    [msg release];
} else {
    self.appSoundPlayer = newPlayer;
    [newPlayer release];

    // "Preparing to play" attaches to the audio hardware and ensures that playback
    // starts quickly when the user taps Play
    [appSoundPlayer prepareToPlay];
    [appSoundPlayer setVolume:1.0];
    [appSoundPlayer setDelegate:self];
    [appSoundPlayer play];
}
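With the delegate set to self, a minimal sketch of the interruption handlers could look like this (the method names come from AVAudioPlayerDelegate; the resume logic is just an illustration):

- (void)audioPlayerBeginInterruption:(AVAudioPlayer *)player {
    // A phone call or alarm has interrupted playback; the player is already paused here.
    // Save whatever state you need so you can resume later.
}

- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player withFlags:(NSUInteger)flags {
    if (flags & AVAudioSessionInterruptionFlags_ShouldResume) {
        [player play]; // resume where we left off
    }
}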
Even when using AVAudioPlayer, you can initialize an audio session, wherein you can specify the kind of playback (or recording, for that matter) you will be doing, and a callback for handling interruptions like phone calls.
Have a look at AudioSessionInitialize() and its third parameter, a callback function for handling interruptions. In your callback, you can handle both the start and end of an interruption.
The salient difference here, between using an audio session and relying on the AVAudioPlayer callbacks, is that the former occurs at a lower level, perhaps before the latter's delegate methods are called. So with the audio session callback you have finer control, I think, but then you have to do more, perhaps, depending on the complexity of your app's audio setup.
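For reference, a rough sketch of that C-level setup (the listener function name is a placeholder, and passing self as the client data is just one possible choice):

#import <AudioToolbox/AudioToolbox.h>

// Interruption callback; called by the system at the start and end of an interruption.
static void MyInterruptionListener(void *inClientData, UInt32 inInterruptionState) {
    if (inInterruptionState == kAudioSessionBeginInterruption) {
        // Phone call started: pause your player, save state.
    } else if (inInterruptionState == kAudioSessionEndInterruption) {
        // Interruption over: reactivate the session and resume if appropriate.
        AudioSessionSetActive(true);
    }
}

// Call once, early on (e.g. in application:didFinishLaunchingWithOptions:):
AudioSessionInitialize(NULL, NULL, MyInterruptionListener, self);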
It has been a long while since the question was posted. However, for the sake of completion, I would like to add: AVPlayer can be used to handle interruptions by adding a TimeObserver as follows:
When initialising the AVPlayer:
AVPlayer *_aplayer;
id _aplayerObserver;

_aplayer = [[AVPlayer alloc] initWithURL:mediaURL];
_aplayerObserver = [_aplayer addPeriodicTimeObserverForInterval:CMTimeMake(1, 1) queue:NULL usingBlock:^(CMTime time)
{
    if ((time.value / time.timescale) >= (_aplayer.currentItem.asset.duration.value / _aplayer.currentItem.asset.duration.timescale))
    {
        // media file played to its end
        // you can add here code that should run after the media file is completed,
        // thus mimicking AVAudioPlayer's audioPlayerDidFinishPlaying event
    }
    else
    {
        if (_aplayer.rate == 0) {
            // audio player was interrupted
        }
    }
}];
If you choose this solution, please take note of what addPeriodicTimeObserverForInterval's documentation says:
You must retain the returned value [i.e. _aplayerObserver] as long as you want the time observer to be invoked by the player. Each invocation of this method should be paired with a corresponding call to removeTimeObserver:.
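So when you are done with the player (for instance, in dealloc or when tearing down the screen), something like this keeps that contract (assuming the two ivars above):

if (_aplayerObserver != nil) {
    [_aplayer removeTimeObserver:_aplayerObserver];
    _aplayerObserver = nil;
}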
What I want to do: first of all, I have many classes, and they all have the same music throughout, which I set up in the app delegate's application:didFinishLaunchingWithOptions: method. But in my last three classes I want different music, so, fair enough, I put in these lines of code:
[(Smart2AppDelegate *)[UIApplication sharedApplication].delegate pauseAudioPlayer];
[(Smart2AppDelegate *)[UIApplication sharedApplication].delegate newAudioPlayer];
And in my app delegate:
-(void)newAudioPlayer {
    NSString *music = [[NSBundle mainBundle] pathForResource:@"win" ofType:@"m4a"];
    self.audio = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:music] error:NULL];
    audio.delegate = self;
    [audio play];
    audio.numberOfLoops = -1;
}

-(void)pauseAudioPlayer {
    [audio pause];
}
So it works: whenever I go to that view, it changes the music. Let's call that view "view X". From view X I can go to and from only two other views; e.g. I can go to the info page, which has a back button that leads back to view X, and the same with a prize page. But when I go back to view X, the music starts from the beginning, whereas in these three classes I want it to keep looping and not restart, because it sounds awkward. The reason is simple: it's because I put it in viewDidLoad. But how can I fix this? I was thinking of a way to actually group classes and put the AVAudioPlayer logic in there.
Here you have a possible approach:
Refactor the music and sound mechanisms out into a separate class, i.e. something like Smart2AudioPlayer.
Expose the necessary methods (play, pause, resume) so you can use the audio player from anywhere.
In each viewDidLoad method, call the audio player and pass along a parameter to indicate who the sender is (who wants the sound to be played) and a preference indicating whether you wish to continue playing the current group song (see below) or start it over again.
Implement the necessary logic in your audio player to allow certain groups of classes to be associated together. This way, when you are playing a song from a class that belongs to a group, and another class of the same group asks for the music to be played, you won't start the song again; you simply do nothing. A rough sketch of this idea follows.
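A minimal sketch of that shared-player idea (the class name Smart2AudioPlayer comes from the suggestion above; the group identifier and method names are made up for illustration, and it keeps the manual retain/release style of the rest of the code):

// Smart2AudioPlayer.h
#import <AVFoundation/AVFoundation.h>

@interface Smart2AudioPlayer : NSObject {
    AVAudioPlayer *_player;
    NSString *_currentGroup;
}
+ (Smart2AudioPlayer *)sharedPlayer;
- (void)playSound:(NSString *)soundName group:(NSString *)group;
- (void)pause;
@end

// Smart2AudioPlayer.m
@implementation Smart2AudioPlayer

+ (Smart2AudioPlayer *)sharedPlayer {
    static Smart2AudioPlayer *shared = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        shared = [[Smart2AudioPlayer alloc] init];
    });
    return shared;
}

- (void)playSound:(NSString *)soundName group:(NSString *)group {
    // Same group and already playing: do nothing, so the music keeps looping.
    if ([_currentGroup isEqualToString:group] && _player.playing) {
        return;
    }
    [_currentGroup release];
    _currentGroup = [group copy];

    NSString *path = [[NSBundle mainBundle] pathForResource:soundName ofType:@"m4a"];
    [_player stop];
    [_player release];
    _player = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL];
    _player.numberOfLoops = -1;
    [_player play];
}

- (void)pause {
    [_player pause];
}

@end

Then every viewDidLoad in a "view X"-style class can call something like [[Smart2AudioPlayer sharedPlayer] playSound:@"win" group:@"prizeGroup"], and the song keeps playing seamlessly when you move between views of the same group.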
Hope this helps