iOS: read the iPod music title when the track changes - NSNotificationCenter

I want to get the title and artist name of the song currently playing in the iPod music player, but I also want to detect when the track changes so the UILabels are updated.
Here is the code:
- (IBAction)getTitlemusic
{
// MPMediaItem * currentItem = self.musicPlayer.nowPlayingItem;
MPMediaItem * song = [[MPMusicPlayerController iPodMusicPlayer] nowPlayingItem];
NSString * album = [song valueForProperty:MPMediaItemPropertyAlbumTitle];
NSString * title = [song valueForProperty:MPMediaItemPropertyTitle];
NSString * artist = [song valueForProperty:MPMediaItemPropertyArtist];
NSLog(@"%@, %@, %@", album, title, artist);
songlabel.text = [NSString stringWithFormat:@"%@", album];
arttistlabel.text = [NSString stringWithFormat:@"%@\r%@", title, artist];
[self performSelector:@selector(getTitlemusic) withObject:nil afterDelay:10.0f];
}
I know the timer and delay approach isn't a very good idea.

MPMusicPlayerController *player = [MPMusicPlayerController iPodMusicPlayer];
[[NSNotificationCenter defaultCenter] addObserver:self
selector:@selector(getTitlemusic)
name:MPMusicPlayerControllerNowPlayingItemDidChangeNotification
object:player];
[player beginGeneratingPlaybackNotifications];
This is the code I am using.
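For reference, here is a minimal sketch of the notification-based version, reusing the songlabel and arttistlabel outlets from the code above; the nowPlayingChanged: method name is my own placeholder, and the observer should be removed when the view goes away:

#import <MediaPlayer/MediaPlayer.h>

- (void)viewDidLoad {
    [super viewDidLoad];
    MPMusicPlayerController *player = [MPMusicPlayerController iPodMusicPlayer];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(nowPlayingChanged:)
                                                 name:MPMusicPlayerControllerNowPlayingItemDidChangeNotification
                                               object:player];
    [player beginGeneratingPlaybackNotifications];
    [self nowPlayingChanged:nil]; // show the current track right away
}

- (void)nowPlayingChanged:(NSNotification *)notification {
    MPMediaItem *song = [[MPMusicPlayerController iPodMusicPlayer] nowPlayingItem];
    NSString *album  = [song valueForProperty:MPMediaItemPropertyAlbumTitle];
    NSString *title  = [song valueForProperty:MPMediaItemPropertyTitle];
    NSString *artist = [song valueForProperty:MPMediaItemPropertyArtist];
    songlabel.text = album ?: @"";
    arttistlabel.text = [NSString stringWithFormat:@"%@\r%@", title ?: @"", artist ?: @""];
}

- (void)dealloc {
    [[MPMusicPlayerController iPodMusicPlayer] endGeneratingPlaybackNotifications];
    [[NSNotificationCenter defaultCenter] removeObserver:self];
    // under manual retain/release, also call [super dealloc] here
}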

Related

AVAudioPCMBuffer for music files

I've been trying to play music in my SpriteKit game and used the AVAudioPlayerNode class to do so via AVAudioPCMBuffers. Every time I exported my OS X project, it would crash and give me an error regarding audio playback. After banging my head against the wall for the last 24 hours I decided to re-watch WWDC session 501 (see 54:17). My solution to this problem was what the presenter used, which is to break the frames of the buffer into smaller pieces to break up the audio file being read.
NSError *error = nil;
NSURL *someFileURL = ...
AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading: someFileURL commonFormat: AVAudioPCMFormatFloat32 interleaved: NO error:&error];
const AVAudioFrameCount kBufferFrameCapacity = 128 * 1024L;
AVAudioFramePosition fileLength = audioFile.length;
AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat: audioFile.processingFormat frameCapacity: kBufferFrameCapacity];
while (audioFile.framePosition < fileLength) {
AVAudioFramePosition readPosition = audioFile.framePosition;
if (![audioFile readIntoBuffer: readBuffer error: &error])
return NO;
if (readBuffer.frameLength == 0) //end of file reached
break;
}
My current problem is that the player only plays the last frame read into the buffer. The music that I'm playing is only 2 minutes long. Apparently, this is too long to just read into the buffer outright. Is the buffer being overwritten every time the readIntoBuffer: method is called inside the loop? I'm such a noob at this stuff...how can I get the entire file played?
If I can't get this to work, what is a good way to play music (2 different files) across multiple SKScenes?
This is the solution that I came up with. It's still not perfect, but hopefully it will help someone who is in the same predicament that I've found myself in. I created a singleton class to handle this job. One improvement that can be made in the future is to only load sound effects and music files needed for a particular SKScene at the time they are needed. I had so many issues with this code that I don't want to mess with it now. Currently, I don't have too many sounds, so it's not using an excessive amount of memory.
Overview
My strategy was the following:
Store the audio file names for the game in a plist
Read from that plist and create two dictionaries (one for music and one for short sound effects)
The sound effect dictionary is composed of an AVAudioPCMBuffer and an AVAudioPlayerNode for each of the sounds
The music dictionary is composed of an array of AVAudioPCMBuffers, an array of timestamps for when those buffers should be played in the queue, an AVAudioPlayerNode and the sample rate of the original audio file
The sample rate is necessary for figuring out the time at which each buffer should be played (you'll see the calculations done in code)
Create an AVAudioEngine and get the main mixer from the engine and attach all AVAudioPlayerNodes to the mixer (as per usual)
Play sound effects or music using their various methods
sound effect playing is straightforward: call the method -(void) playSfxFile:(NSString*)file; and it plays a sound
for music, I just couldn't find a good solution without invoking the help of the scene trying to play the music. The scene will call -(void) playMusicFile:(NSString*)file; and it will schedule the buffers to play in the order they were created. I couldn't find a good way to get the music to repeat once completed within my AudioEngine class, so I decided to have the scene check in its update: method whether or not the music was playing for a particular file and, if not, play it again (not a very slick solution, but it works)
AudioEngine.h
#import <Foundation/Foundation.h>
@interface AudioEngine : NSObject
+(instancetype)sharedData;
-(void) playSfxFile:(NSString*)file;
-(void) playMusicFile:(NSString*)file;
-(void) pauseMusic:(NSString*)file;
-(void) unpauseMusic:(NSString*)file;
-(void) stopMusicFile:(NSString*)file;
-(void) setVolumePercentages;
-(bool) isPlayingMusic:(NSString*)file;
@end
AudioEngine.m
#import "AudioEngine.h"
#import <AVFoundation/AVFoundation.h>
#import "GameData.h" //this is a class that I use to store game data (in this case it is being used to get the user preference for volume amount)
@interface AudioEngine()
@property AVAudioEngine *engine;
@property AVAudioMixerNode *mixer;
@property NSMutableDictionary *musicDict;
@property NSMutableDictionary *sfxDict;
@property NSString *audioInfoPList;
@property float musicVolumePercent;
@property float sfxVolumePercent;
@property float fadeVolume;
@property float timerCount;
@end
@implementation AudioEngine
int const FADE_ITERATIONS = 10;
static NSString * const MUSIC_PLAYER = @"player";
static NSString * const MUSIC_BUFFERS = @"buffers";
static NSString * const MUSIC_FRAME_POSITIONS = @"framePositions";
static NSString * const MUSIC_SAMPLE_RATE = @"sampleRate";
static NSString * const SFX_BUFFER = @"buffer";
static NSString * const SFX_PLAYER = @"player";
+(instancetype) sharedData {
static AudioEngine *sharedInstance = nil;
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
sharedInstance = [[self alloc] init];
[sharedInstance startEngine];
});
return sharedInstance;
}
-(instancetype) init {
if (self = [super init]) {
_engine = [[AVAudioEngine alloc] init];
_mixer = [_engine mainMixerNode];
_audioInfoPList = [[NSBundle mainBundle] pathForResource:@"AudioInfo" ofType:@"plist"]; //open a plist called AudioInfo.plist
[self setVolumePercentages]; //this is created to set the user's preference in terms of how loud sound fx and music should be played
[self initMusic];
[self initSfx];
}
return self;
}
//opens all music files, creates multiple buffers depending on the length of the file and a player
-(void) initMusic {
_musicDict = [NSMutableDictionary dictionary];
_audioInfoPList = [[NSBundle mainBundle] pathForResource: @"AudioInfo" ofType: @"plist"];
NSDictionary *audioInfoData = [NSDictionary dictionaryWithContentsOfFile:_audioInfoPList];
for (NSString *musicFileName in audioInfoData[@"music"]) {
[self loadMusicIntoBuffer:musicFileName];
AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
[_engine attachNode:player];
AVAudioPCMBuffer *buffer = [[_musicDict[musicFileName] objectForKey:MUSIC_BUFFERS] objectAtIndex:0];
[_engine connect:player to:_mixer format:buffer.format];
[_musicDict[musicFileName] setObject:player forKey:@"player"];
}
}
//opens a music file and creates an array of buffers
-(void) loadMusicIntoBuffer:(NSString *)filename
{
NSURL *audioFileURL = [[NSBundle mainBundle] URLForResource:filename withExtension:@"aif"];
//NSURL *audioFileURL = [NSURL URLWithString:[[NSBundle mainBundle] pathForResource:filename ofType:@"aif"]];
NSAssert(audioFileURL, @"Error creating URL to audio file");
NSError *error = nil;
AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:audioFileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error];
NSAssert(audioFile != nil, @"Error creating audioFile, %@", error.localizedDescription);
AVAudioFramePosition fileLength = audioFile.length; //frame length of the audio file
float sampleRate = audioFile.fileFormat.sampleRate; //sample rate (in Hz) of the audio file
[_musicDict setObject:[NSMutableDictionary dictionary] forKey:filename];
[_musicDict[filename] setObject:[NSNumber numberWithDouble:sampleRate] forKey:MUSIC_SAMPLE_RATE];
NSMutableArray *buffers = [NSMutableArray array];
NSMutableArray *framePositions = [NSMutableArray array];
const AVAudioFrameCount kBufferFrameCapacity = 1024 * 1024L; //the size of my buffer...can be made bigger or smaller 512 * 1024L would be half the size
while (audioFile.framePosition < fileLength) { //each iteration reads in kBufferFrameCapacity frames of the audio file and stores it in a buffer
[framePositions addObject:[NSNumber numberWithLongLong:audioFile.framePosition]];
AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFile.processingFormat frameCapacity:kBufferFrameCapacity];
if (![audioFile readIntoBuffer:readBuffer error:&error]) {
NSLog(@"failed to read audio file: %@", error);
return;
}
if (readBuffer.frameLength == 0) { //if we've come to the end of the file, end the loop
break;
}
[buffers addObject:readBuffer];
}
[_musicDict[filename] setObject:buffers forKey:MUSIC_BUFFERS];
[_musicDict[filename] setObject:framePositions forKey:MUSIC_FRAME_POSITIONS];
}
-(void) initSfx {
_sfxDict = [NSMutableDictionary dictionary];
NSDictionary *audioInfoData = [NSDictionary dictionaryWithContentsOfFile:_audioInfoPList];
for (NSString *sfxFileName in audioInfoData[@"sfx"]) {
AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
[_engine attachNode:player];
[self loadSoundIntoBuffer:sfxFileName];
AVAudioPCMBuffer *buffer = [_sfxDict[sfxFileName] objectForKey:SFX_BUFFER];
[_engine connect:player to:_mixer format:buffer.format];
[_sfxDict[sfxFileName] setObject:player forKey:SFX_PLAYER];
}
}
//WARNING: make sure that the sound fx file is small (roughly under 30 sec) otherwise the archived version of the app will crash because the buffer ran out of space
-(void) loadSoundIntoBuffer:(NSString *)filename
{
NSURL *audioFileURL = [NSURL URLWithString:[[NSBundle mainBundle] pathForResource:filename ofType:@"mp3"]];
NSAssert(audioFileURL, @"Error creating URL to audio file");
NSError *error = nil;
AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:audioFileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error];
NSAssert(audioFile != nil, @"Error creating audioFile, %@", error.localizedDescription);
AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFile.processingFormat frameCapacity:(AVAudioFrameCount)audioFile.length];
[audioFile readIntoBuffer:readBuffer error:&error];
[_sfxDict setObject:[NSMutableDictionary dictionary] forKey:filename];
[_sfxDict[filename] setObject:readBuffer forKey:SFX_BUFFER];
}
-(void)startEngine {
[_engine startAndReturnError:nil];
}
-(void) playSfxFile:(NSString*)file {
AVAudioPlayerNode *player = [_sfxDict[file] objectForKey:@"player"];
AVAudioPCMBuffer *buffer = [_sfxDict[file] objectForKey:SFX_BUFFER];
[player scheduleBuffer:buffer atTime:nil options:AVAudioPlayerNodeBufferInterrupts completionHandler:nil];
[player setVolume:1.0];
[player setVolume:_sfxVolumePercent];
[player play];
}
-(void) playMusicFile:(NSString*)file {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
if ([player isPlaying] == NO) {
NSArray *buffers = [_musicDict[file] objectForKey:MUSIC_BUFFERS];
double sampleRate = [[_musicDict[file] objectForKey:MUSIC_SAMPLE_RATE] doubleValue];
for (int i = 0; i < [buffers count]; i++) {
long long framePosition = [[[_musicDict[file] objectForKey:MUSIC_FRAME_POSITIONS] objectAtIndex:i] longLongValue];
AVAudioTime *time = [AVAudioTime timeWithSampleTime:framePosition atRate:sampleRate];
AVAudioPCMBuffer *buffer = [buffers objectAtIndex:i];
[player scheduleBuffer:buffer atTime:time options:AVAudioPlayerNodeBufferInterrupts completionHandler:^{
if (i == [buffers count] - 1) {
[player stop];
}
}];
[player setVolume:_musicVolumePercent];
[player play];
}
}
}
-(void) stopOtherMusicPlayersNotNamed:(NSString*)file {
if ([file isEqualToString:@"menuscenemusic"]) {
AVAudioPlayerNode *player = [_musicDict[@"levelscenemusic"] objectForKey:MUSIC_PLAYER];
[player stop];
}
else {
AVAudioPlayerNode *player = [_musicDict[@"menuscenemusic"] objectForKey:MUSIC_PLAYER];
[player stop];
}
}
//stops the player for a particular sound
-(void) stopMusicFile:(NSString*)file {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
if ([player isPlaying]) {
_timerCount = FADE_ITERATIONS;
_fadeVolume = _musicVolumePercent;
[self fadeOutMusicForPlayer:player]; //fade out the music
}
}
//helper method for stopMusicFile:
-(void) fadeOutMusicForPlayer:(AVAudioPlayerNode*)player {
[NSTimer scheduledTimerWithTimeInterval:0.1 target:self selector:@selector(handleTimer:) userInfo:player repeats:YES];
}
//helper method for stopMusicFile:
-(void) handleTimer:(NSTimer*)timer {
AVAudioPlayerNode *player = (AVAudioPlayerNode*)timer.userInfo;
if (_timerCount > 0) {
_timerCount--;
_fadeVolume = _musicVolumePercent * (_timerCount / FADE_ITERATIONS);
[player setVolume:_fadeVolume];
}
else {
[player stop];
[player setVolume:_musicVolumePercent];
[timer invalidate];
}
}
-(void) pauseMusic:(NSString*)file {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
if ([player isPlaying]) {
[player pause];
}
}
-(void) unpauseMusic:(NSString*)file {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
[player play];
}
//sets the volume of the player based on user preferences in GameData class
-(void) setVolumePercentages {
NSString *musicVolumeString = [[GameData sharedGameData].settings objectForKey:@"musicVolume"];
_musicVolumePercent = [[[musicVolumeString componentsSeparatedByCharactersInSet:
[[NSCharacterSet decimalDigitCharacterSet] invertedSet]]
componentsJoinedByString:@""] floatValue] / 100;
NSString *sfxVolumeString = [[GameData sharedGameData].settings objectForKey:@"sfxVolume"];
_sfxVolumePercent = [[[sfxVolumeString componentsSeparatedByCharactersInSet:
[[NSCharacterSet decimalDigitCharacterSet] invertedSet]]
componentsJoinedByString:@""] floatValue] / 100;
//immediately sets music to new volume
for (NSString *file in [_musicDict allKeys]) {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
[player setVolume:_musicVolumePercent];
}
}
-(bool) isPlayingMusic:(NSString *)file {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
if ([player isPlaying])
return YES;
return NO;
}
@end

AVQueuePlayer with SWRevealViewController in iOS

I am using SWRevealViewController for the left-side menu in my application (Objective-C), with 5 menu options in the SWRevealViewController. I am using AVQueuePlayer in the home menu (first view).
My problem is that when I click another option in the menu bar and navigate to another view, the AVQueuePlayer stops playing by itself. How can I fix this?
Note: the AVQueuePlayer works properly while the application is in the background on the home menu.
NSError *setCategoryErr = nil;
NSError *activationErr = nil;
[[AVAudioSession sharedInstance] setCategory: AVAudioSessionCategoryPlayback error:&setCategoryErr];
[[AVAudioSession sharedInstance] setActive:YES error:&activationErr];
NSError *sessionError = nil;
[[AVAudioSession sharedInstance] setDelegate:self];
[[AVAudioSession sharedInstance]setCategory:AVAudioSessionCategoryPlayback error:NULL];
// Change the default output audio route
UInt32 doChangeDefaultRoute = 1;
AudioSessionSetProperty(kAudioSessionProperty_OverrideCategoryDefaultToSpeaker, sizeof(doChangeDefaultRoute), &doChangeDefaultRoute);
NSArray *queue = @[
[AVPlayerItem playerItemWithURL:[NSURL URLWithString:chanel_url_str]],
];
self.player1 = [[AVQueuePlayer alloc] initWithItems:queue];
self.player1.actionAtItemEnd = AVPlayerActionAtItemEndAdvance;
[self.player1 addObserver:self
forKeyPath:@"currentItem"
options:NSKeyValueObservingOptionNew
context:nil];
void (^observerBlock)(CMTime time) = ^(CMTime time) {
NSString *timeString = [NSString stringWithFormat:@"%02.2f", (float)time.value / (float)time.timescale];
if ([[UIApplication sharedApplication] applicationState] == UIApplicationStateActive) {
} else {
NSLog(@"App is backgrounded. Time is: %@", timeString);
}
};
self.timeObserver = [self.player1 addPeriodicTimeObserverForInterval:CMTimeMake(10, 1000)
queue:dispatch_get_main_queue()
usingBlock:observerBlock];
[self.player1 play];
Here chanel_url_str is an NSString containing the URL.
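One common pattern that may be relevant here, sketched purely as an assumption (it is not from the original post): keep the AVQueuePlayer in an object that outlives the pushed view controllers, so that swapping views with SWRevealViewController does not release the player. The SharedPlayer class and property names below are hypothetical.

#import <AVFoundation/AVFoundation.h>

// Hypothetical singleton that owns the queue player so navigation does not release it.
@interface SharedPlayer : NSObject
+ (instancetype)shared;
@property (strong, nonatomic) AVQueuePlayer *queuePlayer;
@end

@implementation SharedPlayer
+ (instancetype)shared {
    static SharedPlayer *instance = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{ instance = [[SharedPlayer alloc] init]; });
    return instance;
}
@end

// In the home view controller, hand the queue to the shared owner instead of a local property:
// [SharedPlayer shared].queuePlayer = [[AVQueuePlayer alloc] initWithItems:queue];
// [[SharedPlayer shared].queuePlayer play];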

How to get now-playing information from a third-party music application?

I want to get the now-playing information.
So I am using the following code:
NSDictionary *info = [[MPNowPlayingInfoCenter defaultCenter] nowPlayingInfo];
NSString *title = [info valueForKey:MPMediaItemPropertyTitle];
NSLog(@"%@", title);
MPMusicPlayerController *pc = [MPMusicPlayerController iPodMusicPlayer];
MPMediaItem *playingItem = [pc nowPlayingItem];
if (playingItem) {
NSInteger mediaType = [[playingItem valueForProperty:MPMediaItemPropertyMediaType] integerValue];
if (mediaType == MPMediaTypeMusic) {
NSString *songTitle = [playingItem valueForProperty:MPMediaItemPropertyTitle];
NSString *albumTitle = [playingItem valueForProperty:MPMediaItemPropertyAlbumTitle];
NSString *artist = [playingItem valueForProperty:MPMediaItemPropertyArtist];
NSString *genre = [playingItem valueForProperty:MPMediaItemPropertyGenre];
TweetTextField.text = [NSString stringWithFormat:@"#nowplaying %@ - %@ / %@ #%@", artist, songTitle, albumTitle, genre];
MPMediaItemArtwork *artwork = [playingItem valueForProperty:MPMediaItemPropertyArtwork];
CGSize newSize = CGSizeMake(250, 250);
UIGraphicsBeginImageContext(newSize);
[[artwork imageWithSize:CGSizeMake(100.0, 100.0)] drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
UIImage* newImage = UIGraphicsGetImageFromCurrentImageContext();
UIGraphicsEndImageContext();
_imageView.image = newImage;
}
if (_imageView.image == nil){
} else {
_tableView.alpha=0.5;
}
}
But this code only gets now-playing information from the default iPod application.
How can I get now-playing information from a third-party music application?
(e.g. Mobile Safari, the YouTube app, gMusic, Melodies, etc.)
I think this isn't possible. The documentation states that MPNowPlayingInfoCenter is only for setting information on the lock screen.
Here is a related question.
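For context, here is a minimal sketch of the use case MPNowPlayingInfoCenter is documented for: an app publishing its own now-playing metadata (the string values are placeholders), not reading another app's.

#import <MediaPlayer/MediaPlayer.h>

// An app can only set its own now-playing info for the lock screen; it cannot read other apps'.
NSDictionary *nowPlaying = @{ MPMediaItemPropertyTitle      : @"Some Title",   // placeholder
                              MPMediaItemPropertyArtist     : @"Some Artist",  // placeholder
                              MPMediaItemPropertyAlbumTitle : @"Some Album" }; // placeholder
[MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = nowPlaying;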

Playing two videos in sequence without any gap or pause between them in iOS 5

In my app, I need to play the first video right away and then play a second video, which should repeat. At first I used MPMoviePlayerController, but there was a black screen between the two videos. I read somewhere that in this case I should use AVFoundation, so now I use AVQueuePlayer with two AVPlayerItems, which I create from my NSURLs. The issue is that after the first video finishes there is a long pause, nearly 0.5 sec, before the second video starts. How can I remove it? Which approach should I use? I really appreciate any help! Thanks!
Here are all the methods I use:
- (void) configureAVPlayer {
self.queuePlayer.actionAtItemEnd = AVPlayerActionAtItemEndPause;
[[NSNotificationCenter defaultCenter]
addObserver:self
selector:@selector(playerItemDidReachEnd:)
name:AVPlayerItemDidPlayToEndTimeNotification
object:self.queuePlayer];
self.isFirstVideoFinished = NO;
[self playVideos];
}
- (void) playVideos {
[self setPlayerItemsForQueue];
[self.videoHolder setPlayer:self.queuePlayer];
}
- (void) setPlayerItemsForQueue {
NSURL *fileURL = [[NSBundle mainBundle]
URLForResource:[self getStringForMode:self.currentMode] withExtension:@"mp4"];
NSMutableString *secondFile = [self getStringForMode:self.currentMode];
[secondFile appendString:@"Second"];
NSURL *secondFileUrl = [[NSBundle mainBundle] URLForResource:secondFile withExtension:@"mp4"];
AVAsset *firstAsset = [AVAsset assetWithURL:fileURL];
AVPlayerItem *firstVideoItem = [AVPlayerItem playerItemWithAsset:firstAsset];//[AVPlayerItem playerItemWithURL:fileURL];
AVAsset *secondAsset = [AVAsset assetWithURL:secondFileUrl];
AVPlayerItem *secondVideoItem = [AVPlayerItem playerItemWithAsset:secondAsset];//[AVPlayerItem playerItemWithURL:secondFileUrl];
self.queuePlayer = [AVQueuePlayer queuePlayerWithItems: [NSArray arrayWithObjects:firstVideoItem, secondVideoItem,nil]];
for (AVPlayerItem *playerItem in self.queuePlayer.items) {
[playerItem addObserver: self forKeyPath: @"status" options:0 context:NULL];
}
}
- (void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object
change:(NSDictionary *)change context:(void *)context {
dispatch_async(dispatch_get_main_queue(), ^{
if ([(AVPlayerItem *)object status] == AVPlayerItemStatusReadyToPlay) {
[self.queuePlayer play];
}
});
}
- (void) playerItemDidReachEnd:(NSNotification*) notification {
NSMutableString *secondFile = [self getStringForMode:self.currentMode];
[secondFile appendString:@"Second"];
NSURL *secondFileUrl = [[NSBundle mainBundle] URLForResource:secondFile withExtension:@"mp4"];
AVPlayerItem *playerItem = [[ AVPlayerItem alloc ] initWithURL:secondFileUrl];
if ([self.queuePlayer canInsertItem:playerItem afterItem:[notification object]]) {
[self.queuePlayer insertItem: playerItem afterItem: nil ];
}
}
Yes, multiple AVPlayer objects are your best option.
I solved this problem with a main timer function which monitored the current playback time of firstVideoItem, and when it was 0.01 seconds from the end, I would flip the players on the alpha channel, so secondVideoItem was visible, then play secondVideoItem.
I also used [secondVideoItem seekToTime:CMTimeMakeWithSeconds(1,1)] before issuing the play command to make sure the second video was initialised and ready to play. This helped a lot.
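A rough sketch of that approach, assuming two independent AVPlayers (self.firstPlayer, self.secondPlayer), each backed by its own AVPlayerLayer (self.firstLayer, self.secondLayer) already in the view hierarchy; all of these property names are assumptions:

#import <AVFoundation/AVFoundation.h>

// (e.g. in viewDidLoad) watch the first player's clock roughly every 0.01 s
__weak typeof(self) weakSelf = self;
id token = [self.firstPlayer addPeriodicTimeObserverForInterval:CMTimeMake(1, 100)
                                                           queue:dispatch_get_main_queue()
                                                      usingBlock:^(CMTime time) {
    CMTime duration = weakSelf.firstPlayer.currentItem.duration;
    if (CMTIME_IS_NUMERIC(duration) &&
        CMTimeGetSeconds(CMTimeSubtract(duration, time)) < 0.01) {
        weakSelf.firstLayer.opacity = 0.0f;   // flip visibility to the second layer
        weakSelf.secondLayer.opacity = 1.0f;
        [weakSelf.secondPlayer play];         // start the repeating video
    }
}];
// keep 'token' if you later need removeTimeObserver:
// pre-roll the second item so it starts without a visible hitch, as described above
[self.secondPlayer.currentItem seekToTime:CMTimeMakeWithSeconds(1, 1)];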

Playing a second sound

I am trying to build a simple AVAudioPlayer-based application that plays 2 different pieces of music when buttons are pressed. There are 2 views; the first is the home view, which contains 2 buttons. Each button sets a song, which is named as an integer (1.mp3, 2.mp3, ... etc.). Here is the code:
#import "podHome.h"
#import "podMusic.h"
#import "ArabAppDelegate.h"
@implementation podHome
@synthesize song1;
@synthesize tabi;
int CurrentPlay;
NSString *Currenttxt;
-(IBAction)uae{
CurrentPlay=1;
Currenttxt=@"uae";
podMusic *newContro=[[podMusic alloc] init];
[newContro setCurrentPlay1:CurrentPlay setCurrentText:Currenttxt];
ArabAppDelegate *theDelegate = (ArabAppDelegate*)[[UIApplication sharedApplication] delegate];
tabi = theDelegate.tabcontrolPod;
tabi.selectedIndex = 1;
[newContro release];
}
-(IBAction)libya{
CurrentPlay=2;
Currenttxt=@"uae";
podMusic *newContro=[[podMusic alloc] init];
[newContro setCurrentPlay1:CurrentPlay setCurrentText:Currenttxt];
ArabAppDelegate *theDelegate = (ArabAppDelegate*)[[UIApplication sharedApplication] delegate];
tabi = theDelegate.tabcontrolPod;
tabi.selectedIndex = 1;
[newContro release];
}
These two IBActions are linked to the two buttons; pressing one of them switches to the other view and starts playing the song.
- (void)viewWillAppear:(BOOL)animated {
if((played == 1)&&(isBacked==FALSE)){
NSString *filePath = [[NSBundle mainBundle] pathForResource:texting
ofType:@"txt"];
NSString *filenameString = [NSString stringWithContentsOfFile:filePath usedEncoding:nil error:nil];
CurrentTex.text = filenameString;
AudioSessionInitialize(NULL, NULL, NULL, NULL);
UInt32 sessionCategory = kAudioSessionCategory_MediaPlayback;
AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,sizeof(sessionCategory), &sessionCategory);
AudioSessionSetActive(YES);
playBtnBG = [[UIImage imageNamed:@"play-pod.png"] retain];
pauseBtnBG = [[UIImage imageNamed:@"pause-pod.png"] retain];
[playButton setBackgroundImage:pauseBtnBG forState:UIControlStateNormal];
[self registerForBackgroundNotifications];
updateTimer = nil;
duration.adjustsFontSizeToFitWidth = YES;
currentTime.adjustsFontSizeToFitWidth = YES;
progressBar.minimumValue = 0.0;
NSString *path=[[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"%d",CurrentPlay] ofType:@"mp3"];
self.player=[[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL];
[player stop];
//self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:nil];
if (self.player)
{
[self updateViewForPlayerInfo:player];
[self updateViewForPlayerState:player];
player.numberOfLoops = 0;
player.delegate = self;
}
[self startPlaybackForPlayer:player];
fileName.text = [[NSString alloc] initWithFormat:@"%@", songName];
// [fileURL release];
// CurrentPlay = 0;
isBacked = TRUE;
}
[super viewWillAppear:animated];
}
- (void)setCurrentPlay1:(int)varP setCurrentText:(NSString *)varT{
CurrentPlay = varP;
texting = varT;
played = 1;
isBacked = FALSE;
}
But the problem is that when a song is playing and I go back to the home view and press the other song's button, it starts playing at the same time as the first one. I think the first one should stop before the other begins. What should I release to do that?
My guess is that you have two instances of the AVAudioPlayer, and when you set them both to play, they both do!
One solution would be to tell other players to stop when you activate a new one, but that will quickly become troublesome as the number of players increases.
Instead, you'd be better off just setting up one player and changing its song according to which button was pressed. That way, there is no chance that two music tracks will play at the same time.
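A rough sketch of that single-player approach, assuming ARC and reusing the self.player AVAudioPlayer property already present in the question (the playTrackNumber: method name is made up for illustration):

- (void)playTrackNumber:(int)trackNumber {
    NSString *path = [[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"%d", trackNumber]
                                                     ofType:@"mp3"];
    if (path == nil) return;        // no such track bundled
    [self.player stop];             // stop whatever was playing before
    self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path]
                                                          error:NULL];
    self.player.numberOfLoops = 0;
    self.player.delegate = self;
    [self.player play];
}

Each button action would then just call [self playTrackNumber:1] or [self playTrackNumber:2], so only one track can ever be active at a time.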
In your first button action, write this:
if(player2.isPlaying)
{
[player2 stop];
}
In the same way, do the same thing in the second button action:
if(player1.isPlaying)
{
[player1 stop];
}