I'd like to display an image and play a corresponding audio file, but it plays the audio first, before displaying the image. I couldn't figure out what is wrong.
- (IBAction)playButton:(id)sender {
    UIImage *imageA = [UIImage imageNamed:@"Image1.png"];
    UIImage *imageB = [UIImage imageNamed:@"Image2.png"];
    int randomAlphaNum = arc4random() % 2;
    NSLog(@"%i", randomAlphaNum);
    switch (randomAlphaNum) {
        case 0:
            imageView.image = imageA;
            for (int i = 1; i <= 5; i++) {
                NSURL *url = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@/audioA.wav", [[NSBundle mainBundle] resourcePath]]];
                NSError *error;
                audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
                audioPlayer.volume = 0.5;
                if (audioPlayer == nil)
                    NSLog(@"An audio error occurred: \"%@\"", error); // log the error, not the nil player
                else {
                    [audioPlayer play];
                }
                sleep(2);
            }
            return;
    }
}
- (void)viewDidLoad
{
    [super viewDidLoad];
}

- (void)viewDidUnload
{
    [super viewDidUnload];
}
First of all I wouldn't recommend using sleep as it's going to chew up your main thread and make your UI unresponsive. I'd actually be curious if removing it alleviates your issue. There's a chance the app is flying through your code, playing the audio and is hitting the sleep command before the app can display the image on the screen. What happens if you remove sleep and only iterate through the for loop once?
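If you do need a gap between plays, a non-blocking alternative is to schedule each playback with dispatch_after instead of sleeping. A minimal sketch, reworking the loop from your code and assuming the audioPlayer is created once before the loop:
imageView.image = imageA;
for (int i = 0; i < 5; i++) {
    // Schedule each play 2 seconds apart; the main thread stays free,
    // so the image has a chance to render before the audio starts.
    dispatch_time_t when = dispatch_time(DISPATCH_TIME_NOW, (int64_t)(i * 2.0 * NSEC_PER_SEC));
    dispatch_after(when, dispatch_get_main_queue(), ^{
        [audioPlayer play]; // assumes audioPlayer was created once, before the loop
    });
}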
I am trying to record two videos with UIImagePickerController. Everything works fine, but while recording the second video it seems to override the path of the first recorded video.
I need to upload both videos to the server, but the first video's path comes back nil while uploading and the app crashes. Is there any way to record the second video at a different path?
The video paths are as follows:
/private/var/mobile/Containers/Data/Application/1465EC90-4B57-41FF-996E-0CCB7713ECE7/tmp/50332801315__A883E4DB-ED72-4D31-9564-22FB363779BD.MOV
/private/var/mobile/Containers/Data/Application/1465EC90-4B57-41FF-996E-0CCB7713ECE7/tmp/50332802324__F11733AD-EB62-426D-BA1C-7E87D2BF66D0.MOV
Here is my imagePicker delegate code:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    NSString *mediaType = [info objectForKey:UIImagePickerControllerMediaType];
    if (CFStringCompare((__bridge CFStringRef)mediaType, kUTTypeMovie, 0) == kCFCompareEqualTo) {
        NSURL *videoUrl = (NSURL *)[info objectForKey:UIImagePickerControllerMediaURL];
        NSString *moviePath = [videoUrl path];
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(moviePath)) {
            UISaveVideoAtPathToSavedPhotosAlbum(moviePath, nil, nil, nil);
        }
        NSLog(@"videoUrl: %@", videoUrl);
        NSLog(@"moviePath: %@", moviePath);
        // self.moviePath_1 = @"";
        // self.moviePath_2 = @"";
        NSLog(@"picker.title: %@", picker.title);
        if ([picker.title isEqualToString:@"Video_1"]) {
            self.moviePath_1 = moviePath;
            self.video_1 = YES;
            NSLog(@"self.moviePath_1: %@", self.moviePath_1);
            self.video_1_Data = [NSData dataWithContentsOfURL:[NSURL fileURLWithPath:self.moviePath_1]];
            NSLog(@"Video_1 Size: %@", [NSByteCountFormatter stringFromByteCount:self.video_1_Data.length countStyle:NSByteCountFormatterCountStyleFile]);
            [self setupAndPlayback:@"Video_1"];
        } else {
            self.moviePath_2 = moviePath;
            self.video_2 = YES;
            NSLog(@"self.moviePath_2: %@", self.moviePath_2);
            self.video_2_Data = [NSData dataWithContentsOfURL:[NSURL fileURLWithPath:self.moviePath_2]];
            NSLog(@"Video_2 Size: %@", [NSByteCountFormatter stringFromByteCount:self.video_2_Data.length countStyle:NSByteCountFormatterCountStyleFile]);
            [self setupAndPlayback:@"Video_2"];
        }
    }
    [self dismissViewControllerAnimated:YES completion:nil];
}
Save the videos to different paths. You are overwriting the same path; that is why this issue happens. Add a timestamp or an increasing number to the path and save there.
self.moviePath_1 = [NSString stringWithFormat:@"%@-%d.mov", [moviePath stringByDeletingPathExtension], num]; // keep a movie extension, not .png
num += 1; // for next time
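A slightly fuller sketch of the same idea: copy the recording out of tmp/ to a uniquely named file as soon as the picker returns it, so the second recording can never clobber the first (the helper name here is hypothetical):
// Hypothetical helper: copy the freshly recorded movie out of tmp/
// to a unique, timestamped path in Documents and return that path.
- (NSString *)persistRecordedVideoAtPath:(NSString *)moviePath {
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *name = [NSString stringWithFormat:@"video-%.0f.mov", [[NSDate date] timeIntervalSince1970]];
    NSString *dest = [paths[0] stringByAppendingPathComponent:name];
    NSError *error = nil;
    [[NSFileManager defaultManager] copyItemAtPath:moviePath toPath:dest error:&error];
    if (error) {
        NSLog(@"copy failed: %@", error);
        return moviePath; // fall back to the original tmp path
    }
    return dest;
}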
I'm wondering how to free memory in this simple program that plays a file through a buffer and then stops it.
-(void)setupAudioOne
{
    NSError *error;
    BOOL success = NO;
    _player = [[AVAudioPlayerNode alloc] init];
    NSURL *hiphopOneURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"Hip Hop 1" ofType:@"caf"]];
    AVAudioFile *hiphopOneFile = [[AVAudioFile alloc] initForReading:hiphopOneURL error:&error];
    _playerLoopBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:[hiphopOneFile processingFormat] frameCapacity:(AVAudioFrameCount)[hiphopOneFile length]];
    success = [hiphopOneFile readIntoBuffer:_playerLoopBuffer error:&error];
    _engine = [[AVAudioEngine alloc] init];
    [_engine attachNode:_player];
    AVAudioMixerNode *mainMixer = [_engine mainMixerNode];
    AVAudioFormat *stereoFormat = [[AVAudioFormat alloc] initStandardFormatWithSampleRate:44100 channels:2];
    [_engine connect:_player to:mainMixer fromBus:0 toBus:0 format:stereoFormat];
    [self startEngine];
}
Above is the general setup of the engine and the player node.
Then we implement the player with a play button:
- (IBAction)play:(id)sender {
    if (!self.playerIsPlaying)
    {
        [self setupAudioOne];
        [_player scheduleBuffer:_playerLoopBuffer atTime:nil options:AVAudioPlayerNodeBufferLoops completionHandler:nil];
        [_player play];
    }
}
And finally we stop the player with a stop button:
- (IBAction)stopHipHopOne:(id)sender {
    if (self.playerIsPlaying) {
        [_player stop];
    }
}
playerIsPlaying is just a simple BOOL that indicates whether the _player is playing.
So basically my question is: as this program is written now, no memory is freed when you hit the stop button.
Surely there is a simple line of code I can add to the stop button that frees up the memory the engine and player are using?
Any thoughts?
Yes, there is. After stopping the player node, you can call:
[_engine disconnectNodeInput:_player];
[_engine detachNode:_player];
I saw you're also keeping a reference to the audio buffer, so you might want to nil that one as well. Let me know if that doesn't work for you. Something else could be leaking.
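Putting that together, a sketch of a stop action that tears everything down, assuming the _engine, _player, and _playerLoopBuffer ivars from the question and ARC:
- (IBAction)stopHipHopOne:(id)sender {
    if (self.playerIsPlaying) {
        [_player stop];
        [_engine disconnectNodeInput:_player]; // break the connection to the mixer
        [_engine detachNode:_player];          // let the engine give up the node
        [_engine stop];
        _player = nil;           // under ARC, these assignments release the objects;
        _playerLoopBuffer = nil; // the PCM buffer is usually the largest allocation
        _engine = nil;
    }
}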
I am using AVCaptureSession to capture video from a device's camera and then using AVAssetWriterInput and AVAssetTrack to compress/resize the video before uploading it to a server. The final videos will be viewed on the web via an html5 video element.
I'm running into multiple issues trying to get the orientation of the video correct. My app only supports landscape orientation and all captured videos should be in landscape orientation. However, I would like to allow the user to hold their device in either landscape direction (i.e. home button on either the left or the right hand side).
I am able to make the video preview show in the correct orientation with the following line of code:
_previewLayer.connection.videoOrientation = UIDevice.currentDevice.orientation;
The problems start when processing the video via AVAssetWriterInput and friends. The result does not seem to account for the left vs. right landscape mode the video was captured in. IOW, sometimes the video comes out upside down. After some googling I found many people suggesting that the following line of code would solve this issue
writerInput.transform = videoTrack.preferredTransform;
...but this doesn't seem to work. After a bit of debugging I found that videoTrack.preferredTransform is always the same value, regardless of the orientation the video was captured in.
I tried manually tracking what orientation the video was captured in and setting the writerInput.transform to CGAffineTransformMakeRotation(M_PI) as needed. Which solved the problem!!!
...sorta
When I viewed the results on the device this solution worked as expected. Videos were right-side-up regardless of left vs. right orientation while recording. Unfortunately, when I viewed the exact same videos in another browser (Chrome on a MacBook) they were all upside-down!?!?!?
What am I doing wrong?
EDIT
Here's some code, in case it's helpful...
-(void)compressFile:(NSURL *)inUrl
{
    NSString *fileName = [@"compressed." stringByAppendingString:inUrl.lastPathComponent];
    NSError *error;
    NSURL *outUrl = [PlatformHelper getFilePath:fileName error:&error];
    NSDictionary *compressionSettings = @{ AVVideoProfileLevelKey: AVVideoProfileLevelH264Main31,
                                           AVVideoAverageBitRateKey: [NSNumber numberWithInt:2500000],
                                           AVVideoMaxKeyFrameIntervalKey: [NSNumber numberWithInt:30] };
    NSDictionary *videoSettings = @{ AVVideoCodecKey: AVVideoCodecH264,
                                     AVVideoWidthKey: [NSNumber numberWithInt:1280],
                                     AVVideoHeightKey: [NSNumber numberWithInt:720],
                                     AVVideoScalingModeKey: AVVideoScalingModeResizeAspectFill,
                                     AVVideoCompressionPropertiesKey: compressionSettings };
    NSDictionary *videoOptions = @{ (id)kCVPixelBufferPixelFormatTypeKey: [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] };
    AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
    writerInput.expectsMediaDataInRealTime = YES;
    AVAssetWriter *assetWriter = [AVAssetWriter assetWriterWithURL:outUrl fileType:AVFileTypeMPEG4 error:&error];
    assetWriter.shouldOptimizeForNetworkUse = YES;
    [assetWriter addInput:writerInput];
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inUrl options:nil];
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    // !!! this line does not work as expected and causes all sorts of issues (videos display sideways in some cases) !!!
    //writerInput.transform = videoTrack.preferredTransform;
    AVAssetReaderTrackOutput *readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:videoOptions];
    AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:asset error:&error];
    [assetReader addOutput:readerOutput];
    [assetWriter startWriting];
    [assetWriter startSessionAtSourceTime:kCMTimeZero];
    [assetReader startReading];
    [writerInput requestMediaDataWhenReadyOnQueue:_processingQueue usingBlock:
     ^{
         /* snip */
     }];
}
The problem is that modifying the writerInput.transform property only adds a tag to the video file metadata which instructs the video player to rotate the file during playback. That's why the videos play in the correct orientation on your device (I'm guessing they also play correctly in a QuickTime player as well).
The pixel buffers captured by the camera are still laid out in the orientation in which they were captured. Many video players will not check for the preferred orientation metadata tag and will just play the file in the native pixel orientation.
If you want the user to be able to record video holding the phone in either landscape mode, you need to rectify this at the AVCaptureSession level before compression by performing a transform on the CVPixelBuffer of each video frame. This Apple Q&A covers it (look at the AVCaptureVideoDataOutput documentation as well):
https://developer.apple.com/library/ios/qa/qa1744/_index.html
Investigating the link above is the correct way to solve your problem. An alternate fast n' dirty way to solve the same problem would be to lock the recording UI of your app into only one landscape orientation and then to rotate all of your videos server-side using ffmpeg.
In case it's helpful for anyone, here's the code I ended up with. I ended up having to do the work on the video as it was being captured, instead of as a post-processing step. This is a helper class that manages the capture.
Interface
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>

@interface VideoCaptureManager : NSObject<AVCaptureVideoDataOutputSampleBufferDelegate>
{
    AVCaptureSession* _captureSession;
    AVCaptureVideoPreviewLayer* _previewLayer;
    AVCaptureVideoDataOutput* _videoOut;
    AVCaptureDevice* _videoDevice;
    AVCaptureDeviceInput* _videoIn;
    dispatch_queue_t _videoProcessingQueue;
    AVAssetWriter* _assetWriter;
    AVAssetWriterInput* _writerInput;
    BOOL _isCapturing;
    NSString* _gameId;
    NSString* _authToken;
}

-(void)setSettings:(NSString*)gameId authToken:(NSString*)authToken;
-(void)setOrientation:(AVCaptureVideoOrientation)orientation;
-(AVCaptureVideoPreviewLayer*)getPreviewLayer;
-(void)startPreview;
-(void)stopPreview;
-(void)startCapture;
-(void)stopCapture;

@end
Implementation (w/ a bit of editing and a few little TODOs)
@implementation VideoCaptureManager

-(id)init
{
    self = [super init];
    if (self) {
        NSError* error;
        _videoProcessingQueue = dispatch_queue_create("VideoQueue", DISPATCH_QUEUE_SERIAL);
        _captureSession = [AVCaptureSession new];
        _videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        _previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:_captureSession];
        [_previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
        _videoOut = [AVCaptureVideoDataOutput new];
        _videoOut.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey: [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] };
        _videoOut.alwaysDiscardsLateVideoFrames = YES;
        _videoIn = [AVCaptureDeviceInput deviceInputWithDevice:_videoDevice error:&error];
        // handle errors here
        [_captureSession addInput:_videoIn];
        [_captureSession addOutput:_videoOut];
    }
    return self;
}
-(void)setOrientation:(AVCaptureVideoOrientation)orientation
{
    _previewLayer.connection.videoOrientation = orientation;
    for (AVCaptureConnection* item in _videoOut.connections) {
        item.videoOrientation = orientation;
    }
}

-(AVCaptureVideoPreviewLayer*)getPreviewLayer
{
    return _previewLayer;
}

-(void)startPreview
{
    [_captureSession startRunning];
}

-(void)stopPreview
{
    [_captureSession stopRunning];
}
-(void)startCapture
{
    if (_isCapturing) return;

    NSURL* url = nil; // TODO: put code here to create your output url

    NSDictionary* compressionSettings = @{ AVVideoProfileLevelKey: AVVideoProfileLevelH264Main31,
                                           AVVideoAverageBitRateKey: [NSNumber numberWithInt:2500000],
                                           AVVideoMaxKeyFrameIntervalKey: [NSNumber numberWithInt:1],
                                           };
    NSDictionary* videoSettings = @{ AVVideoCodecKey: AVVideoCodecH264,
                                     AVVideoWidthKey: [NSNumber numberWithInt:1280],
                                     AVVideoHeightKey: [NSNumber numberWithInt:720],
                                     AVVideoScalingModeKey: AVVideoScalingModeResizeAspectFill,
                                     AVVideoCompressionPropertiesKey: compressionSettings
                                     };
    _writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
    _writerInput.expectsMediaDataInRealTime = YES;

    NSError* error;
    _assetWriter = [AVAssetWriter assetWriterWithURL:url fileType:AVFileTypeMPEG4 error:&error];
    // handle errors
    _assetWriter.shouldOptimizeForNetworkUse = YES;
    [_assetWriter addInput:_writerInput];

    [_videoOut setSampleBufferDelegate:self queue:_videoProcessingQueue];
    _isCapturing = YES;
}
-(void)stopCapture
{
    if (!_isCapturing) return;

    [_videoOut setSampleBufferDelegate:nil queue:nil]; // TODO: seems like there could be a race condition between this line and the next (could end up trying to write a buffer after calling writingFinished)
    dispatch_async(_videoProcessingQueue, ^{
        [_assetWriter finishWritingWithCompletionHandler:^{
            [self writingFinished];
        }];
    });
}

-(void)writingFinished
{
    // TODO: need to check _assetWriter.status to make sure everything completed successfully
    // do whatever post processing you need here
}
-(void)captureOutput:(AVCaptureOutput*)captureOutput didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection
{
    NSLog(@"Video frame was dropped.");
}

-(void)captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
    if (_assetWriter.status != AVAssetWriterStatusWriting) {
        CMTime lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
        [_assetWriter startWriting]; // TODO: need to check the return value (a BOOL)
        [_assetWriter startSessionAtSourceTime:lastSampleTime];
    }

    if (!_writerInput.readyForMoreMediaData || ![_writerInput appendSampleBuffer:sampleBuffer]) {
        NSLog(@"Failed to write video buffer to output.");
    }
}

@end
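For context, a minimal sketch of how a view controller might drive this class; the captureManager property is hypothetical, but the methods are the ones declared in the interface above:
// Hypothetical wiring in a view controller that owns the manager.
- (void)viewDidLoad {
    [super viewDidLoad];
    self.captureManager = [[VideoCaptureManager alloc] init];
    AVCaptureVideoPreviewLayer* preview = [self.captureManager getPreviewLayer];
    preview.frame = self.view.bounds;
    [self.view.layer addSublayer:preview];
    [self.captureManager setOrientation:AVCaptureVideoOrientationLandscapeRight];
    [self.captureManager startPreview]; // later: startCapture / stopCapture around recording
}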
For compressing/resizing the video, we can use AVAssetExportSession.
We can upload a video of up to 3.30 minutes in duration; if the video is longer than that, it will show a memory warning.
As we are not applying any transform to the video here, the video stays as it was recorded.
Below is the sample code for compressing the video.
We can check the video size before compression and after compression.
-(void)trimVideoWithURL:(NSURL *)inputURL {
    NSString *path1 = [inputURL path];
    NSData *data = [[NSFileManager defaultManager] contentsAtPath:path1];
    NSLog(@"size before compress video is %lu", (unsigned long)data.length);
    AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
    AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPreset640x480];
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *outputURL = paths[0];
    NSFileManager *manager = [NSFileManager defaultManager];
    [manager createDirectoryAtPath:outputURL withIntermediateDirectories:YES attributes:nil error:nil];
    outputURL = [outputURL stringByAppendingPathComponent:@"output.mp4"];
    fullPath = [NSURL fileURLWithPath:outputURL]; // fileURLWithPath:, not URLWithString:, for a filesystem path
    // Remove Existing File
    [manager removeItemAtPath:outputURL error:nil];
    exportSession.outputURL = [NSURL fileURLWithPath:outputURL];
    exportSession.shouldOptimizeForNetworkUse = YES;
    exportSession.outputFileType = AVFileTypeQuickTimeMovie;
    // Note: this exports only the one-second range starting at 1.0s; widen it to keep the whole video.
    CMTime start = CMTimeMakeWithSeconds(1.0, 600);
    CMTime duration = CMTimeMakeWithSeconds(1.0, 600);
    CMTimeRange range = CMTimeRangeMake(start, duration);
    exportSession.timeRange = range;
    [exportSession exportAsynchronouslyWithCompletionHandler:^(void)
    {
        switch (exportSession.status) {
            case AVAssetExportSessionStatusCompleted: {
                NSString *path = [fullPath path];
                NSData *data = [[NSFileManager defaultManager] contentsAtPath:path];
                NSLog(@"size after compress video is %lu", (unsigned long)data.length);
                NSLog(@"Export Complete %d %@", exportSession.status, exportSession.error);
                /*
                 Do your necessary stuff here after compression
                 */
            }
                break;
            case AVAssetExportSessionStatusFailed:
                NSLog(@"Failed:%@", exportSession.error);
                break;
            case AVAssetExportSessionStatusCancelled:
                NSLog(@"Canceled:%@", exportSession.error);
                break;
            default:
                break;
        }
    }];
}
I'm working on an application that uses the zxing library to read QR codes. I have a problem with ZxingWidgetController: when the view is shown while the application is in the background/not active (e.g. the screen is locked), the image from the camera is not shown on screen - only the background is visible, and the scanner seems not to be working.
When I call the initCapture method again, after a little delay the video from the camera is shown, but in this case, every time the application loses activity I need to reinitialize the scanner - this behavior is not comfortable at all.
This bug can be reproduced in almost all applications that use zXing, so I suppose it is a zXing bug.
The zXing initCapture method code is:
- (void)initCapture {
#if HAS_AVFF
    AVCaptureDeviceInput *captureInput =
        [AVCaptureDeviceInput deviceInputWithDevice:
            [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]
                                              error:nil];
    if (!captureInput)
    {
        NSLog(@"ERROR - CaptureInputNotInitialized");
    }

    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureOutput.alwaysDiscardsLateVideoFrames = YES;
    if (!captureOutput)
    {
        NSLog(@"ERROR - CaptureOutputNotInitialized");
    }

    [captureOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

    NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey;
    NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureOutput setVideoSettings:videoSettings];

    self.captureSession = [[[AVCaptureSession alloc] init] autorelease];
    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium; // 480x360 on a 4

    if ([self.captureSession canAddInput:captureInput])
    {
        [self.captureSession addInput:captureInput];
    }
    else
    {
        NSLog(@"ERROR - cannot add input");
    }
    if ([self.captureSession canAddOutput:captureOutput])
    {
        [self.captureSession addOutput:captureOutput];
    }
    else
    {
        NSLog(@"ERROR - cannot add output");
    }
    [captureOutput release];

    if (self.prevLayer) // release the old layer, if any (the original checked !self.prevLayer, which is a no-op)
    {
        [self.prevLayer release];
    }
    self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    // NSLog(@"prev %p %@", self.prevLayer, self.prevLayer);
    self.prevLayer.frame = self.view.bounds;
    self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:self.prevLayer];

    [self.captureSession startRunning];
#endif
}
Maybe you guys know what is wrong?
I don't understand your question. If the application is in the background/not active, of course it can't keep working. You should make it clearer.
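That said, since you note that calling initCapture again after returning to the foreground brings the preview back, one workaround is to restart the existing capture session from a foreground notification instead of reinitializing everything. A sketch, assuming the captureSession property from the code above:
// Sketch: restart the existing session when the app becomes active again,
// instead of tearing down and rebuilding the whole capture stack.
- (void)viewDidLoad {
    [super viewDidLoad];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(appDidBecomeActive:)
                                                 name:UIApplicationDidBecomeActiveNotification
                                               object:nil]; // remember to removeObserver: in dealloc
}

- (void)appDidBecomeActive:(NSNotification *)note {
    if (![self.captureSession isRunning]) {
        [self.captureSession startRunning];
    }
}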
I am trying to build a simple AVAudioPlayer-based application that plays 2 different tracks upon button presses. There are 2 views; the first is the home view, which contains 2 buttons. Each button sets a song, which is named as an integer (1.mp3, 2.mp3, ...etc), and here is the code:
#import "podHome.h"
#import "podMusic.h"
#import "ArabAppDelegate.h"
#implementation podHome
#synthesize song1;
#synthesize tabi;
int CurrentPlay;
NSString *Currenttxt;
-(IBAction)uae{
CurrentPlay=1;
Currenttxt=#"uae";
podMusic *newContro=[[podMusic alloc] init];
[newContro setCurrentPlay1:CurrentPlay setCurrentText:Currenttxt];
ArabAppDelegate *theDelegate = (ArabAppDelegate*)[[UIApplication sharedApplication] delegate];
tabi = theDelegate.tabcontrolPod;
tabi.selectedIndex = 1;
[newContro release];
}
-(IBAction)libya{
CurrentPlay=2;
Currenttxt=#"uae";
podMusic *newContro=[[podMusic alloc] init];
[newContro setCurrentPlay1:CurrentPlay setCurrentText:Currenttxt];
ArabAppDelegate *theDelegate = (ArabAppDelegate*)[[UIApplication sharedApplication] delegate];
tabi = theDelegate.tabcontrolPod;
tabi.selectedIndex = 1;
[newContro release];
}
These two IBActions are linked to the two buttons; when pressing one of them, it changes to the other view and starts playing the song.
- (void)viewWillAppear:(BOOL)animated {
    if ((played == 1) && (isBacked == FALSE)) {
        NSString *filePath = [[NSBundle mainBundle] pathForResource:texting
                                                             ofType:@"txt"];
        NSString *filenameString = [NSString stringWithContentsOfFile:filePath usedEncoding:nil error:nil];
        CurrentTex.text = filenameString;
        AudioSessionInitialize(NULL, NULL, NULL, NULL);
        UInt32 sessionCategory = kAudioSessionCategory_MediaPlayback;
        AudioSessionSetProperty(kAudioSessionProperty_AudioCategory, sizeof(sessionCategory), &sessionCategory);
        AudioSessionSetActive(YES);
        playBtnBG = [[UIImage imageNamed:@"play-pod.png"] retain];
        pauseBtnBG = [[UIImage imageNamed:@"pause-pod.png"] retain];
        [playButton setBackgroundImage:pauseBtnBG forState:UIControlStateNormal];
        [self registerForBackgroundNotifications];
        updateTimer = nil;
        duration.adjustsFontSizeToFitWidth = YES;
        currentTime.adjustsFontSizeToFitWidth = YES;
        progressBar.minimumValue = 0.0;
        NSString *path = [[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"%d", CurrentPlay] ofType:@"mp3"];
        self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:NULL];
        [player stop];
        //self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:nil];
        if (self.player)
        {
            [self updateViewForPlayerInfo:player];
            [self updateViewForPlayerState:player];
            player.numberOfLoops = 0;
            player.delegate = self;
        }
        [self startPlaybackForPlayer:player];
        fileName.text = [[NSString alloc] initWithFormat:@"%@", songName];
        // [fileURL release];
        // CurrentPlay = 0;
        isBacked = TRUE;
    }
    [super viewWillAppear:animated];
}

- (void)setCurrentPlay1:(int)varP setCurrentText:(NSString *)varT{
    CurrentPlay = varP;
    texting = varT;
    played = 1;
    isBacked = FALSE;
}
But the problem is that when the song is playing and I'm back on the home view and press the other song's button, it starts to play at the same time as the first one. I think the first should stop before the other begins. What should I release to do that?
My guess is that you have two instances of the AVAudioPlayer, and when you set them both to play, they both do!
One solution would be to tell other players to stop when you activate a new one, but that will quickly become troublesome as the number of players increases.
Instead you'd be better off just setting up one player and changing its song according to which button was pressed. That way, there is no chance that two music tracks will play at the same time.
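A minimal sketch of that single-player approach; the sharedPlayer property and playTrackNumber: method names are hypothetical:
// Sketch: one shared AVAudioPlayer; each button just swaps the track.
- (void)playTrackNumber:(int)trackNumber {
    [self.sharedPlayer stop]; // stop whatever was playing before
    NSString *path = [[NSBundle mainBundle] pathForResource:[NSString stringWithFormat:@"%d", trackNumber]
                                                     ofType:@"mp3"];
    self.sharedPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path]
                                                               error:NULL];
    [self.sharedPlayer play];
}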
In your first button action, write this:
if (player2.isPlaying)
{
    [player2 stop];
}
In the same way, do the same thing in the second button action:
if (player1.isPlaying)
{
    [player1 stop];
}