I just need to play the file more slowly or more quickly. Any help will be greatly appreciated.
AVAudioPlayer can't do that, but AVPlayer can.
// the audio file to play is assumed to be available as NSURL *url
NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithBool:NO]
forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
AVURLAsset *urlAsset = [AVURLAsset URLAssetWithURL:url options:options];
NSString *tracksKey = @"tracks";
[urlAsset loadValuesAsynchronouslyForKeys:[NSArray arrayWithObject:tracksKey] completionHandler:
^{
// Completion handler block.
dispatch_async(dispatch_get_main_queue(),
^{
NSError *error = nil;
AVKeyValueStatus status = [urlAsset statusOfValueForKey:tracksKey error:&error];
if (status == AVKeyValueStatusLoaded) {
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithAsset:urlAsset];
self.avPlayer = [AVPlayer playerWithPlayerItem:playerItem];
[self.avPlayer play];
self.avPlayer.rate = 0.5; // 0.5 = half speed; setting rate also resumes playback at that rate
}
else {
// You should deal with the error appropriately.
NSLog(@"The asset's tracks were not loaded:\n%@", [error localizedDescription]);
}
});
}];
Thanks, but I have solved it as follows:
Install cocos2d (http://www.cocos2d-iphone.org/download) and use SimpleAudioEngine:
[[SimpleAudioEngine sharedEngine] playEffect:(NSString *)filePath pitch:(Float32)pitch pan:(Float32)pan gain:(Float32)gain];
It's simple and it works.
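For example, playing an effect faster and higher-pitched would look something like this (a sketch; the file name is hypothetical):
[[SimpleAudioEngine sharedEngine] playEffect:@"mySound.wav" pitch:1.5f pan:0.0f gain:1.0f];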
In Swift 3.0 it will be like this:
audioP = try! AVAudioPlayer(contentsOf: URL(fileURLWithPath: selectedPath), fileTypeHint: "caf")
audioP.enableRate = true
audioP.prepareToPlay()
audioP.rate = 1.5
audioP.play()
Hope this helps :)
I've been trying to play music in my SpriteKit game and used the AVAudioPlayerNode class to do so via AVAudioPCMBuffers. Every time I exported my OS X project, it would crash and give me an error regarding audio playback. After banging my head against the wall for the last 24 hours, I decided to re-watch WWDC session 501 (see 54:17). My solution to this problem was what the presenter used, which is to break the audio file into smaller buffer-sized pieces as it is read.
NSError *error = nil;
NSURL *someFileURL = ...
AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading: someFileURL commonFormat: AVAudioPCMFormatFloat32 interleaved: NO error:&error];
const AVAudioFrameCount kBufferFrameCapacity = 128 * 1024L;
AVAudioFramePosition fileLength = audioFile.length;
AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat: audioFile.processingFormat frameCapacity: kBufferFrameCapacity];
while (audioFile.framePosition < fileLength) {
AVAudioFramePosition readPosition = audioFile.framePosition;
if (![audioFile readIntoBuffer: readBuffer error: &error])
return NO;
if (readBuffer.frameLength == 0) //end of file reached
break;
}
My current problem is that the player only plays the last frame read into the buffer. The music that I'm playing is only 2 minutes long. Apparently, this is too long to just read into the buffer outright. Is the buffer being overwritten every time the readIntoBuffer: method is called inside the loop? I'm such a noob at this stuff...how can I get the entire file played?
If I can't get this to work, what is a good way to play music (2 different files) across multiple SKScenes?
This is the solution that I came up with. It's still not perfect, but hopefully it will help someone who is in the same predicament that I've found myself in. I created a singleton class to handle this job. One improvement that can be made in the future is to only load sound effects and music files needed for a particular SKScene at the time they are needed. I had so many issues with this code that I don't want to mess with it now. Currently, I don't have too many sounds, so it's not using an excessive amount of memory.
Overview
My strategy was the following:
Store the audio file names for the game in a plist
Read from that plist and create two dictionaries (one for music and one for short sound effects)
The sound effect dictionary is composed of an AVAudioPCMBuffer and an AVAudioPlayerNode for each of the sounds
The music dictionary is composed of an array of AVAudioPCMBuffers, an array of timestamps for when those buffers should be played in the queue, an AVAudioPlayerNode, and the sample rate of the original audio file
The sample rate is necessary for figuring out the time at which each buffer should be played (you'll see the calculations done in code)
Create an AVAudioEngine and get the main mixer from the engine and attach all AVAudioPlayerNodes to the mixer (as per usual)
Play sound effects or music using their various methods
Sound effect playback is straightforward: call the method -(void) playSfxFile:(NSString*)file;
and it plays a sound
For music, I just couldn't find a good solution without enlisting the help of the scene that wants to play the music. The scene calls -(void) playMusicFile:(NSString*)file; and it schedules the buffers to play in the order they were created. I couldn't find a good way to make the music repeat once completed within my AudioEngine class, so I have the scene check in its update: method whether the music for a particular file is playing and, if not, play it again (not a very slick solution, but it works; a rough sketch of that check follows)
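A rough sketch of that per-frame check in the scene, using the AudioEngine interface shown below (the @"menuscenemusic" file name is just an example):
-(void)update:(NSTimeInterval)currentTime {
    if (![[AudioEngine sharedData] isPlayingMusic:@"menuscenemusic"]) {
        [[AudioEngine sharedData] playMusicFile:@"menuscenemusic"];
    }
}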
AudioEngine.h
#import <Foundation/Foundation.h>
@interface AudioEngine : NSObject
+(instancetype)sharedData;
-(void) playSfxFile:(NSString*)file;
-(void) playMusicFile:(NSString*)file;
-(void) pauseMusic:(NSString*)file;
-(void) unpauseMusic:(NSString*)file;
-(void) stopMusicFile:(NSString*)file;
-(void) setVolumePercentages;
-(bool) isPlayingMusic:(NSString*)file;
@end
AudioEngine.m
#import "AudioEngine.h"
#import <AVFoundation/AVFoundation.h>
#import "GameData.h" //this is a class that I use to store game data (in this case it is being used to get the user preference for volume amount)
@interface AudioEngine()
@property AVAudioEngine *engine;
@property AVAudioMixerNode *mixer;
@property NSMutableDictionary *musicDict;
@property NSMutableDictionary *sfxDict;
@property NSString *audioInfoPList;
@property float musicVolumePercent;
@property float sfxVolumePercent;
@property float fadeVolume;
@property float timerCount;
@end
@implementation AudioEngine
int const FADE_ITERATIONS = 10;
static NSString * const MUSIC_PLAYER = @"player";
static NSString * const MUSIC_BUFFERS = @"buffers";
static NSString * const MUSIC_FRAME_POSITIONS = @"framePositions";
static NSString * const MUSIC_SAMPLE_RATE = @"sampleRate";
static NSString * const SFX_BUFFER = @"buffer";
static NSString * const SFX_PLAYER = @"player";
+(instancetype) sharedData {
static AudioEngine *sharedInstance = nil;
static dispatch_once_t onceToken;
dispatch_once(&onceToken, ^{
sharedInstance = [[self alloc] init];
[sharedInstance startEngine];
});
return sharedInstance;
}
-(instancetype) init {
if (self = [super init]) {
_engine = [[AVAudioEngine alloc] init];
_mixer = [_engine mainMixerNode];
_audioInfoPList = [[NSBundle mainBundle] pathForResource:@"AudioInfo" ofType:@"plist"]; //open a plist called AudioInfo.plist
[self setVolumePercentages]; //this is created to set the user's preference in terms of how loud sound fx and music should be played
[self initMusic];
[self initSfx];
}
return self;
}
//opens all music files, creates multiple buffers depending on the length of the file and a player
-(void) initMusic {
_musicDict = [NSMutableDictionary dictionary];
_audioInfoPList = [[NSBundle mainBundle] pathForResource: @"AudioInfo" ofType: @"plist"];
NSDictionary *audioInfoData = [NSDictionary dictionaryWithContentsOfFile:_audioInfoPList];
for (NSString *musicFileName in audioInfoData[@"music"]) {
[self loadMusicIntoBuffer:musicFileName];
AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
[_engine attachNode:player];
AVAudioPCMBuffer *buffer = [[_musicDict[musicFileName] objectForKey:MUSIC_BUFFERS] objectAtIndex:0];
[_engine connect:player to:_mixer format:buffer.format];
[_musicDict[musicFileName] setObject:player forKey:MUSIC_PLAYER];
}
}
//opens a music file and creates an array of buffers
-(void) loadMusicIntoBuffer:(NSString *)filename
{
NSURL *audioFileURL = [[NSBundle mainBundle] URLForResource:filename withExtension:@"aif"];
//NSURL *audioFileURL = [NSURL URLWithString:[[NSBundle mainBundle] pathForResource:filename ofType:@"aif"]];
NSAssert(audioFileURL, @"Error creating URL to audio file");
NSError *error = nil;
AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:audioFileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error];
NSAssert(audioFile != nil, @"Error creating audioFile, %@", error.localizedDescription);
AVAudioFramePosition fileLength = audioFile.length; //frame length of the audio file
float sampleRate = audioFile.fileFormat.sampleRate; //sample rate (in Hz) of the audio file
[_musicDict setObject:[NSMutableDictionary dictionary] forKey:filename];
[_musicDict[filename] setObject:[NSNumber numberWithDouble:sampleRate] forKey:MUSIC_SAMPLE_RATE];
NSMutableArray *buffers = [NSMutableArray array];
NSMutableArray *framePositions = [NSMutableArray array];
const AVAudioFrameCount kBufferFrameCapacity = 1024 * 1024L; //the size of my buffer...can be made bigger or smaller 512 * 1024L would be half the size
while (audioFile.framePosition < fileLength) { //each iteration reads in kBufferFrameCapacity frames of the audio file and stores it in a buffer
[framePositions addObject:[NSNumber numberWithLongLong:audioFile.framePosition]];
AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFile.processingFormat frameCapacity:kBufferFrameCapacity];
if (![audioFile readIntoBuffer:readBuffer error:&error]) {
NSLog(#"failed to read audio file: %#", error);
return;
}
if (readBuffer.frameLength == 0) { //if we've come to the end of the file, end the loop
break;
}
[buffers addObject:readBuffer];
}
[_musicDict[filename] setObject:buffers forKey:MUSIC_BUFFERS];
[_musicDict[filename] setObject:framePositions forKey:MUSIC_FRAME_POSITIONS];
}
-(void) initSfx {
_sfxDict = [NSMutableDictionary dictionary];
NSDictionary *audioInfoData = [NSDictionary dictionaryWithContentsOfFile:_audioInfoPList];
for (NSString *sfxFileName in audioInfoData[@"sfx"]) {
AVAudioPlayerNode *player = [[AVAudioPlayerNode alloc] init];
[_engine attachNode:player];
[self loadSoundIntoBuffer:sfxFileName];
AVAudioPCMBuffer *buffer = [_sfxDict[sfxFileName] objectForKey:SFX_BUFFER];
[_engine connect:player to:_mixer format:buffer.format];
[_sfxDict[sfxFileName] setObject:player forKey:SFX_PLAYER];
}
}
//WARNING: make sure that the sound fx file is small (roughly under 30 sec) otherwise the archived version of the app will crash because the buffer ran out of space
-(void) loadSoundIntoBuffer:(NSString *)filename
{
NSURL *audioFileURL = [NSURL URLWithString:[[NSBundle mainBundle] pathForResource:filename ofType:@"mp3"]];
NSAssert(audioFileURL, @"Error creating URL to audio file");
NSError *error = nil;
AVAudioFile *audioFile = [[AVAudioFile alloc] initForReading:audioFileURL commonFormat:AVAudioPCMFormatFloat32 interleaved:NO error:&error];
NSAssert(audioFile != nil, @"Error creating audioFile, %@", error.localizedDescription);
AVAudioPCMBuffer *readBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFile.processingFormat frameCapacity:(AVAudioFrameCount)audioFile.length];
[audioFile readIntoBuffer:readBuffer error:&error];
[_sfxDict setObject:[NSMutableDictionary dictionary] forKey:filename];
[_sfxDict[filename] setObject:readBuffer forKey:SFX_BUFFER];
}
-(void)startEngine {
[_engine startAndReturnError:nil];
}
-(void) playSfxFile:(NSString*)file {
AVAudioPlayerNode *player = [_sfxDict[file] objectForKey:SFX_PLAYER];
AVAudioPCMBuffer *buffer = [_sfxDict[file] objectForKey:SFX_BUFFER];
[player scheduleBuffer:buffer atTime:nil options:AVAudioPlayerNodeBufferInterrupts completionHandler:nil];
[player setVolume:_sfxVolumePercent];
[player play];
}
-(void) playMusicFile:(NSString*)file {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
if ([player isPlaying] == NO) {
NSArray *buffers = [_musicDict[file] objectForKey:MUSIC_BUFFERS];
double sampleRate = [[_musicDict[file] objectForKey:MUSIC_SAMPLE_RATE] doubleValue];
for (int i = 0; i < [buffers count]; i++) {
long long framePosition = [[[_musicDict[file] objectForKey:MUSIC_FRAME_POSITIONS] objectAtIndex:i] longLongValue];
AVAudioTime *time = [AVAudioTime timeWithSampleTime:framePosition atRate:sampleRate];
AVAudioPCMBuffer *buffer = [buffers objectAtIndex:i];
[player scheduleBuffer:buffer atTime:time options:AVAudioPlayerNodeBufferInterrupts completionHandler:^{
if (i == [buffers count] - 1) {
[player stop];
}
}];
[player setVolume:_musicVolumePercent];
[player play];
}
}
}
-(void) stopOtherMusicPlayersNotNamed:(NSString*)file {
if ([file isEqualToString:@"menuscenemusic"]) {
AVAudioPlayerNode *player = [_musicDict[@"levelscenemusic"] objectForKey:MUSIC_PLAYER];
[player stop];
}
else {
AVAudioPlayerNode *player = [_musicDict[@"menuscenemusic"] objectForKey:MUSIC_PLAYER];
[player stop];
}
}
//stops the player for a particular sound
-(void) stopMusicFile:(NSString*)file {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
if ([player isPlaying]) {
_timerCount = FADE_ITERATIONS;
_fadeVolume = _musicVolumePercent;
[self fadeOutMusicForPlayer:player]; //fade out the music
}
}
//helper method for stopMusicFile:
-(void) fadeOutMusicForPlayer:(AVAudioPlayerNode*)player {
[NSTimer scheduledTimerWithTimeInterval:0.1 target:self selector:@selector(handleTimer:) userInfo:player repeats:YES];
}
//helper method for stopMusicFile:
-(void) handleTimer:(NSTimer*)timer {
AVAudioPlayerNode *player = (AVAudioPlayerNode*)timer.userInfo;
if (_timerCount > 0) {
_timerCount--;
_fadeVolume = _musicVolumePercent * (_timerCount / FADE_ITERATIONS);
[player setVolume:_fadeVolume];
}
else {
[player stop];
[player setVolume:_musicVolumePercent];
[timer invalidate];
}
}
-(void) pauseMusic:(NSString*)file {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
if ([player isPlaying]) {
[player pause];
}
}
-(void) unpauseMusic:(NSString*)file {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
[player play];
}
//sets the volume of the player based on user preferences in GameData class
-(void) setVolumePercentages {
NSString *musicVolumeString = [[GameData sharedGameData].settings objectForKey:@"musicVolume"];
_musicVolumePercent = [[[musicVolumeString componentsSeparatedByCharactersInSet:
[[NSCharacterSet decimalDigitCharacterSet] invertedSet]]
componentsJoinedByString:@""] floatValue] / 100;
NSString *sfxVolumeString = [[GameData sharedGameData].settings objectForKey:@"sfxVolume"];
_sfxVolumePercent = [[[sfxVolumeString componentsSeparatedByCharactersInSet:
[[NSCharacterSet decimalDigitCharacterSet] invertedSet]]
componentsJoinedByString:@""] floatValue] / 100;
//immediately sets music to new volume
for (NSString *file in [_musicDict allKeys]) {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
[player setVolume:_musicVolumePercent];
}
}
-(bool) isPlayingMusic:(NSString *)file {
AVAudioPlayerNode *player = [_musicDict[file] objectForKey:MUSIC_PLAYER];
if ([player isPlaying])
return YES;
return NO;
}
@end
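With that in place, a scene can trigger audio through the singleton, for example (these file names are placeholders that must match entries in AudioInfo.plist):
[[AudioEngine sharedData] playSfxFile:@"jump"];
[[AudioEngine sharedData] playMusicFile:@"menuscenemusic"];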
I am using AVCaptureSession to capture video from a devices camera and then using AVAssetWriterInput and AVAssetTrack to compress/resize the video before uploading it to a server. The final videos will be viewed on the web via an html5 video element.
I'm running into multiple issues trying to get the orientation of the video correct. My app only supports landscape orientation and all captured videos should be in landscape orientation. However, I would like to allow the user to hold their device in either landscape direction (i.e. home button on either the left or the right hand side).
I am able to make the video preview show in the correct orientation with the following line of code
_previewLayer.connection.videoOrientation = UIDevice.currentDevice.orientation;
The problems start when processing the video via AVAssetWriterInput and friends. The result does not seem to account for the landscape-left vs. landscape-right mode the video was captured in. In other words, sometimes the video comes out upside down. After some googling I found many people suggesting that the following line of code would solve this issue
writerInput.transform = videoTrack.preferredTransform;
...but this doesn't seem to work. After a bit of debugging I found that videoTrack.preferredTransform is always the same value, regardless of the orientation the video was captured in.
I tried manually tracking which orientation the video was captured in and setting writerInput.transform to CGAffineTransformMakeRotation(M_PI) as needed, which solved the problem!
...sorta
When I viewed the results on the device, this solution worked as expected. Videos were right-side-up regardless of left vs. right orientation while recording. Unfortunately, when I viewed the exact same videos in another browser (Chrome on a MacBook), they were all upside down!
What am I doing wrong?
EDIT
Here's some code, in case it's helpful...
-(void)compressFile:(NSURL*)inUrl;
{
NSString* fileName = [@"compressed." stringByAppendingString:inUrl.lastPathComponent];
NSError* error;
NSURL* outUrl = [PlatformHelper getFilePath:fileName error:&error];
NSDictionary* compressionSettings = @{ AVVideoProfileLevelKey: AVVideoProfileLevelH264Main31,
AVVideoAverageBitRateKey: [NSNumber numberWithInt:2500000],
AVVideoMaxKeyFrameIntervalKey: [NSNumber numberWithInt: 30] };
NSDictionary* videoSettings = @{ AVVideoCodecKey: AVVideoCodecH264,
AVVideoWidthKey: [NSNumber numberWithInt:1280],
AVVideoHeightKey: [NSNumber numberWithInt:720],
AVVideoScalingModeKey: AVVideoScalingModeResizeAspectFill,
AVVideoCompressionPropertiesKey: compressionSettings };
NSDictionary* videoOptions = @{ (id)kCVPixelBufferPixelFormatTypeKey: [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] };
AVAssetWriterInput* writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
writerInput.expectsMediaDataInRealTime = YES;
AVAssetWriter* assetWriter = [AVAssetWriter assetWriterWithURL:outUrl fileType:AVFileTypeMPEG4 error:&error];
assetWriter.shouldOptimizeForNetworkUse = YES;
[assetWriter addInput:writerInput];
AVURLAsset* asset = [AVURLAsset URLAssetWithURL:inUrl options:nil];
AVAssetTrack* videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
// !!! this line does not work as expected and causes all sorts of issues (videos display sideways in some cases) !!!
//writerInput.transform = videoTrack.preferredTransform;
AVAssetReaderTrackOutput* readerOutput = [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:videoTrack outputSettings:videoOptions];
AVAssetReader* assetReader = [AVAssetReader assetReaderWithAsset:asset error:&error];
[assetReader addOutput:readerOutput];
[assetWriter startWriting];
[assetWriter startSessionAtSourceTime:kCMTimeZero];
[assetReader startReading];
[writerInput requestMediaDataWhenReadyOnQueue:_processingQueue usingBlock:
^{
/* snip */
}];
}
The problem is that modifying the writerInput.transform property only adds a tag to the video file's metadata which instructs the video player to rotate the file during playback. That's why the videos play in the correct orientation on your device (I'm guessing they also play correctly in QuickTime Player).
The pixel buffers captured by the camera are still laid out in the orientation in which they were captured. Many video players will not check for the preferred orientation metadata tag and will just play the file in the native pixel orientation.
If you want the user to be able to record video holding the phone in either landscape mode, you need to rectify this at the AVCaptureSession level before compression by performing a transform on the CVPixelBuffer of each video frame. This Apple Q&A covers it (look at the AVCaptureVideoDataOutput documentation as well):
https://developer.apple.com/library/ios/qa/qa1744/_index.html
Investigating the link above is the correct way to solve your problem. An alternate fast n' dirty way to solve the same problem would be to lock the recording UI of your app into only one landscape orientation and then to rotate all of your videos server-side using ffmpeg.
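For the capture-side route, the essential step is to set the orientation on the video connection before frames reach the writer; a minimal sketch (assuming an AVCaptureVideoDataOutput named videoOut), which is essentially what the helper class below does in its setOrientation: method:
AVCaptureConnection *connection = [videoOut connectionWithMediaType:AVMediaTypeVideo];
if (connection.isVideoOrientationSupported) {
    connection.videoOrientation = AVCaptureVideoOrientationLandscapeRight; // pick the orientation that matches the UI
}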
In case it's helpful for anyone, here's the code I ended up with. I had to do the work on the video as it was being captured rather than as a post-processing step. This is a helper class that manages the capture.
Interface
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>
@interface VideoCaptureManager : NSObject<AVCaptureVideoDataOutputSampleBufferDelegate>
{
AVCaptureSession* _captureSession;
AVCaptureVideoPreviewLayer* _previewLayer;
AVCaptureVideoDataOutput* _videoOut;
AVCaptureDevice* _videoDevice;
AVCaptureDeviceInput* _videoIn;
dispatch_queue_t _videoProcessingQueue;
AVAssetWriter* _assetWriter;
AVAssetWriterInput* _writerInput;
BOOL _isCapturing;
NSString* _gameId;
NSString* _authToken;
}
-(void)setSettings:(NSString*)gameId authToken:(NSString*)authToken;
-(void)setOrientation:(AVCaptureVideoOrientation)orientation;
-(AVCaptureVideoPreviewLayer*)getPreviewLayer;
-(void)startPreview;
-(void)stopPreview;
-(void)startCapture;
-(void)stopCapture;
@end
Implementation (w/ a bit of editing and a few little TODO's)
@implementation VideoCaptureManager
-(id)init;
{
self = [super init];
if (self) {
NSError* error;
_videoProcessingQueue = dispatch_queue_create("VideoQueue", DISPATCH_QUEUE_SERIAL);
_captureSession = [AVCaptureSession new];
_videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
_previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:_captureSession];
[_previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
_videoOut = [AVCaptureVideoDataOutput new];
_videoOut.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey: [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange] };
_videoOut.alwaysDiscardsLateVideoFrames = YES;
_videoIn = [AVCaptureDeviceInput deviceInputWithDevice:_videoDevice error:&error];
// handle errors here
[_captureSession addInput:_videoIn];
[_captureSession addOutput:_videoOut];
}
return self;
}
-(void)setOrientation:(AVCaptureVideoOrientation)orientation;
{
_previewLayer.connection.videoOrientation = orientation;
for (AVCaptureConnection* item in _videoOut.connections) {
item.videoOrientation = orientation;
}
}
-(AVCaptureVideoPreviewLayer*)getPreviewLayer;
{
return _previewLayer;
}
-(void)startPreview;
{
[_captureSession startRunning];
}
-(void)stopPreview;
{
[_captureSession stopRunning];
}
-(void)startCapture;
{
if (_isCapturing) return;
NSURL* url = nil; // put code here to create your output URL
NSDictionary* compressionSettings = @{ AVVideoProfileLevelKey: AVVideoProfileLevelH264Main31,
AVVideoAverageBitRateKey: [NSNumber numberWithInt:2500000],
AVVideoMaxKeyFrameIntervalKey: [NSNumber numberWithInt: 1],
};
NSDictionary* videoSettings = @{ AVVideoCodecKey: AVVideoCodecH264,
AVVideoWidthKey: [NSNumber numberWithInt:1280],
AVVideoHeightKey: [NSNumber numberWithInt:720],
AVVideoScalingModeKey: AVVideoScalingModeResizeAspectFill,
AVVideoCompressionPropertiesKey: compressionSettings
};
_writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:videoSettings];
_writerInput.expectsMediaDataInRealTime = YES;
NSError* error;
_assetWriter = [AVAssetWriter assetWriterWithURL:url fileType:AVFileTypeMPEG4 error:&error];
// handle errors
_assetWriter.shouldOptimizeForNetworkUse = YES;
[_assetWriter addInput:_writerInput];
[_videoOut setSampleBufferDelegate:self queue:_videoProcessingQueue];
_isCapturing = YES;
}
-(void)stopCapture;
{
if (!_isCapturing) return;
[_videoOut setSampleBufferDelegate:nil queue:nil]; // TODO: seems like there could be a race condition between this line and the next (could end up trying to write a buffer after calling writingFinished)
dispatch_async(_videoProcessingQueue, ^{
[_assetWriter finishWritingWithCompletionHandler:^{
[self writingFinished];
}];
});
}
-(void)writingFinished;
{
// TODO: need to check _assetWriter.status to make sure everything completed successfully
// do whatever post processing you need here
}
-(void)captureOutput:(AVCaptureOutput*)captureOutput didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection*)connection;
{
NSLog(#"Video frame was dropped.");
}
-(void)captureOutput:(AVCaptureOutput*)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection
{
if(_assetWriter.status != AVAssetWriterStatusWriting) {
CMTime lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
[_assetWriter startWriting]; // TODO: need to check the return value (a bool)
[_assetWriter startSessionAtSourceTime:lastSampleTime];
}
if (!_writerInput.readyForMoreMediaData || ![_writerInput appendSampleBuffer:sampleBuffer]) {
NSLog(#"Failed to write video buffer to output.");
}
}
@end
For compressing/resizing the video, we can use AVAssetExportSession.
We can upload a video of up to about 3.30 minutes in duration.
If the video is longer than that, it will show a memory warning.
Since we are not applying any transform to the video here, it stays as it was recorded.
Below is the sample code for compressing the video.
We can check the video size before and after compression.
-(void)trimVideoWithURL:(NSURL *)inputURL{
NSString *path1 = [inputURL path];
NSData *data = [[NSFileManager defaultManager] contentsAtPath:path1];
NSLog(#"size before compress video is %lu",(unsigned long)data.length);
AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];
AVAssetExportSession *exportSession = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPreset640x480];
NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *outputURL = paths[0];
NSFileManager *manager = [NSFileManager defaultManager];
[manager createDirectoryAtPath:outputURL withIntermediateDirectories:YES attributes:nil error:nil];
outputURL = [outputURL stringByAppendingPathComponent:@"output.mp4"];
fullPath = [NSURL fileURLWithPath:outputURL]; // fullPath is an NSURL declared elsewhere (e.g. an ivar)
// Remove Existing File
[manager removeItemAtPath:outputURL error:nil];
exportSession.outputURL = [NSURL fileURLWithPath:outputURL];
exportSession.shouldOptimizeForNetworkUse = YES;
exportSession.outputFileType = AVFileTypeQuickTimeMovie;
CMTime start = CMTimeMakeWithSeconds(1.0, 600);
CMTime duration = CMTimeMakeWithSeconds(1.0, 600);
CMTimeRange range = CMTimeRangeMake(start, duration);
exportSession.timeRange = range;
[exportSession exportAsynchronouslyWithCompletionHandler:^(void)
{
switch (exportSession.status) {
case AVAssetExportSessionStatusCompleted:{
NSString *path = [fullPath path];
NSData *data = [[NSFileManager defaultManager] contentsAtPath:path];
NSLog(#"size after compress video is %lu",(unsigned long)data.length);
NSLog(#"Export Complete %d %#", exportSession.status, exportSession.error);
/*
Do your neccessay stuff here after compression
*/
}
break;
case AVAssetExportSessionStatusFailed:
NSLog(#"Failed:%#",exportSession.error);
break;
case AVAssetExportSessionStatusCancelled:
NSLog(#"Canceled:%#",exportSession.error);
break;
default:
break;
}
}];
}
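For example, the method above might be called with the URL of a freshly recorded file (recordedVideoURL is a placeholder):
[self trimVideoWithURL:recordedVideoURL];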
I want to take a screenshot of a video in my iPad app.
I searched on SO and found a lot of sample code. I tried everything, but nothing seems to work.
I tried all of these methods:
1) Try with : MPMoviePlayerController
- (void) previewWithPlayer:(NSString*)path image:(UIImageView*)imView
{
MPMoviePlayerController *player = [[MPMoviePlayerController alloc] initWithContentURL:[NSURL URLWithString:path]];
UIImage *thumbnail = [player thumbnailImageAtTime:1.0 timeOption:MPMovieTimeOptionNearestKeyFrame];
[player stop];
[player release];
imView.image = thumbnail;
}
2) Try with : AVAssetImageGenerator - v1
- (void) generateImage:(NSString*)path image:(UIImageView*)imView
{
AVAsset *asset = [AVAsset assetWithURL:[NSURL URLWithString:path]];
AVAssetImageGenerator *imageGenerator = [[AVAssetImageGenerator alloc]initWithAsset:asset];
CMTime time = CMTimeMake(1, 1);
UIImage *thumbnail = [UIImage imageWithCGImage:[imageGenerator copyCGImageAtTime:time actualTime:NULL error:NULL]];
imView.image = thumbnail;
}
3) Try with : AVAssetImageGenerator - v2
- (void) generateImage:(NSString*)path image:(UIImageView*)imView
{
AVURLAsset *asset = [[AVURLAsset alloc] initWithURL:[NSURL URLWithString:path] options:nil];
AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:asset];
generator.appliesPreferredTrackTransform=TRUE;
[asset release];
CMTime thumbTime = CMTimeMakeWithSeconds(2,30);
AVAssetImageGeneratorCompletionHandler handler = ^(CMTime requestedTime, CGImageRef im, CMTime actualTime, AVAssetImageGeneratorResult result, NSError *error){
if (result != AVAssetImageGeneratorSucceeded) {
NSLog(#"couldn't generate thumbnail, error:%#", error);
}
UIImage *thumbImg = [[UIImage imageWithCGImage:im] retain];
imView.image = thumbImg;
[generator release];
};
CGSize maxSize = CGSizeMake(320, 180);
generator.maximumSize = maxSize;
[generator generateCGImagesAsynchronouslyForTimes:[NSArray arrayWithObject:[NSValue valueWithCMTime:thumbTime]] completionHandler:handler];
}
But nothing works.
I tried with MOV and MP4; nothing.
The path is correct and the video plays fine.
NSString *fPath = [[NSBundle mainBundle] pathForResource:@"VideoA" ofType:@"mp4"];
NSLog(@"%@", fPath);
[self generateImage:fPath image:_ImgA];
What could be the problem? My image view shows nothing and no errors are returned.
iOS is 6.0/5.1, on iPad simulator/device.
Video is 854×480 pixels, H.264, AAC. About 30 MB in size.
Please help me, because I'm going crazy with this issue.
Thanks.
Edit:
On the device it returns this error:
couldn't generate thumbnail, error:Error Domain=NSURLErrorDomain
Code=-1 "unknown error" UserInfo=0x1e0a2f30
{NSUnderlyingError=0x1e0a3900 "The operation couldn’t be completed.
(OSStatus error -12935.)", NSLocalizedDescription=unknown error}
Solved.
The trick: use fileURLWithPath:, not URLWithString:. Apparently the difference is really, really significant.
thanks to Noah. https://stackoverflow.com/a/4201419/88461
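For illustration, the fix in the generateImage: methods above boils down to building a proper file URL (a sketch):
AVAsset *asset = [AVAsset assetWithURL:[NSURL fileURLWithPath:path]];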
I have a TabBarController with two tabs and I want to play music on both tabs. Right now I have my code in the main AppDelegate:
NSURL *url = [NSURL fileURLWithPath:[[NSBundle mainBundle]
pathForResource:@"My Song"
ofType:@"m4a"]]; // My Song.m4a
NSError *error;
self.audioPlayer = [[AVAudioPlayer alloc]
initWithContentsOfURL:url
error:&error];
if (error)
{
NSLog(#"Error in audioPlayer: %#",
[error localizedDescription]);
} else {
//audioPlayer.delegate = self;
[audioPlayer prepareToPlay];
}
but I'm getting the error Program received signal: "SIGABRT" on UIApplicationMain
Is there a better way to accomplish what I'm trying to do? If this is how I should do it, where do I start checking for problems?
Yes, you can use AVAudioPlayer in the app delegate.
What you need to do is:
In your AppDelegate.h file:
#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>
AVAudioPlayer *_backgroundMusicPlayer;
BOOL _backgroundMusicPlaying;
BOOL _backgroundMusicInterrupted;
UInt32 _otherMusicIsPlaying;
Make a backgroundMusicPlayer property and synthesize it (see the sketch just below).
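For example, the property declaration in AppDelegate.h might look like this (a sketch; ARC is assumed):
@property (nonatomic, strong) AVAudioPlayer *backgroundMusicPlayer;
with a matching @synthesize backgroundMusicPlayer = _backgroundMusicPlayer; in the .m file.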
In your AppDelegate.m file:
Add these lines in the application:didFinishLaunchingWithOptions: method:
NSError *setCategoryError = nil;
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryAmbient error:&setCategoryError];
// Create audio player with background music
NSString *backgroundMusicPath = [[NSBundle mainBundle] pathForResource:@"SplashScreen" ofType:@"wav"];
NSURL *backgroundMusicURL = [NSURL fileURLWithPath:backgroundMusicPath];
NSError *error;
_backgroundMusicPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:backgroundMusicURL error:&error];
[_backgroundMusicPlayer setDelegate:self]; // We need this so we can restart after interruptions
[_backgroundMusicPlayer setNumberOfLoops:-1]; // Negative number means loop forever
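You'll presumably also want to start playback once setup is done, e.g. by calling the tryPlayMusic helper shown further below at the end of didFinishLaunching:
[self tryPlayMusic];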
Now implement delegate methods
#pragma mark -
#pragma mark AVAudioPlayer delegate methods
- (void) audioPlayerBeginInterruption: (AVAudioPlayer *) player {
_backgroundMusicInterrupted = YES;
_backgroundMusicPlaying = NO;
}
- (void) audioPlayerEndInterruption: (AVAudioPlayer *) player {
if (_backgroundMusicInterrupted) {
[self tryPlayMusic];
_backgroundMusicInterrupted = NO;
}
}
- (void)tryPlayMusic {
// Check to see if iPod music is already playing
UInt32 propertySize = sizeof(_otherMusicIsPlaying);
AudioSessionGetProperty(kAudioSessionProperty_OtherAudioIsPlaying, &propertySize, &_otherMusicIsPlaying);
// Play the music if no other music is playing and we aren't playing already
if (_otherMusicIsPlaying != 1 && !_backgroundMusicPlaying) {
[_backgroundMusicPlayer prepareToPlay];
if (soundsEnabled==YES) {
[_backgroundMusicPlayer play];
_backgroundMusicPlaying = YES;
}
}
}
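As a side note, on iOS 6 and later the "is other audio playing" check can presumably be done without the deprecated AudioSession C API:
BOOL otherAudioPlaying = [AVAudioSession sharedInstance].isOtherAudioPlaying;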
Hello
I have 16 sounds in a view, and they loop. I want a button that stops all of the sounds when it is tapped.
Here is the code I used for one of the sounds; it's the same for the rest.
- (IBAction)twoSound:(id)sender; {
if (twoAudio && twoAudio.playing) {
[twoAudio stop];
[twoAudio release];
twoAudio = nil;
return;
}
NSString *path = [[NSBundle mainBundle] pathForResource:@"2" ofType:@"wav"];
if (twoAudio) [twoAudio release];
NSError *error = nil;
twoAudio = [[AVAudioPlayer alloc] initWithContentsOfURL:[NSURL fileURLWithPath:path] error:&error];
if (error)
NSLog(#"%#",[error localizedDescription]);
twoAudio.delegate = self;
[twoAudio play];
}
I tried
-(IBAction)goStop:(id)sender; {
[oneAudio, twoAudio, threeAudio, fourAudio, fiveAudio, sixAudio, sevenAudio, eightAudio, nineAudio, tenAudio, elevenAudio, twelveAudio, thirteenAudio, fourteenAudio, fifthteenAudio, sixteenAudio stop];
}
But that didn't work.
Thanks
I think you have to use an NSArray instead of many separate sound objects. You can easily fill the array with the 16 sounds and then use a for loop to stop them all; a sketch follows.
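A minimal sketch of that approach (assuming the sixteen players have been collected into a hypothetical NSArray called audioPlayers):
// stop every player that is currently playing
for (AVAudioPlayer *player in audioPlayers) {
    if (player.playing) {
        [player stop];
    }
}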