I've spent the last two weeks just trying to resolve this problem. So frustrating!
The following two functions are what I'm using to fetch an image from the device library.
If I call the "setImage" function multiple times, I keep losing free memory on my iOS device.
I think "[imageFromAsset initWithCGImage:[[myasset defaultRepresentation] fullScreenImage]];" in the assetImage function causes the problem.
Can anyone help me? Any clues or ideas would be SUPER appreciated!
- (void)setImage:(NSURL *)imageURL {
    UIImage *imageFromDeviceLibrary = [[UIImage alloc] init];
    [[DevicePhotoControl sharedInstance] assetImage:[imageURL absoluteString] imageToStore:imageFromDeviceLibrary];
    UIImageView *fullImageView = [[UIImageView alloc] initWithImage:imageFromDeviceLibrary];
    [imageFromDeviceLibrary release];
    [self.view addSubview:fullImageView];
    [fullImageView release];
}
- (void)assetImage:(NSString *)assetURL imageToStore:(UIImage *)imageFromAsset {
    // Handling exception case for when it doesn't have assets-library
    if ([assetURL hasPrefix:@"assets-library:"] == NO) {
        assetURL = [NSString stringWithFormat:@"%@%@", @"assets-library:", assetURL];
    }
    __block BOOL busy = YES;
    ALAssetsLibrary *assetslibrary = [[[ALAssetsLibrary alloc] init] autorelease];
    // get image data by URL
    ALAssetsLibraryAssetForURLResultBlock resultblock = ^(ALAsset *myasset)
    {
        [imageFromAsset initWithCGImage:[[myasset defaultRepresentation] fullScreenImage]];
        busy = NO;
    };
    ALAssetsLibraryAccessFailureBlock failureblock = ^(NSError *myerror)
    {
        NSLog(@"Library Image Fetching Failed : %@", [myerror localizedDescription]);
        busy = NO;
    };
    [assetslibrary assetForURL:[NSURL URLWithString:assetURL]
                   resultBlock:resultblock
                  failureBlock:failureblock];
    while (busy == YES) {
        [[NSRunLoop currentRunLoop] runMode:NSDefaultRunLoopMode beforeDate:[NSDate distantFuture]];
    }
}
The moment the ALAssetsLibrary is released, all of the asset objects that came from it become invalid.
I suggest you create your assets library in the app delegate to keep it alive, and only reset it when you receive the ALAssetsLibraryChangedNotification change notification from the library. That may help you.
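A minimal sketch of that suggestion, assuming manual reference counting as in the question's code (the assetsLibrary property name and assetsLibraryChanged: selector are illustrative, not from the original answer):
// AppDelegate.h (sketch)
#import <AssetsLibrary/AssetsLibrary.h>

@interface AppDelegate : UIResponder <UIApplicationDelegate>
@property (nonatomic, retain) ALAssetsLibrary *assetsLibrary;
@end

// AppDelegate.m (sketch)
@implementation AppDelegate

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions {
    // Keep one library alive for the whole app; assets stay valid as long as it does.
    self.assetsLibrary = [[[ALAssetsLibrary alloc] init] autorelease];
    // Recreate the library (and refetch any cached assets) when its contents change.
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(assetsLibraryChanged:)
                                                 name:ALAssetsLibraryChangedNotification
                                               object:nil];
    return YES;
}

- (void)assetsLibraryChanged:(NSNotification *)notification {
    self.assetsLibrary = [[[ALAssetsLibrary alloc] init] autorelease];
}

@end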
I have an iOS app with a simple UIView placed in the view controller. I am trying to show the feed from the front-facing camera in the UIView. I am not trying to take a picture or record a video; I simply want to show the live feed in a UIView.
I have tried to implement AVCaptureVideoPreviewLayer, but the feed I get is blank. Nothing seems to happen. Here is my code:
AVCaptureSession *session = [[AVCaptureSession alloc] init];
[session setSessionPreset:AVCaptureSessionPresetPhoto];
AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
NSError *error = nil;
AVCaptureDeviceInput *input;
@try {
    input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
} @catch (NSException *exception) {
    NSLog(@"Error; %@", error);
} @finally {
    if (error == nil) {
        if ([session canAddInput:input]) {
            [session addInput:input];
            AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
            [stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
            if ([session canAddOutput:stillImageOutput]) {
                [session setSessionPreset:AVCaptureSessionPresetHigh];
                [session addOutput:stillImageOutput];
                AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
                [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspect];
                [previewLayer.connection setVideoOrientation:AVCaptureVideoOrientationPortrait];
                [backgroundStreamView.layer addSublayer:previewLayer];
                [session startRunning];
                NSLog(@"session running");
            } else {
                NSLog(@"cannot add output");
            }
        } else {
            NSLog(@"cannot add input");
        }
    } else {
        NSLog(@"general error: %@", error);
    }
}
The session runs perfectly fine; however, no video feed is shown. What am I doing wrong?
Managed to fix it myself; it turned out to be a fairly simple issue - I didn't specify the frame of the AVCaptureVideoPreviewLayer, and as a result it was not appearing (presumably because it defaults to a zero-sized frame).
To fix this I set the frame to match the frame of my custom UIView:
[previewLayer setFrame:backgroundStreamView.bounds];
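Since a layer's frame does not resize with its host view automatically, one option is to keep them in sync whenever layout changes; a small sketch, assuming previewLayer and backgroundStreamView are stored in properties (names not from the original post):
// Keep the preview layer sized to the host view on every layout pass.
- (void)viewDidLayoutSubviews {
    [super viewDidLayoutSubviews];
    self.previewLayer.frame = self.backgroundStreamView.bounds;
}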
Deprecation code fix
AVCaptureStillImageOutput is also deprecated, so to fix that, I replaced it with the AVCapturePhotoOutput class. Thus the code changed from:
AVCaptureStillImageOutput *stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
[stillImageOutput setOutputSettings:@{AVVideoCodecKey : AVVideoCodecJPEG}];
to the following:
AVCapturePhotoOutput *stillImageOutput = [[AVCapturePhotoOutput alloc] init];
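The capture call changes as well: AVCapturePhotoOutput delivers results through a delegate rather than a completion block. A minimal sketch of taking a photo with it (the takePhoto method name, the stillImageOutput property, and the delegate conformance are assumptions, not from the original answer):
// Assumes the view controller adopts AVCapturePhotoCaptureDelegate
// and keeps stillImageOutput in a property.
- (void)takePhoto {
    AVCapturePhotoSettings *settings = [AVCapturePhotoSettings photoSettings];
    [self.stillImageOutput capturePhotoWithSettings:settings delegate:self];
}

- (void)captureOutput:(AVCapturePhotoOutput *)output
didFinishProcessingPhoto:(AVCapturePhoto *)photo
                error:(NSError *)error {
    if (error) {
        NSLog(@"Photo capture failed: %@", error);
        return;
    }
    NSData *jpegData = [photo fileDataRepresentation]; // iOS 11+
    UIImage *image = [UIImage imageWithData:jpegData];
    // use image...
}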
I'm wondering how to free memory in this simple program that plays a file through a buffer and then stops it.
- (void)setupAudioOne
{
    NSError *error;
    BOOL success = NO;
    _player = [[AVAudioPlayerNode alloc] init];
    NSURL *hiphopOneURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"Hip Hop 1" ofType:@"caf"]];
    AVAudioFile *hiphopOneFile = [[AVAudioFile alloc] initForReading:hiphopOneURL error:&error];
    _playerLoopBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:[hiphopOneFile processingFormat] frameCapacity:(AVAudioFrameCount)[hiphopOneFile length]];
    success = [hiphopOneFile readIntoBuffer:_playerLoopBuffer error:&error];
    _engine = [[AVAudioEngine alloc] init];
    [_engine attachNode:_player];
    AVAudioMixerNode *mainMixer = [_engine mainMixerNode];
    AVAudioFormat *stereoFormat = [[AVAudioFormat alloc] initStandardFormatWithSampleRate:44100 channels:2];
    [_engine connect:_player to:mainMixer fromBus:0 toBus:0 format:stereoFormat];
    [self startEngine];
}
Above is the general setup of the engine and the player node.
Then we implement the player with a play button:
- (IBAction)play:(id)sender {
    if (!self.playerIsPlaying)
    {
        [self setupAudioOne];
        [_player scheduleBuffer:_playerLoopBuffer atTime:nil options:AVAudioPlayerNodeBufferLoops completionHandler:nil];
        [_player play];
    }
}
And finally we stop the player with a stop button:
- (IBAction)stopHipHopOne:(id)sender {
    if (self.playerIsPlaying) {
        [_player stop];
    }
}
playerIsPlaying is just a simple BOOL that tracks whether _player is playing.
So basically my question is: as this program is written now, no memory is freed when you hit the stop button.
Surely there is a simple line of code I can add to the stop button that frees up the memory the engine and player are using?
Any thoughts?
Yes, there is. After stopping the player node, you can call:
[_engine disconnectNodeInput:_player];
[_engine detachNode:_player];
I saw you're also keeping a reference to the audio buffer, so you might want to nil that one as well. Let me know if that doesn't work for you. Something else could be leaking.
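Putting that together with the existing stop action, a rough sketch (assuming ARC, as the question's code suggests, and the same _engine, _player, and _playerLoopBuffer ivars; whether you also discard the engine itself is up to you):
- (IBAction)stopHipHopOne:(id)sender {
    if (self.playerIsPlaying) {
        [_player stop];
        // Tear the node down so the engine releases its resources.
        [_engine disconnectNodeInput:_player];
        [_engine detachNode:_player];
        [_engine stop];
        // Drop our own strong references so the buffer, player, and engine can be deallocated.
        _playerLoopBuffer = nil;
        _player = nil;
        _engine = nil;
    }
}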
I am developing an iPhone app that requires pausing and resuming the camera, so I used AVFoundation for that instead of UIImagePickerController.
My code is:
- (void)startup:(BOOL)isFrontCamera
{
    if (_session == nil)
    {
        NSLog(@"Starting up server");
        self.isCapturing = NO;
        self.isPaused = NO;
        _currentFile = 0;
        _discont = NO;
        // create capture device with video input
        _session = [[AVCaptureSession alloc] init];
        AVCaptureDevice *cameraDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        if (isFrontCamera)
        {
            NSArray *videoDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
            AVCaptureDevice *captureDevice = nil;
            for (AVCaptureDevice *device in videoDevices)
            {
                if (device.position == AVCaptureDevicePositionFront)
                {
                    captureDevice = device;
                    break;
                }
            }
            cameraDevice = captureDevice;
        }
        cameraDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
        AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:cameraDevice error:nil];
        [_session addInput:input];
        // audio input from default mic
        AVCaptureDevice *mic = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
        AVCaptureDeviceInput *micinput = [AVCaptureDeviceInput deviceInputWithDevice:mic error:nil];
        [_session addInput:micinput];
        // create an output for YUV output with self as delegate
        _captureQueue = dispatch_queue_create("uk.co.gdcl.cameraengine.capture", DISPATCH_QUEUE_SERIAL);
        AVCaptureVideoDataOutput *videoout = [[AVCaptureVideoDataOutput alloc] init];
        [videoout setSampleBufferDelegate:self queue:_captureQueue];
        NSDictionary *setcapSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                        [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange], kCVPixelBufferPixelFormatTypeKey,
                                        nil];
        videoout.videoSettings = setcapSettings;
        [_session addOutput:videoout];
        _videoConnection = [videoout connectionWithMediaType:AVMediaTypeVideo];
        [_videoConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
        NSDictionary *actual = videoout.videoSettings;
        _cy = [[actual objectForKey:@"Width"] integerValue];
        _cx = [[actual objectForKey:@"Height"] integerValue];
        AVCaptureAudioDataOutput *audioout = [[AVCaptureAudioDataOutput alloc] init];
        [audioout setSampleBufferDelegate:self queue:_captureQueue];
        [_session addOutput:audioout];
        _audioConnection = [audioout connectionWithMediaType:AVMediaTypeAudio];
        [_session startRunning];
        _preview = [AVCaptureVideoPreviewLayer layerWithSession:_session];
        _preview.videoGravity = AVLayerVideoGravityResizeAspectFill;
    }
}
Here is where I'm facing the problem: when I call the above method after switching to the front camera, the preview layer gets stuck and no preview appears. My question is: can we change the capture device in the middle of a capture session? Please guide me on where I'm going wrong, or suggest a solution for switching between the front and back camera while recording.
Thanks in advance.
Yes, you can. There are just a few things you need to take care of:
You need to be using AVCaptureVideoDataOutput and its delegate for recording.
Make sure you remove the previous deviceInput before adding the new deviceInput.
You must remove and recreate the AVCaptureVideoDataOutput as well.
I am using these two functions for it right now and it works while the session is running.
- (void)configureVideoWithDevice:(AVCaptureDevice *)camera {
    [_session beginConfiguration];
    [_session removeInput:_videoInputDevice];
    _videoInputDevice = nil;
    _videoInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    if ([_session canAddInput:_videoInputDevice]) {
        [_session addInput:_videoInputDevice];
    }
    [_session removeOutput:_videoDataOutput];
    _videoDataOutput = nil;
    _videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [_videoDataOutput setSampleBufferDelegate:self queue:_outputQueueVideo];
    NSDictionary *setcapSettings = [NSDictionary dictionaryWithObjectsAndKeys:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange], kCVPixelBufferPixelFormatTypeKey, nil];
    _videoDataOutput.videoSettings = setcapSettings;
    [_session addOutput:_videoDataOutput];
    _videoConnection = [_videoDataOutput connectionWithMediaType:AVMediaTypeVideo];
    if ([_videoConnection isVideoOrientationSupported]) {
        [_videoConnection setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
    }
    [_session commitConfiguration];
}

- (void)configureAudioWithDevice:(AVCaptureDevice *)microphone {
    [_session beginConfiguration];
    _audioInputDevice = [AVCaptureDeviceInput deviceInputWithDevice:microphone error:nil];
    if ([_session canAddInput:_audioInputDevice]) {
        [_session addInput:_audioInputDevice];
    }
    [_session removeOutput:_audioDataOutput];
    _audioDataOutput = nil;
    _audioDataOutput = [[AVCaptureAudioDataOutput alloc] init];
    [_audioDataOutput setSampleBufferDelegate:self queue:_outputQueueAudio];
    [_session addOutput:_audioDataOutput];
    _audioConnection = [_audioDataOutput connectionWithMediaType:AVMediaTypeAudio];
    [_session commitConfiguration];
}
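For example, a camera-toggle action could reuse configureVideoWithDevice: while the session keeps running; a sketch, where the findCameraWithPosition: helper and the switchCamera: action name are mine, not from the original answer:
// Hypothetical helper: find a camera at the requested position.
- (AVCaptureDevice *)findCameraWithPosition:(AVCaptureDevicePosition)position {
    for (AVCaptureDevice *device in [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo]) {
        if (device.position == position) {
            return device;
        }
    }
    return [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
}

// Hypothetical toggle action: swap front/back without stopping the session.
- (IBAction)switchCamera:(id)sender {
    BOOL currentlyFront = (_videoInputDevice.device.position == AVCaptureDevicePositionFront);
    AVCaptureDevicePosition next = currentlyFront ? AVCaptureDevicePositionBack : AVCaptureDevicePositionFront;
    [self configureVideoWithDevice:[self findCameraWithPosition:next]];
}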
You can't change the captureDevice mid-session, and you can only have one capture session running at a time. You could end the current session and create a new one. There will be a slight lag (maybe a second or two, depending on your CPU load).
I wish Apple would allow multiple sessions or at least multiple devices per session... but they do not... yet.
Have you considered having multiple sessions and then processing the video files afterwards to join them into one?
I'm working on an application that uses the ZXing library to read QR codes. I have a problem with ZXingWidgetController - when the view is shown while the application is in the background/not active (e.g. the screen is locked), the image from the camera is not shown on screen - only the background is visible, and the scanner does not seem to be working.
When I call the initCapture method again, the video from the camera appears after a small delay, but in that case I need to reinitialize the scanner every time the application loses activity - this behavior is not comfortable at all.
This bug can be reproduced in almost every application that uses ZXing, so I suppose it is a ZXing bug.
The ZXing initCapture method code is:
- (void)initCapture {
#if HAS_AVFF
    AVCaptureDeviceInput *captureInput =
        [AVCaptureDeviceInput deviceInputWithDevice:
            [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo]
                                              error:nil];
    if (!captureInput)
    {
        NSLog(@"ERROR - CaptureInputNotInitialized");
    }
    AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
    captureOutput.alwaysDiscardsLateVideoFrames = YES;
    if (!captureOutput)
    {
        NSLog(@"ERROR - CaptureOutputNotInitialized");
    }
    [captureOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
    NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
    NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
    [captureOutput setVideoSettings:videoSettings];
    self.captureSession = [[[AVCaptureSession alloc] init] autorelease];
    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium; // 480x360 on a 4
    if ([self.captureSession canAddInput:captureInput])
    {
        [self.captureSession addInput:captureInput];
    }
    else
    {
        NSLog(@"ERROR - cannot add input");
    }
    if ([self.captureSession canAddOutput:captureOutput])
    {
        [self.captureSession addOutput:captureOutput];
    }
    else
    {
        NSLog(@"ERROR - cannot add output");
    }
    [captureOutput release];
    if (!self.prevLayer)
    {
        [self.prevLayer release];
    }
    self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    // NSLog(@"prev %p %@", self.prevLayer, self.prevLayer);
    self.prevLayer.frame = self.view.bounds;
    self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:self.prevLayer];
    [self.captureSession startRunning];
#endif
}
Maybe you guys know what is wrong?
I don't understand your question. If the application is in the background/not active, of course it can't work. You should make the question clearer.
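The question already notes that calling initCapture again brings the video back. If that workaround is acceptable, one way to automate it is to reinitialize when the app becomes active again; a rough sketch under manual reference counting, with the observer wiring being my own suggestion rather than anything from ZXing (it also assumes any previous session was torn down when the app resigned active):
// In the widget controller (or a subclass): re-run capture setup when the
// app becomes active again, instead of requiring the caller to do it manually.
- (void)viewDidLoad {
    [super viewDidLoad];
    [[NSNotificationCenter defaultCenter] addObserver:self
                                             selector:@selector(applicationBecameActive:)
                                                 name:UIApplicationDidBecomeActiveNotification
                                               object:nil];
}

- (void)applicationBecameActive:(NSNotification *)notification {
    // Assumption: the old session is no longer running; otherwise stop it here first.
    [self initCapture];
}

- (void)dealloc {
    [[NSNotificationCenter defaultCenter] removeObserver:self];
    [super dealloc];
}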
I am trying to create a method that will return an ALAsset for a given asset URL. (I need to upload the asset later and want to do it outside the result block.)
+ (ALAsset *)assetForPhoto:(Photo *)photo
{
    ALAssetsLibrary *library = [[[ALAssetsLibrary alloc] init] autorelease];
    __block ALAsset *assetToReturn = nil;
    NSURL *url = [NSURL URLWithString:photo.assetUrl];
    NSLog(@"assetForPhoto: %@[", url);
    [library assetForURL:url resultBlock:^(ALAsset *asset)
    {
        NSLog(@"asset: %@", asset);
        assetToReturn = asset;
        NSLog(@"asset: %@ %d", assetToReturn, [assetToReturn retainCount]);
    } failureBlock:^(NSError *error)
    {
        assetToReturn = nil;
    }];
    NSLog(@"assetForPhoto: %@]", url);
    NSLog(@"assetToReturn: %@", assetToReturn); // Invalid access exception coming here.
    return assetToReturn;
}
The problem is that assetToReturn gives an EXC_BAD_ACCESS.
Is there some problem with assigning pointers from inside the block? I've seen some examples of blocks, but they always use simple types like integers.
A few things:
You must keep the ALAssetsLibrary instance that created the ALAsset around for as long as you use the asset.
You must register an observer for ALAssetsLibraryChangedNotification; when it is received, any ALAssets you have, and any other AssetsLibrary objects, will need to be refetched, as they will no longer be valid. This can happen at any time.
You shouldn't expect -assetForURL:resultBlock:failureBlock:, or any of the AssetsLibrary methods that take a failureBlock:, to be synchronous. They may need to prompt the user for access to the library and will not always have their blocks executed immediately. It's better to put actions that need to happen on success in the success block itself.
If you absolutely must make this method synchronous in your app (which I'd advise against), you'll need to wait on a semaphore after calling assetForURL:resultBlock:failureBlock:, and optionally spin the run loop if you end up blocking the main thread.
The following implementation should work as a synchronous call in all situations, but really, you should try very hard to make your code asynchronous instead.
- (ALAsset *)assetForURL:(NSURL *)url {
    __block ALAsset *result = nil;
    __block NSError *assetError = nil;
    dispatch_semaphore_t sema = dispatch_semaphore_create(0);
    [[self assetsLibrary] assetForURL:url resultBlock:^(ALAsset *asset) {
        result = [asset retain];
        dispatch_semaphore_signal(sema);
    } failureBlock:^(NSError *error) {
        assetError = [error retain];
        dispatch_semaphore_signal(sema);
    }];
    if ([NSThread isMainThread]) {
        while (!result && !assetError) {
            [[NSRunLoop currentRunLoop] runMode:NSDefaultRunLoopMode beforeDate:[NSDate distantFuture]];
        }
    }
    else {
        dispatch_semaphore_wait(sema, DISPATCH_TIME_FOREVER);
    }
    dispatch_release(sema);
    [assetError release];
    return [result autorelease];
}
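If you can restructure the caller, the asynchronous version recommended above is much simpler; a sketch under the same assumptions (the completion-block signature is mine, not part of the original code):
// Hypothetical asynchronous variant: the caller passes a completion block
// instead of expecting a return value, so no run-loop spinning is needed.
- (void)assetForURL:(NSURL *)url completion:(void (^)(ALAsset *asset, NSError *error))completion {
    [[self assetsLibrary] assetForURL:url resultBlock:^(ALAsset *asset) {
        completion(asset, nil);
    } failureBlock:^(NSError *error) {
        completion(nil, error);
    }];
}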
You should retain and autorelease the asset:
// ...
assetToReturn = [asset retain];
// ...
return [assetToReturn autorelease];