I have a list of videoInput devices.
First I add the default input device for recording, set up the compression, the output, and everything else needed. After that I try to change the input device, but somehow the view stops working and shows only a black screen. I'm trying to change it when I'm not recording, only while showing the input in a view.
Here is the code for changing:
-(void)changeVideoInput:(QTCaptureDevice *)videoDevice
{
    BOOL success = NO;
    NSError *error;

    [mCaptureSession stopRunning];
    [mCaptureSession removeInput:mCaptureVideoDeviceInput]; // current input
    [[mCaptureVideoDeviceInput device] close];

    success = [videoDevice open:&error];
    mCaptureVideoDeviceInput2 = [[QTCaptureDeviceInput alloc] initWithDevice:videoDevice]; // new input
    success = [mCaptureSession addInput:mCaptureVideoDeviceInput2 error:&error];

    [mCaptureSession startRunning];
}
I finally figured out that the code works. The problem is caused by opening the default video device (its name is "Blackmagic"). Do not open that device; it is a universal device.
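As an illustration, here is a minimal sketch of skipping that device when picking an input; identifying it by a localized display name containing "Blackmagic" is an assumption, and matching on -uniqueID would work just as well.

// Minimal sketch: choose the first video input that is NOT the universal
// "Blackmagic" device. Matching by display name is an assumption.
QTCaptureDevice *chosenDevice = nil;
for (QTCaptureDevice *device in [QTCaptureDevice inputDevicesWithMediaType:QTMediaTypeVideo]) {
    if ([[device localizedDisplayName] rangeOfString:@"Blackmagic"].location == NSNotFound) {
        chosenDevice = device;
        break;
    }
}
if (chosenDevice != nil) {
    [self changeVideoInput:chosenDevice];
}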
I'm trying to use MPMoviePlayerController to create an audio player without having to implement my own scrubbing and play/pause button.
I have code that records audio into NSData and saves it to disk. The method below tests audio by playing it with the MPMoviePlayerController.
The code below works (plays audio, it is heard) if I execute the method immediately after recording is done. It also works if I press home button, then return to the app.
However, when I kill the app and restart, or hit "Run" from Xcode, I do not hear any audio. Here are the symptoms:
The code below confirms that the NSData exists on disk and has nonzero length
The path to NSData is the same both times
NSData is kAudioFormatMPEG4AAC format
The media player displays correct duration
Media player's scrubber moves from start to finish
Speaker volume is set to maximum in both cases.
No audio is heard after the app was killed and restarted.
What could be causing my MPMoviePlayerController to not provide any audio upon app relaunch? I'm writing the audio length into the file's extended attributes, could this be messing with the "Playability" of the file?
-(void)testPlayback:(AudioNote *)note
{
    NSString *path = [note filepath];
    if (note == nil || path.length == 0) {
        return;
    }
    NSURL *url = [NSURL fileURLWithPath:path];

    // File exists, and data exists on disk in both cases
    NSString *exists = ([note fileExists] ? @"YES" : @"NO");
    NSUInteger length = note.fileData.length;
    DLog(@"Playing note (exists: %@, data length: %lu), duration: %.2f", exists, (unsigned long)length, note.durationSeconds);

    self.moviePlayer = [[MPMoviePlayerController alloc] initWithContentURL:url];
    self.moviePlayer.controlStyle = MPMovieControlStyleDefault;
    [self.view addSubview:self.moviePlayer.view];
    [self.moviePlayer prepareToPlay];
    [self.moviePlayer play];
}
I am using the iPhone/iPad camera to get a video stream and doing recognition on the stream, but lighting changes have a negative impact on its robustness. I have tested different settings in different light and can get it to work, but what I need is a way to adjust the settings at run time.
I can calculate a simple brightness check on each frame, but the camera adjusts and throws my results off. I can watch for sharp changes and run checks then, but gradual changes would throw my results off as well.
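For reference, a "simple brightness check" of that kind could look like the minimal sketch below: it averages the Y (luma) plane of each sample buffer, assuming the video output is configured for the bi-planar 420 YpCbCr pixel format used in the answer further down; the sampling stride of 8 is just an illustrative choice.

#import <CoreMedia/CoreMedia.h>
#import <CoreVideo/CoreVideo.h>

// Minimal sketch of a per-frame average-luma check. Assumes the buffer is
// kCVPixelFormatType_420YpCbCr8BiPlanarFullRange, where plane 0 is the Y plane.
static float AverageLuma(CMSampleBufferRef sampleBuffer)
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

    uint8_t *yPlane = (uint8_t *)CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0);
    size_t width = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0);
    size_t height = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0);

    uint64_t sum = 0;
    uint64_t count = 0;
    for (size_t y = 0; y < height; y += 8) {      // sample every 8th row
        uint8_t *row = yPlane + y * bytesPerRow;
        for (size_t x = 0; x < width; x += 8) {   // and every 8th column
            sum += row[x];
            count++;
        }
    }

    CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
    return (count > 0) ? (float)sum / (float)count : 0.0f;
}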
Ideally I'd like to access the camera/EXIF data for the stream and see what it registers the unfiltered brightness as. Is there a way to do this?
(I am targeting devices on iOS 5 and above.)
Thank you
This is available in iOS 4.0 and above: you can get EXIF information from a CMSampleBufferRef.
//Import ImageIO & include framework in your project.
#import <ImageIO/CGImageProperties.h>
In your sample buffer delegate, toll-free bridging gets you an NSDictionary of results from Core Media's CMGetAttachment:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    NSDictionary *dict = (__bridge NSDictionary *)CMGetAttachment(sampleBuffer, kCGImagePropertyExifDictionary, NULL);
Complete code, as used in my own app:
- (void)setupAVCapture {
    //-- Setup capture session.
    _session = [[AVCaptureSession alloc] init];
    [_session beginConfiguration];

    //-- Set preset session size.
    [_session setSessionPreset:AVCaptureSessionPreset1920x1080];

    //-- Create a video device and an input from that device. Add the input to the capture session.
    AVCaptureDevice *videoDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    if (videoDevice == nil)
        assert(0);

    //-- Add the device to the session.
    NSError *error;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:&error];
    if (error)
        assert(0);
    [_session addInput:input];

    //-- Create the output for the capture session.
    AVCaptureVideoDataOutput *dataOutput = [[AVCaptureVideoDataOutput alloc] init];
    [dataOutput setAlwaysDiscardsLateVideoFrames:YES]; // Probably want to set this to NO when recording

    //-- Set to YUV420.
    [dataOutput setVideoSettings:[NSDictionary dictionaryWithObject:[NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarFullRange]
                                                             forKey:(id)kCVPixelBufferPixelFormatTypeKey]]; // Necessary for manual preview

    // Set dispatch to be on the main thread so OpenGL can do things with the data
    [dataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

    [_session addOutput:dataOutput];
    [_session commitConfiguration];
    [_session startRunning];
}
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CFDictionaryRef metadataDict = CMCopyDictionaryOfAttachments(NULL,
        sampleBuffer, kCMAttachmentMode_ShouldPropagate);
    NSDictionary *metadata = [[NSMutableDictionary alloc]
        initWithDictionary:(__bridge NSDictionary *)metadataDict];
    CFRelease(metadataDict);

    NSDictionary *exifMetadata = [[metadata
        objectForKey:(NSString *)kCGImagePropertyExifDictionary] mutableCopy];
    self.autoBrightness = [[exifMetadata
        objectForKey:(NSString *)kCGImagePropertyExifBrightnessValue] floatValue];

    float oldMin = -4.639957; // dark
    float oldMax = 4.639957;  // light
    if (self.autoBrightness > oldMax) oldMax = self.autoBrightness; // adjust oldMax if brighter than the expected oldMax

    self.lumaThreshold = ((self.autoBrightness - oldMin) * ((3.0 - 1.0) / (oldMax - oldMin))) + 1.0;

    NSLog(@"brightnessValue %f", self.autoBrightness);
    NSLog(@"lumaThreshold %f", self.lumaThreshold);
}
The lumaThreshold variable is sent as a uniform variable to my fragment shader, which multiplies the Y sampler texture to find the ideal luminosity based on the brightness of the environment. Right now, it uses the back camera; I'll probably switch to the front camera, since I'm only changing the "brightness" of the screen to adjust for indoor/outdoor viewing, and the user's eyes are on the front of the camera (and not the back).
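For context, a minimal sketch of feeding that uniform to the shader each frame is below; the program handle and the uniform name "lumaThreshold" are assumptions about how the OpenGL ES 2.0 pipeline is wired up, not part of the original code.

#import <OpenGLES/ES2/gl.h>

// Minimal sketch: pass the computed threshold to the fragment shader each frame.
// "program" and the uniform name "lumaThreshold" are hypothetical.
glUseProgram(program);
GLint lumaThresholdUniform = glGetUniformLocation(program, "lumaThreshold");
glUniform1f(lumaThresholdUniform, self.lumaThreshold);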
I created a 'mirror'-like view in my app that uses the front camera to show a 'mirror' to the user. The problem is that I haven't touched this code in weeks (and it worked then), but now I'm testing it again and it's not working. The code is the same as before, no errors come up, and the view in the storyboard is exactly the same as before. I have no idea what is going on, so I was hoping this website could help.
Here is my code:
if ([UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceFront]) {
    // If the front camera is available, show the camera
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    AVCaptureOutput *output = [[AVCaptureStillImageOutput alloc] init];
    [session addOutput:output];

    // Setup camera input
    NSArray *possibleDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    // You could check for front or back camera here, but for simplicity just grab the first device
    AVCaptureDevice *device = [possibleDevices objectAtIndex:1];
    NSError *error = nil;
    // Create an input and add it to the session
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error]; // Handle errors

    // Set the session preset
    session.sessionPreset = AVCaptureSessionPresetHigh; // Or another preset supported by the input device
    [session addInput:input];

    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    // Now you can add this layer to a view of your view controller
    [cameraView.layer addSublayer:previewLayer];
    previewLayer.frame = self.cameraView.bounds;
    [session startRunning];

    if ([session isRunning]) {
        NSLog(@"The session is running");
    }
    if ([session isInterrupted]) {
        NSLog(@"The session has been interrupted");
    }
} else {
    // Tell the user they don't have a front facing camera
}
Thank you in advance.
Not sure if this is the problem, but there is an inconsistency between your code and your comments. The inconsistency is in the following line of code:
AVCaptureDevice *device = [possibleDevices objectAtIndex:1];
The comment above it says "...for simplicity just grab the first device", yet the code grabs the second device, since NSArray is indexed from 0. I believe the comment should be corrected, as I think you are assuming the front camera will be the second device in the array.
If you are working on the assumption that the first device is the back camera and the second is the front camera, that is a dangerous assumption. It would be much safer and more future-proof to search the list of possibleDevices for the device that is the front camera.
The following code will enumerate the list of possibleDevices and create input using the front camera.
// Find the front camera, create an input from it, and add it to the session
AVCaptureDeviceInput *input = nil;
for (AVCaptureDevice *device in possibleDevices) {
    if ([device position] == AVCaptureDevicePositionFront) {
        NSError *error = nil;
        input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error]; // Handle errors
        break;
    }
}
Update: I have just cut and pasted the code exactly as it is in the question into a simple project and it is working fine for me. I am seeing the video from the front camera. You should probably look elsewhere for the issue. First, I would be inclined to check the cameraView and associated layers.
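As a hedged starting point for that check, the sketch below simply logs the geometry of cameraView and its sublayers; a zero-sized frame (for example, a preview layer framed before Auto Layout has sized cameraView) would produce exactly this blank-view symptom. The property names mirror the question's code.

// Minimal debugging sketch: verify that the view and the preview layer
// have non-zero frames while the session is running.
NSLog(@"cameraView frame: %@", NSStringFromCGRect(self.cameraView.frame));
for (CALayer *layer in self.cameraView.layer.sublayers) {
    NSLog(@"sublayer %@ frame: %@", layer, NSStringFromCGRect(layer.frame));
}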
I am using ELC Image Picker in my project, and I'm running into an issue:
When I select around 20 images the picker works fine, but when I select around 32 images the app crashes before the controller is even dismissed, and I get the error:
Program received signal: “0”. Data Formatters temporarily
unavailable, will re-try after a 'continue'. (Unknown error loading
shared library "/Developer/usr/lib/libXcodeDebuggerSupport.dylib")
I am also getting:
Received memory warning. Level=1
NOTE: the first time I selected 32 images it worked fine; when I selected the same number of images again, it crashed.
I've also tried the example ELCImagePickerController project from GitHub.
Can anyone tell me how to overcome this?
From the error you can see that it's a memory issue.
So you have two options:
Set a limit on the number of images that can be chosen
In the background, save the images to a temp folder
OR
Customize the ELC picker code so that when a person selects an image, it keeps only the image's path (its asset URL) rather than the image content,
and when they are done, run a loop to load those images into your app.
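A minimal sketch of that customization is below. It assumes the common ELCImagePickerController delegate callback that hands back an array of info dictionaries; the selectedAssetURLs property is hypothetical, and the exact delegate signature may differ in your version of the picker.

// Minimal sketch: keep only the asset URLs at selection time and defer
// loading the image bytes. The delegate method name and the
// selectedAssetURLs property are assumptions.
- (void)elcImagePickerController:(ELCImagePickerController *)picker
   didFinishPickingMediaWithInfo:(NSArray *)info
{
    NSMutableArray *assetURLs = [NSMutableArray arrayWithCapacity:info.count];
    for (NSDictionary *item in info) {
        NSURL *url = [item objectForKey:UIImagePickerControllerReferenceURL];
        if (url != nil) {
            [assetURLs addObject:url]; // store the URL only, not the UIImage
        }
    }
    self.selectedAssetURLs = assetURLs;
    [self dismissViewControllerAnimated:YES completion:nil];
    // Later, loop over selectedAssetURLs and load each asset one at a time,
    // for example with the ALAssetsLibrary code in the next answer.
}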
@SteveGear, the following code will solve your problem. Just provide the UIImagePickerControllerReferenceURL and you will get the NSData. It's been a long time, but it may still help others.
ALAssetsLibrary *assetLibrary = [[ALAssetsLibrary alloc] init];
NSURL *assetURL = [infoObject objectForKey:UIImagePickerControllerReferenceURL];
__block NSData *assetData;
[assetLibrary assetForURL:assetURL resultBlock:^(ALAsset *asset) // substitute assetURL with your URL
{
    ALAssetRepresentation *rep = [asset defaultRepresentation];
    Byte *buffer = (Byte *)malloc((long)rep.size);
    NSUInteger buffered = [rep getBytes:buffer fromOffset:0 length:(NSUInteger)rep.size error:nil];
    assetData = [NSData dataWithBytesNoCopy:buffer length:buffered freeWhenDone:YES]; // this is the NSData you need
    //[assetData writeToFile:somePath atomically:YES]; // Uncomment this if you want to store the data as a file (somePath is yours to define).
}
failureBlock:^(NSError *err) {
    NSLog(@"Error: %@", [err localizedDescription]);
}];
Here assetData is what you need.
I'm writing an iOS app that uses an AudioQueue for recording. I create an input queue configured to get linear PCM, start this queue, and everything works as expected.
To manage interruptions, I implemented the delegate methods of AVAudioSession to catch the beginning and the end of an interruption. The endInterruptionWithFlags: method looks like the following:
- (void)endInterruptionWithFlags:(NSUInteger)flags
{
    if (flags == AVAudioSessionInterruptionFlags_ShouldResume && audioQueue != 0) {
        NSLog(@"Current audio session - category: '%@' mode: '%@'",
              [[AVAudioSession sharedInstance] category],
              [[AVAudioSession sharedInstance] mode]);

        NSError *error = nil;
        OSStatus errorStatus;
        if ((errorStatus = AudioSessionSetActive(true)) != noErr) {
            error = [self errorForAudioSessionServiceWithOSStatus:errorStatus];
            NSLog(@"Could not reactivate the audio session: %@",
                  [error localizedDescription]);
        } else {
            if ((errorStatus = AudioQueueStart(audioQueue, NULL)) != noErr) {
                error = [self errorForAudioQueueServiceWithOSStatus:errorStatus];
                NSLog(@"Could not restart the audio queue: %@",
                      [error localizedDescription]);
            }
        }
    }
    // ...
}
If the app gets interrupted while it is in the foreground, everything works correctly. The problem appears if the interruption happens in the background. Activating the audio session then results in the error !cat:
The specified audio session category cannot be used for the attempted audio operation. For example, you attempted to play or record audio with the audio session category set to kAudioSessionCategory_AudioProcessing.
Starting the queue without activating the session results in the error code: -12985
At that point the category is set to AVAudioSessionCategoryPlayAndRecord and the mode is AVAudioSessionModeDefault.
I couldn't find any documentation for this error message, nor any indication of whether it is possible to restart an input audio queue in the background.
Yes, it is possible, but to reactivate the session in the background, the audio session either has to set the audio session property kAudioSessionProperty_OverrideCategoryMixWithOthers:
OSStatus propertySetError = 0;
UInt32 allowMixing = true;
propertySetError = AudioSessionSetProperty(
    kAudioSessionProperty_OverrideCategoryMixWithOthers,
    sizeof(allowMixing),
    &allowMixing
);
or the app has to receive remote control command events:
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
[self becomeFirstResponder];
At present there is no way to reactivate it if you are in the background.
Have you made your app support backgrounding in the Info.plist? I'm not sure whether recording is possible in the background, but you probably need to add "Required background modes" and then the value "App plays audio" in that array.
Update: I just checked, and recording in the background is possible.
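For reference, a minimal sketch of the raw Info.plist entry is below; "Required background modes" corresponds to the UIBackgroundModes key, and "App plays audio" to the audio value (recording also falls under the audio mode).

<!-- Info.plist: raw form of "Required background modes" / "App plays audio" -->
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>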