How to stop auto focus in the UIImagePickerController camera in the iPhone SDK?
Set showsCameraControls to NO, like below:
picker.showsCameraControls = NO;
Also see another answer at this link: iphone-ios-4-3-camera-focus-square-removeable-programatically
You may want to lock the focus to disable auto-focus. Here is some sample code:
NSArray *devices = [AVCaptureDevice devices];
NSError *error;
for (AVCaptureDevice *device in devices) {
    if (([device hasMediaType:AVMediaTypeVideo]) &&
        ([device position] == AVCaptureDevicePositionBack)) {
        [device lockForConfiguration:&error];
        if ([device isFocusModeSupported:AVCaptureFocusModeLocked]) {
            device.focusMode = AVCaptureFocusModeLocked;
            NSLog(@"Focus locked");
        }
        [device unlockForConfiguration];
    }
}
Also, setting the showsCameraControls property of your picker controller to NO should remove the focus square (it did pre-4.3, and I don't think anything has changed), but the downside is that you'll need to provide your own controls (to take photos, etc.).
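For example, here is a minimal sketch of hiding the default controls and supplying your own shutter button via cameraOverlayView (the button layout, the shutterTapped selector, and the self.picker property are assumptions for illustration, not from the original answer):

picker.showsCameraControls = NO; // also removes the focus square

// Provide a custom overlay with our own shutter button.
UIView *overlay = [[UIView alloc] initWithFrame:picker.view.bounds];
UIButton *shutterButton = [UIButton buttonWithType:UIButtonTypeCustom];
shutterButton.frame = CGRectMake(120.0, 420.0, 80.0, 44.0);
[shutterButton setTitle:@"Shoot" forState:UIControlStateNormal];
[shutterButton addTarget:self
                  action:@selector(shutterTapped)
        forControlEvents:UIControlEventTouchUpInside];
[overlay addSubview:shutterButton];
picker.cameraOverlayView = overlay;

- (void)shutterTapped {
    // self.picker is assumed to be a property holding the presented picker
    [self.picker takePicture];
}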
I use the code below to access contacts in my iOS application. It was working fine on iOS < 10, but with Xcode 8 and iOS 10 it crashes:
- (void)btcContacts_tap {
    ABAddressBookRef addressBook = ABAddressBookCreateWithOptions(NULL, NULL);
    ABAddressBookRequestAccessWithCompletion(addressBook, ^(bool granted, CFErrorRef error) {
        if (granted) {
            _addressBookController = [[ABPeoplePickerNavigationController alloc] init];
            [[_addressBookController navigationBar] setBarStyle:UIBarStyleBlack];
            _addressBookController.delegate = self;
            [_addressBookController setPredicateForEnablingPerson:[NSPredicate predicateWithFormat:@"%K.@count > 0", ABPersonPhoneNumbersProperty]];
            [_addressBookController setPeoplePickerDelegate:self];
            [self presentViewController:_addressBookController animated:YES completion:nil];
        }
        else {
            dispatch_async(dispatch_get_main_queue(), ^{
                [self showMessage:NSLocalizedStringFromTable(@"PLEASE_GRANT_CONTACTS", LIApplicationLanguage(), nil) andAdvertise:@"" andService:nil andTransactionState:kTTTransactionStateInfo];
            });
        }
    });
}
I have set NSSetUncaughtExceptionHandler to a method that logs the crash report, but even the exception handler is not being called...
Has anyone else faced this problem too?
iOS 10:
You need to put the NSContactsUsageDescription key in your Info.plist. Like this:
<key>NSContactsUsageDescription</key>
<string>$(PRODUCT_NAME) uses contacts</string>
See all usage descriptions here.
Use CNContactStore; ABAddressBookRequestAccessWithCompletion is deprecated.
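For reference, here is a minimal sketch of the Contacts/ContactsUI replacement, assuming the picker is presented from the same view controller and that it adopts CNContactPickerDelegate (variable names are illustrative):

#import <Contacts/Contacts.h>
#import <ContactsUI/ContactsUI.h>

CNContactStore *store = [[CNContactStore alloc] init];
[store requestAccessForEntityType:CNEntityTypeContacts completionHandler:^(BOOL granted, NSError *error) {
    dispatch_async(dispatch_get_main_queue(), ^{
        if (granted) {
            CNContactPickerViewController *contactPicker = [[CNContactPickerViewController alloc] init];
            contactPicker.delegate = self; // adopt CNContactPickerDelegate
            // Only enable contacts that have at least one phone number.
            contactPicker.predicateForEnablingContact = [NSPredicate predicateWithFormat:@"phoneNumbers.@count > 0"];
            [self presentViewController:contactPicker animated:YES completion:nil];
        } else {
            // Show your "please grant contacts access" message here.
        }
    });
}];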
According to https://developer.apple.com/library/ios/documentation/NetworkingInternet/Conceptual/GameKit_Guide/LeaderBoards/LeaderBoards.html
Reporting a score to Game Center in iOS 7 should be done using
[GKLeaderboard reportScores:scores withCompletionHandler:^(NSError *error) {
    // Do something interesting here.
}];
However, I could not find any reference to this method in GKLeaderboard.
The method does not exist here:
https://developer.apple.com/library/ios/documentation/GameKit/Reference/GKLeaderboard_Ref/Reference/Reference.html
GKLeaderboard.h does not contain a reportScores method also.
The former way of reporting a score, using GKScore's reportScoreWithCompletionHandler: method, has been deprecated, so I am reluctant to use that.
Does anyone know the correct way to report a score to Game Center in iOS 7?
I can confirm that the reportScores:withCompletionHandler: method does work; I'm using it in one of my apps. It's located in the header file GKScore.h. This is how I'm using it:
- (void)reportHighScore:(NSInteger)highScore {
    if ([GKLocalPlayer localPlayer].isAuthenticated) {
        GKScore *score = [[GKScore alloc] initWithLeaderboardIdentifier:MY_LEADERBOARD_ID];
        score.value = highScore;
        [GKScore reportScores:@[score] withCompletionHandler:^(NSError *error) {
            if (error) {
                // handle error
            }
        }];
    }
}
I have an NSButton in my preferences to interact with adding the application to the Login Items. If adding the login item fails, I want to uncheck the box so the user doesn't get the false sense that it was added to the Login Items. However, after doing this, when I click the checkbox again, the binding is not triggered.
- (void)addLoginItem:(BOOL)status
{
    NSURL *url = [[[NSBundle mainBundle] bundleURL] URLByAppendingPathComponent:
                  @"Contents/Library/LoginItems/HelperApp.app"];
    // Registering helper app
    if (LSRegisterURL((__bridge CFURLRef)url, true) != noErr) {
        NSLog(@"LSRegisterURL failed!");
    }
    if (!SMLoginItemSetEnabled((__bridge CFStringRef)[[NSBundle mainBundle] bundleIdentifier], (status) ? true : false)) {
        NSLog(@"SMLoginItemSetEnabled failed!");
        [self willChangeValueForKey:@"startAtLogin"];
        [self.startAtLogin setValue:[NSNumber numberWithBool:[self automaticStartup]] forKey:@"state"];
        [self didChangeValueForKey:@"startAtLogin"];
    }
}

- (void)setAutomaticStartup:(BOOL)state
{
    NSLog(@"Set automatic startup: %d", state);
    if ([self respondsToSelector:@selector(addLoginItem:)]) {
        [self addLoginItem:state];
    }
}

- (BOOL)automaticStartup
{
    BOOL isEnabled = NO;
    // the easy and sane method (SMJobCopyDictionary) can pose problems when sandboxed. -_-
    CFArrayRef cfJobDicts = SMCopyAllJobDictionaries(kSMDomainUserLaunchd);
    NSArray *jobDicts = CFBridgingRelease(cfJobDicts);
    if (jobDicts && [jobDicts count] > 0) {
        for (NSDictionary *job in jobDicts) {
            if ([[[NSBundle mainBundle] bundleIdentifier] isEqualToString:[job objectForKey:@"Label"]]) {
                isEnabled = [[job objectForKey:@"OnDemand"] boolValue];
                break;
            }
        }
    }
    NSLog(@"Is Enabled: %d", isEnabled);
    // if (isEnabled != _enabled) {
    [self willChangeValueForKey:@"startupEnabled"];
    startupEnabled = isEnabled;
    [self didChangeValueForKey:@"startupEnabled"];
    // }
    return isEnabled;
}
I have the data binding for the checkbox bound to self.automaticStartup. If I remove the line [self.startAtLogin setValue:[NSNumber numberWithBool:[self automaticStartup]] forKey:@"state"]; then the binding works fine, but the box doesn't uncheck if adding the item fails.
How can I change this binding value programmatically so that every other binding event is not ignored?
From your explanation, your bound value is automaticStartup, but you are sending willChangeValueForKey: for startAtLogin. In order for bindings to work correctly, you need to send change notifications for the bound key at some point. However, since you are in the midst of setAutomaticStartup: at the time, it's not really safe to do that there.
In this case, I would not use bindings to perform the change itself; I would consider the old-style IBAction mechanism and then set the checkbox value manually through an IBOutlet once you can confirm the status.
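A minimal sketch of that approach, assuming an outlet named startAtLoginCheckbox and an action wired up in Interface Builder (both names are hypothetical):

@property (weak) IBOutlet NSButton *startAtLoginCheckbox; // hypothetical outlet to the checkbox

- (IBAction)toggleStartAtLogin:(NSButton *)sender {
    BOOL requested = (sender.state == NSOnState);
    if (!SMLoginItemSetEnabled((__bridge CFStringRef)[[NSBundle mainBundle] bundleIdentifier],
                               requested ? true : false)) {
        NSLog(@"SMLoginItemSetEnabled failed!");
        // Revert the checkbox so it reflects the real state; no bindings involved.
        [self.startAtLoginCheckbox setState:requested ? NSOffState : NSOnState];
    }
}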
I created a 'mirror'-like view in my app that uses the front camera to show a 'mirror' to the user. The problem I'm having is that I have not touched this code in weeks (and it did work then) but now I'm testing it again and it's not working. The code is the same as before, there are no errors coming up, and the view in the storyboard is exactly the same as before. I have no idea what is going on, so I was hoping that this website would help.
Here is my code:
if ([UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceFront]) {
    // If the front camera is available, show the camera
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    AVCaptureOutput *output = [[AVCaptureStillImageOutput alloc] init];
    [session addOutput:output];

    // Setup camera input
    NSArray *possibleDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    // You could check for front or back camera here, but for simplicity just grab the first device
    AVCaptureDevice *device = [possibleDevices objectAtIndex:1];
    NSError *error = nil;

    // create an input and add it to the session
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error]; // Handle errors

    // set the session preset
    session.sessionPreset = AVCaptureSessionPresetHigh; // Or other preset supported by the input device
    [session addInput:input];

    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    // Now you can add this layer to a view of your view controller
    [cameraView.layer addSublayer:previewLayer];
    previewLayer.frame = self.cameraView.bounds;

    [session startRunning];
    if ([session isRunning]) {
        NSLog(@"The session is running");
    }
    if ([session isInterrupted]) {
        NSLog(@"The session has been interrupted");
    }
} else {
    // Tell the user they don't have a front facing camera
}
Thank you in advance.
Not sure if this is the problem, but there is an inconsistency between your code and the comments. The inconsistency is with the following line of code:
AVCaptureDevice *device = [possibleDevices objectAtIndex:1];
The comment above it says: "...for simplicity just grab the first device". However, the code is grabbing the second device, since NSArray is indexed from 0. I believe the comment should be corrected, as I think you are assuming the front camera will be the second device in the array.
If you are working on the assumption that the first device is the back camera and the second is the front camera, that is a dangerous assumption. It would be much safer and more future-proof to check the list of possibleDevices for the device that actually is the front camera.
The following code will enumerate the list of possibleDevices and create the input using the front camera:
// Find the front camera and create an input and add it to the session
AVCaptureDeviceInput *input = nil;
for (AVCaptureDevice *device in possibleDevices) {
    if ([device position] == AVCaptureDevicePositionFront) {
        NSError *error = nil;
        input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                      error:&error]; // Handle errors
        break;
    }
}
Update: I have just cut and pasted the code exactly as it is in the question into a simple project and it is working fine for me. I am seeing the video from the front camera. You should probably look elsewhere for the issue. First, I would be inclined to check the cameraView and associated layers.
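For instance, one quick sanity check along those lines (this assumes you keep the preview layer in a property; the names mirror the question but are otherwise assumptions):

- (void)viewDidLayoutSubviews {
    [super viewDidLayoutSubviews];
    // A zero-sized frame is a common reason the preview appears blank even though
    // the session is running; keep the layer sized to the view after layout.
    NSLog(@"cameraView bounds: %@", NSStringFromCGRect(self.cameraView.bounds));
    self.previewLayer.frame = self.cameraView.bounds;
}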
I wrote a simple iOS program to get the number of photo images saved in the Camera Roll by using the Assets Library framework provided in SDK 4.2.
The program worked as I expected when I ran it on the iPhone simulator.
But it didn't retrieve any images when I ran it on a real iPhone device (an iPhone 3GS with iOS 4.2.1).
This problem looks the same as the one discussed in the article below:
Assets Library Framework not working correctly on 4.0 and 4.2
So I added the dispatch_async(dispatch_get_main_queue()... call as below, but I couldn't solve the problem.
- (void)viewDidLoad {
    [super viewDidLoad];

    // Prepare array to hold the images retrieved by the Assets Library.
    NSMutableArray *assets = [[NSMutableArray array] retain];

    void (^assetEnumerator)(ALAsset *, NSUInteger, BOOL *) = ^(ALAsset *asset, NSUInteger index, BOOL *stop) {
        if (asset != NULL) {
            [assets addObject:asset];
            dispatch_async(dispatch_get_main_queue(), ^{
                // Show the number of retrieved images saved in the Camera Roll.
                // [assets count] always returns 0 when I run this program on an iPhone device,
                // although it worked OK on the simulator.
                NSLog(@"%i", [assets count]);
            });
        }
    };

    void (^assetGroupEnumerator)(ALAssetsGroup *, BOOL *) = ^(ALAssetsGroup *group, BOOL *stop) {
        if (group != nil) {
            [group enumerateAssetsUsingBlock:assetEnumerator];
        }
    };

    // Create an instance of the Assets Library.
    ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];

    // Retrieve the images saved in the Camera Roll.
    [library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
                           usingBlock:assetGroupEnumerator
                         failureBlock:^(NSError *error) {
                             NSLog(@"Failed.");
                         }];
}
Could you please tell me if you have any ideas to solve it?
I have one update:
To get the error code, I modified the failureBlock of enumerateGroupsWithTypes as below and then reproduced the symptom again.
Then, the app returned the error code -3311 (ALAssetsLibraryAccessUserDeniedError).
However, I did not do anything to deny access during my reproduction test.
What is the possible cause of error -3311?
[library enumerateGroupsWithTypes:ALAssetsGroupSavedPhotos
                       usingBlock:assetGroupEnumerator
                     failureBlock:^(NSError *error) {
                         NSLog(@"Failed");
                         resultMsg = [NSString stringWithFormat:@"Failed: code=%d", [error code]];
                     }];
It is strange that location services should be involved when accessing saved photos. Maybe it has to do with the geo-tagging information on the photos. Anyway, Apple says that enabling Location Services is required when using enumerateGroupsWithTypes:usingBlock:failureBlock:
Special Considerations
"This method will fail with error ALAssetsLibraryAccessGloballyDeniedError if the user has not enabled Location Services (in Settings > General)."
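If you want to surface this to the user before enumerating, a small sketch (this assumes the project links CoreLocation; the check is illustrative, not something the Assets Library API requires you to call):

#import <CoreLocation/CoreLocation.h>

// Warn up front if Location Services are off, since ALAssetsLibrary enumeration
// on these iOS versions fails with ALAssetsLibraryAccessGloballyDeniedError in that case.
if (![CLLocationManager locationServicesEnabled]) {
    NSLog(@"Location Services are disabled; Assets Library enumeration will fail.");
}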