I am developing an app for our local business. I already have the live camera preview in a UIImageView; now I need to know how to read QR codes from it and display the content (0000-KKP0-2013) in a label.
So basically I need a QR code scanner that reads a QR code and saves the content in a string. I already tried ZXing ("Zebra Crossing"), but it is not compatible with iOS 6 and it won't work. Is there an easy way to get the QR code content into a string?
Thank you!
This is the code I am using in my .m file:
#import "ZBarSDK.h"
#interface ViewController ()
#end
#implementation ViewController
#synthesize vImagePreview;
- (void)viewDidUnload
{
[super viewDidUnload];
vImagePreview = nil;
}
- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];

    //----- SHOW LIVE CAMERA PREVIEW -----
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    session.sessionPreset = AVCaptureSessionPreset352x288;

    /*CALayer *viewLayer = self.vImagePreview.layer;
    NSLog(@"viewLayer = %@", viewLayer);*/

    AVCaptureVideoPreviewLayer *captureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    captureVideoPreviewLayer.frame = self.vImagePreview.bounds;
    [self.vImagePreview.layer addSublayer:captureVideoPreviewLayer];

    AVCaptureDevice *device = [self frontCamera];
    NSError *error = nil;
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (!input) {
        // Handle the error appropriately.
        NSLog(@"ERROR: trying to open camera: %@", error);
        UIAlertView *alert = [[UIAlertView alloc]
                initWithTitle:@"QReader"
                      message:[NSString stringWithFormat:@"ERROR: Trying to open the camera failed [%@]", error]
                     delegate:self
            cancelButtonTitle:@"OK"
            otherButtonTitles:nil];
        alert.tag = 1;
        [alert show];
    }
    [session addInput:input];
    [session startRunning];
}
- (AVCaptureDevice *)frontCamera {
    NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (AVCaptureDevice *device in devices) {
        if ([device position] == AVCaptureDevicePositionFront) {
            return device;
        }
    }
    return nil;
}
Now I need to know how to read the QR code from the vImagePreview with the ZBarSDK. And I can't use a UIPickerView.
Try ZBar: http://zbar.sourceforge.net/iphone/sdkdoc/install.html
We are using it successfully in our application, which supports iOS 4 up to iOS 6.1.
In my case I use ZBarReaderView to show a camera preview; it automatically detects and returns the scanned code.
You'll need:

#import "ZBarSDK.h"

ZBarReaderView *readerView;

Add <ZBarReaderViewDelegate> to your view controller's interface, and then:

// enable the QR symbology (to:1 enables it; to:0 would disable it)
[readerView.scanner setSymbology:ZBAR_QRCODE config:ZBAR_CFG_ENABLE to:1];
readerView.readerDelegate = self;
[readerView start];
- (void)readerView:(ZBarReaderView *)view didReadSymbols:(ZBarSymbolSet *)syms fromImage:(UIImage *)img
{
    for (ZBarSymbol *sym in syms) {
        NSLog(@"Did read symbols: %@", sym.data);
    }
}
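To show how those pieces might fit together (and to get the decoded string into a label, as asked above), here is a minimal sketch of a single view controller; the class name, the resultLabel outlet, and the 200x200 frame are assumptions for illustration:

#import "ZBarSDK.h"

@interface ScannerViewController : UIViewController <ZBarReaderViewDelegate>
@property (nonatomic, strong) ZBarReaderView *readerView;
@property (nonatomic, weak) IBOutlet UILabel *resultLabel; // hypothetical label for the decoded text
@end

@implementation ScannerViewController

- (void)viewDidLoad
{
    [super viewDidLoad];
    self.readerView = [[ZBarReaderView alloc] initWithFrame:CGRectMake(0, 0, 200, 200)];
    self.readerView.readerDelegate = self;
    [self.readerView.scanner setSymbology:ZBAR_QRCODE config:ZBAR_CFG_ENABLE to:1];
    [self.view addSubview:self.readerView];
    [self.readerView start];
}

- (void)readerView:(ZBarReaderView *)view didReadSymbols:(ZBarSymbolSet *)syms fromImage:(UIImage *)img
{
    for (ZBarSymbol *sym in syms) {
        // sym.data holds the decoded string, e.g. "0000-KKP0-2013"
        self.resultLabel.text = sym.data;
        break;
    }
}

@end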
Anyways, just follow these instructions:
http://zbar.sourceforge.net/iphone/sdkdoc/tutorial.html
and then try it out and see if it works for you.
EDIT
Here I uploaded an example project I took from https://github.com/arciem/ZBarSDK. It has the front-facing camera enabled. Tested: it successfully reads QR codes using the front-facing camera:
http://www.speedyshare.com/fkvqt/download/readertest.zip
or this one, where once the application starts the front camera is shown and the scanner is added as a 200x200 subview:
http://www.speedyshare.com/QZZU5/download/ReaderSample-v3.zip
We looked into this not long ago. ZBar looks good, but it's LGPL-licensed, which is not suitable for use on the App Store. In the end I went with ZXingObjC.
If you want to test the QR codes, here are some apps for iPhone that might come in handy: iPhone QR scanner.
OP was looking for something that supported iOS 6 two years ago, but for anyone else coming along, the one I went with wraps the built-in iOS 7 functionality:
https://github.com/mikebuss/MTBBarcodeScanner
For anyone looking to implement this in Swift, check this out:
https://github.com/aeieli/swiftQRCode
You'll need to fix a few syntax errors, but otherwise it works fully on iOS 8.1.
Also check out Apple's natively implemented QR code reading (AVFoundation, available since iOS 7).
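For reference, a minimal sketch of that native route with AVCaptureMetadataOutput (iOS 7+); the class name and the bare-bones preview setup are assumptions:

#import <AVFoundation/AVFoundation.h>

@interface NativeScannerViewController : UIViewController <AVCaptureMetadataOutputObjectsDelegate>
@property (nonatomic, strong) AVCaptureSession *session;
@end

@implementation NativeScannerViewController

- (void)viewDidLoad
{
    [super viewDidLoad];
    self.session = [[AVCaptureSession alloc] init];

    AVCaptureDevice *camera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    [self.session addInput:input];

    AVCaptureMetadataOutput *output = [[AVCaptureMetadataOutput alloc] init];
    [self.session addOutput:output]; // add the output before restricting metadataObjectTypes
    [output setMetadataObjectsDelegate:self queue:dispatch_get_main_queue()];
    output.metadataObjectTypes = @[AVMetadataObjectTypeQRCode];

    AVCaptureVideoPreviewLayer *preview = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.session];
    preview.frame = self.view.bounds;
    [self.view.layer addSublayer:preview];

    [self.session startRunning];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection
{
    for (AVMetadataMachineReadableCodeObject *object in metadataObjects) {
        if ([object.type isEqualToString:AVMetadataObjectTypeQRCode]) {
            NSLog(@"QR code: %@", object.stringValue);
        }
    }
}

@end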
I do not have a working version or any idea how to get photos from the user's albums via PHPhotoLibrary or ALAssetsLibrary and add them to a UICollectionViewCell like in Instagram. I'm new to Objective-C :)
Not "install" - I mean "add".
I'm not sure what you mean by 'install to uicollectionviewcell', but here's how you would get photos from your album:
1. Ask for authorization
2. Show a UIImagePickerController (which is the default image picker view Apple created for us)
3. Implement delegate methods to handle the picked image from the UIImagePickerController
The code would look as follows.
Import statement:
#import <Photos/Photos.h>
Instantiate the image picker controller to show:
UIImagePickerController *imagePicker = [[UIImagePickerController alloc] init];

// Check if image access is authorized
if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypePhotoLibrary]) {
    imagePicker.sourceType = UIImagePickerControllerSourceTypePhotoLibrary;
    // Use delegate methods to get the result of the photo library -- look up the UIImagePickerController delegate methods
    imagePicker.delegate = self;
    [self presentViewController:imagePicker animated:YES completion:nil];
}
I believe this would usually prompt the user for access to the photo library, but it's always good practice to handle all cases of authorization before simply trying to show the image picker.
Ask for authorization:
PHAuthorizationStatus status = [PHPhotoLibrary authorizationStatus];
if (status == PHAuthorizationStatusNotDetermined) {
    // Request photo authorization
    [PHPhotoLibrary requestAuthorization:^(PHAuthorizationStatus status) {
        // User code (show the image picker)
    }];
} else if (status == PHAuthorizationStatusAuthorized) {
    // User code
} else if (status == PHAuthorizationStatusRestricted) {
    // User code
} else if (status == PHAuthorizationStatusDenied) {
    // User code
}
Finally implement delegate methods:
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    UIImage *image = info[UIImagePickerControllerOriginalImage];
    // Do something with the picked image
}
After you pick your image, you can add it to a newly instantiated UICollectionViewController. This is probably a whole other question and you would be better off reading documentation for this.
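That said, a rough sketch of the hand-off could look like this; self.pickedImages (an NSMutableArray) and self.collectionView are hypothetical properties on your view controller:

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = info[UIImagePickerControllerOriginalImage];
    [self.pickedImages addObject:image];   // keep the picked image in an array
    [picker dismissViewControllerAnimated:YES completion:nil];
    [self.collectionView reloadData];      // the data source returns self.pickedImages.count items
}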
Note that the Photos framework was introduced in iOS 8 (I think?), but the ALAssetsLibrary route will look similar to the above.
Added
The photo library is just another database, but you still have to ask the user for authorization.
The steps would be as follows:
1. Ask for authorization
2. Retrieve photos from the photo library
I haven't done #2 myself but there seems to be a way to do it.
Get all of the pictures from an iPhone photoLibrary in an array using AssetsLibrary framework?
From a cursory look, it seems like this is an asynchronous function so you should code accordingly (I would call the requestImage function inside each UICollectionViewCell if I were you).
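To make that concrete, here is a sketch of an asynchronous thumbnail request inside cellForItemAtIndexPath:; the self.assets fetch result, the "PhotoCell" reuse identifier, and the 200x200 target size are assumptions:

// Assumes self.assets was filled once, e.g. in viewDidLoad:
//   self.assets = [PHAsset fetchAssetsWithMediaType:PHAssetMediaTypeImage options:nil];
- (UICollectionViewCell *)collectionView:(UICollectionView *)collectionView cellForItemAtIndexPath:(NSIndexPath *)indexPath
{
    UICollectionViewCell *cell = [collectionView dequeueReusableCellWithReuseIdentifier:@"PhotoCell" forIndexPath:indexPath];
    UIImageView *imageView = [[UIImageView alloc] initWithFrame:cell.contentView.bounds];
    [cell.contentView addSubview:imageView]; // a real app would use a custom cell with its own image view

    PHAsset *asset = self.assets[indexPath.item];
    [[PHImageManager defaultManager] requestImageForAsset:asset
                                                targetSize:CGSizeMake(200, 200)
                                               contentMode:PHImageContentModeAspectFill
                                                   options:nil
                                             resultHandler:^(UIImage *result, NSDictionary *info) {
        // The handler may run asynchronously, possibly after the cell has been reused.
        imageView.image = result;
    }];
    return cell;
}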
Leave a comment if you run into any trouble.
Good Luck!
You need to declare the delegate UIImagePickerControllerDelegate,
like this.
Hope it helps you a lot:
- (IBAction)captureImagehandler:(id)sender
{
    if (![UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera]) {
        UIAlertView *deviceNotFoundAlert = [[UIAlertView alloc] initWithTitle:@"No Device"
                                                                      message:@"Camera is not available"
                                                                     delegate:nil
                                                            cancelButtonTitle:@"exit"
                                                            otherButtonTitles:nil];
        [deviceNotFoundAlert show];
    }
    else
    {
        UIImagePickerController *cameraPicker = [[UIImagePickerController alloc] init];
        cameraPicker.sourceType = UIImagePickerControllerSourceTypeCamera;
        cameraPicker.delegate = self;
        // Show the image picker
        [self presentViewController:cameraPicker animated:YES completion:nil];
    }
}
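For completeness, the declaration itself would look something like this; MyViewController is a placeholder, and UIImagePickerController's delegate also has to adopt UINavigationControllerDelegate:

@interface MyViewController : UIViewController <UIImagePickerControllerDelegate, UINavigationControllerDelegate>
@end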
I am new to the world of iBeacons and Objective-C, so please bear with me.
I have a very simple app built now that updates a label when the device approaches an iBeacon. What is the Objective-C code to send the device to an external URL?
For example, as you approach the iBeacon, the device is taken to a YouTube video with an explanation of what you are seeing.
Any tips are greatly appreciated.
Here is code that will do what you want, but it will only work if the app is in the foreground:
CLLocationManager *_locationManager;

- (void)viewDidLoad
{
    [super viewDidLoad];
    _locationManager = [[CLLocationManager alloc] init];
    CLBeaconRegion *region = [[CLBeaconRegion alloc] initWithProximityUUID:[[NSUUID alloc] initWithUUIDString:@"2F234454-CF6D-4A0F-ADF2-F4911BA9FFA6"]
                                                                identifier:@"any-radius-networks-ibeacon-default-uuid"];
    [_locationManager startMonitoringForRegion:region];
    _locationManager.delegate = self;
}

- (void)locationManager:(CLLocationManager *)manager didEnterRegion:(CLRegion *)region
{
    [[UIApplication sharedApplication] openURL:[NSURL URLWithString:@"http://www.radiusnetworks.com/radbeacon/"]];
}
You'd have to add extra logic to make it open special URLs for special iBeacons.
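As a sketch of that extra logic, you could start ranging inside the region and branch on each beacon's major/minor values; this variant of didEnterRegion: ranges instead of opening a URL directly, and the major/minor pairs and YouTube URLs are placeholders:

- (void)locationManager:(CLLocationManager *)manager didEnterRegion:(CLRegion *)region
{
    // Start ranging so individual beacons can be told apart.
    if ([region isKindOfClass:[CLBeaconRegion class]]) {
        [manager startRangingBeaconsInRegion:(CLBeaconRegion *)region];
    }
}

- (void)locationManager:(CLLocationManager *)manager didRangeBeacons:(NSArray *)beacons inRegion:(CLBeaconRegion *)region
{
    for (CLBeacon *beacon in beacons) {
        // Hypothetical mapping from major/minor to URLs.
        if (beacon.major.intValue == 1 && beacon.minor.intValue == 1) {
            [[UIApplication sharedApplication] openURL:[NSURL URLWithString:@"http://www.youtube.com/watch?v=example1"]];
        } else if (beacon.major.intValue == 1 && beacon.minor.intValue == 2) {
            [[UIApplication sharedApplication] openURL:[NSURL URLWithString:@"http://www.youtube.com/watch?v=example2"]];
        }
    }
    // Stop ranging once handled so the URL doesn't open over and over.
    [manager stopRangingBeaconsInRegion:region];
}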
I am getting a really weird error with MFMailComposeViewController. The error is "error: address doesn't contain a section that points to a section in a object file". The app crashes after the MFMailComposeViewController dismisses, and the email actually gets sent.
This is specific to MFMailComposeViewController, as I have tried to present a plain view controller modally and it dismisses fine.
Here is the code I wrote to call and present the mail composer:
- (void)emailImage:(UIImage *)img {
    // verified that the image is being returned correctly
    UIImage *img1 = [[_delegate photoBrowser:self photoAtIndex:0] underlyingImage];

    MFMailComposeViewController *mfViewController = [[MFMailComposeViewController alloc] init];
    mfViewController.mailComposeDelegate = self;

    NSString *subject = @"Check out this photo I took - Cap That App";
    [mfViewController setSubject:subject];

    NSData *imgData = UIImageJPEGRepresentation(img1, 1.0);
    [mfViewController addAttachmentData:imgData mimeType:@"image/jpg" fileName:@"photo.jpg"];

    NSString *contactMessage = @"\n\nSent via Cap That - Available in the Apple App Store";
    [mfViewController setMessageBody:contactMessage isHTML:YES];

    [self presentViewController:mfViewController animated:YES completion:nil];
}
- (void)mailComposeController:(MFMailComposeViewController *)controller didFinishWithResult:(MFMailComposeResult)result error:(NSError *)error {
    UIAlertView *alert = [[[UIAlertView alloc] initWithTitle:@"Status:" message:@"" delegate:nil cancelButtonTitle:@"ok" otherButtonTitles:nil] autorelease];
    switch (result) {
        case MFMailComposeResultCancelled:
            alert.message = @"You chose not to send the email.";
            break;
        case MFMailComposeResultSaved:
            alert.message = @"Your email was saved as a draft. It has not been sent yet.";
            break;
        case MFMailComposeResultSent:
            alert.message = @"Your email has been sent!";
            break;
        case MFMailComposeResultFailed:
            alert.message = @"There was an error sending the email. Please verify your email is working and try again.";
            break;
        default:
            alert.message = @"You chose not to send the email.";
            break;
    }
    [self dismissViewControllerAnimated:YES completion:^(void) {
        [alert show];
    }];
}
Thanks in advance for anyone's help on this.
I'm getting the same error in my app, acting on a tap gesture on a HUD. My gesture recognizer method is using a block property on the HUD to perform the necessary actions, and that's where it's crashing (the code within the block never gets to run). Apparently the program cannot access that code, and since you also have a completion block, that might be a clue to what's happening.
I don't see that I'm doing anything wrong in my code, and you don't seem to be doing that either, so maybe it's a bug. Are you by any chance running a developer preview of Xcode (4.4 or 4.5)?
Edit: it turns out my problem was that the code block property was being released before it got the chance to run. I think a similar thing might happen in your case, with the alert var. Can you try moving the alert init within the completion block?
Edit 2: As an alternative, try prefixing the alert init with __weak (or __unsafe_unretained if you're targeting iOS 4.3); that should do it. If you're not using ARC, use __block instead.
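To make the suggestion in the first edit concrete, the delegate method from the question could be restructured roughly like this (keeping the original non-ARC style); the alert is created inside the completion block so it is still alive when the block runs:

- (void)mailComposeController:(MFMailComposeViewController *)controller didFinishWithResult:(MFMailComposeResult)result error:(NSError *)error {
    // Work out the message first; an NSString captured by the block is safe.
    NSString *message;
    switch (result) {
        case MFMailComposeResultSaved:
            message = @"Your email was saved as a draft. It has not been sent yet.";
            break;
        case MFMailComposeResultSent:
            message = @"Your email has been sent!";
            break;
        case MFMailComposeResultFailed:
            message = @"There was an error sending the email. Please verify your email is working and try again.";
            break;
        default:
            message = @"You chose not to send the email.";
            break;
    }
    [self dismissViewControllerAnimated:YES completion:^(void) {
        // Create and show the alert inside the completion block.
        UIAlertView *alert = [[[UIAlertView alloc] initWithTitle:@"Status:"
                                                         message:message
                                                        delegate:nil
                                               cancelButtonTitle:@"ok"
                                               otherButtonTitles:nil] autorelease];
        [alert show];
    }];
}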
I'm trying to get a custom sound working on a UILocalNotification, and I'm just getting no sound at all. If I use UILocalNotificationDefaultSoundName, I indeed get the default sound, but when the custom sound is specified, there is no sound, just the message. The sound is less than 30 seconds and it's in the right format, as far as I can tell. Here's a screenshot of the file info:
I've inspected the .app directory in Xcode's DerivedData directory, and the alarm.caf file is at the root of the app, which I believe means it's in the bundle (right?).
I'm pretty sure this was working a while ago, and I've since upgraded Xcode. Maybe that is a hint?
I've also tried deleting/reinstalling/rebooting as mentioned in other answers. As you can see, I'm calling cancelAllLocalNotifications first.
Does anyone have any idea what could be wrong?
[[UIApplication sharedApplication] cancelAllLocalNotifications];
NSLog(@"installing alarm");

UILocalNotification *alarm = [[UILocalNotification alloc] init]; // declaration added for completeness
[arguments pop]; // name
[arguments pop]; // title
alarm.alertBody = [arguments pop];
alarm.fireDate = [[NSDate date] dateByAddingTimeInterval:[[arguments pop] intValue] / 1000];
//alarm.soundName = UILocalNotificationDefaultSoundName;
alarm.soundName = @"alarm.caf";
[[UIApplication sharedApplication] scheduleLocalNotification:alarm];
Your code seems to be good.
Try cleaning your project, uninstalling your app from your device/simulator, and then re-installing it. That might help :)
I don't know the reason (and I didn't read the documentation either); I just turned on the notification's action property with setHasAction:YES and the sound began to play.
Please make sure that the iPhone is not in silent mode (because your code seems to be good); just check the switch on the side of your iPhone.
Ok, so here's what happened. I forgot how the app handles the notification itself if it is still running. My code was only displaying a UIAlertView and not playing the sound. I'm not sure why it worked with the default sound. In any case, I added code like this to my AppDelegate:
- (void)application:(UIApplication *)application didReceiveLocalNotification:(UILocalNotification *)notification
{
    NSLog(@"didReceiveLocalNotification");
    if (application.applicationState == UIApplicationStateActive) {
        UIAlertView *alertView = [[UIAlertView alloc] initWithTitle:@"MarkMyTime"
                                                            message:notification.alertBody
                                                           delegate:self
                                                  cancelButtonTitle:@"OK"
                                                  otherButtonTitles:nil];

        NSString *soundFilePath = [[NSBundle mainBundle] pathForResource:notification.soundName ofType:nil];
        NSURL *fileURL = [[NSURL alloc] initFileURLWithPath:soundFilePath];
        AVAudioPlayer *player = [[AVAudioPlayer alloc] initWithContentsOfURL:fileURL error:nil];
        [fileURL release];

        player.delegate = self;
        [player prepareToPlay];
        [player play];

        [alertView show];
        if (alertView) {
            [alertView release];
        }
    }
}
- (void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag
{
    NSLog(@"Releasing player");
    [player release];
}
This will show a UIAlertView and play the sound from the notification object. You also need to adopt the AVAudioPlayerDelegate protocol in the AppDelegate to be able to assign the delegate to the player. I think if you are using ARC, this code could be simplified a bit.
@interface AppDelegate : PhoneGapDelegate <AVAudioPlayerDelegate> {
I'm not sure if this is the best approach, so feel free to chime in with any improvements.
Maybe you did not add the sound file (*.caf) to the Xcode project under Build Phases > Copy Bundle Resources.
Your code is good, but check your iPhone settings:
Settings -> Notification Center -> Your App -> Sounds -> "On"
The sound should be set to "On".
To enable this, I checked Inter-App Audio under Capabilities in the application's Targets and it was Off.
Change this to On.
Then the local notification sound works.
I want to record a video with UIImagePickerController. My problem is that the method [imagePickerController startVideoCapture]; always returns 0. I am testing the application with an iPhone 4S running iOS 5.1. Could somebody please help me with this:
- (IBAction)playButtonPushed:(id)sender
{
    if ([UIImagePickerController isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera])
    {
        UIView *videoView = self.videoViewController.view;

        UIImagePickerController *imagePickerController = [[UIImagePickerController alloc] init];
        imagePickerController.sourceType = UIImagePickerControllerSourceTypeCamera;
        imagePickerController.mediaTypes = [[NSArray alloc] initWithObjects:(NSString *)kUTTypeMovie, nil];
        imagePickerController.showsCameraControls = NO;
        imagePickerController.toolbarHidden = NO;
        imagePickerController.wantsFullScreenLayout = YES;
        imagePickerController.allowsEditing = YES;
        imagePickerController.videoQuality = UIImagePickerControllerQualityTypeMedium;
        imagePickerController.videoMaximumDuration = 30;

        [imagePickerController startVideoCapture];

        imagePickerController.cameraViewTransform = CGAffineTransformScale(imagePickerController.cameraViewTransform, CAMERA_TRANSFORM, CAMERA_TRANSFORM);
        imagePickerController.delegate = self;
        [imagePickerController setCameraOverlayView:videoView];

        NSLog(@"%s videoCapture: %d", __PRETTY_FUNCTION__, [imagePickerController startVideoCapture]);

        [self presentModalViewController:imagePickerController animated:YES];
    }
}
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSLog(@"didFinishPickingMediaWithInfo");
    NSURL *videoURL = [info objectForKey:UIImagePickerControllerMediaURL];
    NSData *videoData = [NSData dataWithContentsOfURL:videoURL];
}

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker
{
    NSLog(@"Cancel");
}
I had the same problem and Apple's documentation didn't help. That's because it fails to mention that startVideoCapture() can return false if you haven't set the capture mode to .video, like this:
picker.cameraCaptureMode = .video
This worked for me and I'm able to capture videos now.
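Since the rest of this thread is Objective-C, the equivalent line (set before calling startVideoCapture) would presumably be:

imagePickerController.cameraCaptureMode = UIImagePickerControllerCameraCaptureModeVideo;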
If I'm not mistaken, the startVideoCapture method returns a BOOL.
Taken straight from Apple's documentation:
startVideoCapture
Starts video capture using the camera specified by the UIImagePickerControllerCameraDevice property.
- (BOOL)startVideoCapture
Return Value
YES on success or NO on failure. This method may return a value of NO for various reasons, among them the following:
Movie capture is already in progress
The device does not support movie capture
The device is out of disk space
Discussion
Use this method in conjunction with a custom overlay view to initiate the programmatic capture of a movie. You can take more than one movie without leaving the interface, but to do so requires you to hide the default image picker controls.
Calling this method while a movie is being captured has no effect. You must call the stopVideoCapture method, and then wait until the associated delegate object receives an imagePickerController:didFinishPickingMediaWithInfo: message, before you can capture another movie.
Calling this method when the source type of the image picker is set to a value other than UIImagePickerControllerSourceTypeCamera results in the throwing of an NSInvalidArgumentException exception.
If you require additional options or more control over movie capture, use the movie capture methods in the AV Foundation framework. Refer to AV Foundation Framework Reference.
Availability
Available in iOS 4.0 and later.
Declared In
UIImagePickerController.h
stopVideoCapture
Stops video capture.
- (void)stopVideoCapture
Discussion
After you call this method to stop video capture, the system calls the image picker delegate’s imagePickerController:didFinishPickingMediaWithInfo: method.
Availability
Available in iOS 4.0 and later.
Declared In
UIImagePickerController.h
I think you need to call the startVideoCapture method after calling presentModalViewController:animated:, not before.
You probably also need a short delay (<1s?) after the UIImagePickerController has been presented on-screen and the animation has stopped before calling startVideoCapture; otherwise there are times when it will not start recording.
I had exactly the same problem. AndyV has it right: the recording won't start until the picker has been presented. I spent a lot of time trying to introduce a delay with sleep, but the simplest solution is to start a timer after displaying the picker using
NSTimer *timer = [NSTimer scheduledTimerWithTimeInterval:1.0 target:self selector:@selector(startRecording) userInfo:nil repeats:NO];

Then:

- (void)startRecording
{
    BOOL result = [picker startVideoCapture];
}
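An alternative to the NSTimer, as a sketch, is dispatch_after on the main queue; this assumes picker is reachable in scope, as in the answer above:

// Roughly one second after presenting the picker, start recording on the main queue.
dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(1.0 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
    BOOL result = [picker startVideoCapture];
    NSLog(@"startVideoCapture returned %d", result);
});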