Scanning a barcode from the image gallery in iOS 7

I want to scan a barcode or QR code image that is in my photo gallery on iOS 7. In iOS 7 we can use the camera to scan a barcode, but I did not find any way to select a barcode image from UIImagePickerController and scan it. Are there any methods available in the AVFoundation framework for this?

I have the same issue. Most of the once perfectly running 32-bit barcode SDKs broke with iOS 7.1 because of the requirement to support the arm64 architecture. ZBar is affected, ZXing has pulled out of the iOS platform entirely, and what is left are commercial packages. I tried one of them, called Manatee; it works, but it truncates the first character of the barcode in the output. At the moment your best bets are the commercial SDKs that work with iOS 7.1, or going back to iOS 7.0 or 6.1 and using ZBar.
The AVFoundation solution put forward by @Stark works well with camera capture (I've tested it with some modifications to recognise PDF417, Aztec codes, and half a dozen 1D barcodes); however, the code in the sample app cannot process existing images from the media library. I searched intensively, and the nearest bet is CoreImage's CIDetector, which does facial recognition on images; unfortunately there is no barcode detection option yet.
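For what it's worth, Core Image did gain barcode support later: from iOS 8 a CIDetector of type CIDetectorTypeQRCode can scan an image picked from the photo library (QR codes only, so it does not help for 1D barcodes or on iOS 7 itself). A minimal sketch:

- (NSString *)qrCodeStringFromImage:(UIImage *)image {
    // Requires iOS 8+; CIDetectorTypeQRCode is not available on iOS 7.
    CIDetector *detector = [CIDetector detectorOfType:CIDetectorTypeQRCode
                                              context:nil
                                              options:@{CIDetectorAccuracy : CIDetectorAccuracyHigh}];
    CIImage *ciImage = [CIImage imageWithCGImage:image.CGImage];
    for (CIQRCodeFeature *feature in [detector featuresInImage:ciImage]) {
        return feature.messageString; // decoded payload of the first QR code found
    }
    return nil; // no QR code detected
}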

There are many SDKs available for barcode scanning:
Softek Barcode Reader SDK
ZBar bar code reader
ShopSavvy
RedLaser
ZXing
If you want to use the AVFoundation framework only, here is a link to a tutorial:
http://www.appcoda.com/qr-code-ios-programming-tutorial/
Here is the code which starts reading the barcode:
- (BOOL)startReading {
    NSError *error;
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];
    if (!input) {
        NSLog(@"%@", [error localizedDescription]);
        return NO;
    }
    _captureSession = [[AVCaptureSession alloc] init];
    [_captureSession addInput:input];
    AVCaptureMetadataOutput *captureMetadataOutput = [[AVCaptureMetadataOutput alloc] init];
    [_captureSession addOutput:captureMetadataOutput];
    dispatch_queue_t dispatchQueue = dispatch_queue_create("myQueue", NULL);
    [captureMetadataOutput setMetadataObjectsDelegate:self queue:dispatchQueue];
    [captureMetadataOutput setMetadataObjectTypes:[NSArray arrayWithObject:AVMetadataObjectTypeQRCode]];
    _videoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:_captureSession];
    [_videoPreviewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    [_videoPreviewLayer setFrame:_viewPreview.layer.bounds];
    [_viewPreview.layer addSublayer:_videoPreviewLayer];
    [_captureSession startRunning];
    return YES;
}
And here is the code to stop it:
- (void)stopReading {
    [_captureSession stopRunning];
    _captureSession = nil;
    [_videoPreviewLayer removeFromSuperlayer];
}
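For completeness, here is a minimal sketch of the AVCaptureMetadataOutputObjectsDelegate callback that the code above relies on (it reuses the stopReading method shown):

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputMetadataObjects:(NSArray *)metadataObjects fromConnection:(AVCaptureConnection *)connection {
    if ([metadataObjects count] == 0) {
        return;
    }
    AVMetadataMachineReadableCodeObject *code = [metadataObjects objectAtIndex:0];
    if ([[code type] isEqualToString:AVMetadataObjectTypeQRCode]) {
        // The delegate runs on the dispatch queue set above, so hop to the main thread for UI work.
        dispatch_async(dispatch_get_main_queue(), ^{
            NSLog(@"Scanned: %@", [code stringValue]);
            [self stopReading];
        });
    }
}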

Related

iOS: Deprecation of AudioSessionInitialize and AudioSessionSetProperty

I'm very new to Objective-C, and am trying to update some code that's about 3 years old to work with iOS 7. There are a couple of instances of AudioSessionSetProperty and AudioSessionInitialize appearing in the code:
1:
- (void)applicationDidFinishLaunching:(UIApplication *)application {
    AudioSessionInitialize(NULL, NULL, NULL, NULL);
    [[SCListener sharedListener] listen];
    timer = [NSTimer scheduledTimerWithTimeInterval:0.5 target:self selector:@selector(tick:) userInfo:nil repeats:YES];
    // Override point for customization after app launch
    [window addSubview:viewController.view];
    [window makeKeyAndVisible];
}
And 2:
- (id)init {
    if (!(self = [super init])) {
        return nil;
    }
    AudioSessionInitialize(NULL, NULL, NULL, NULL);
    Float64 rate = kSAMPLERATE;
    UInt32 size = sizeof(rate);
    AudioSessionSetProperty(kAudioSessionProperty_PreferredHardwareSampleRate, size, &rate);
    return self;
}
For some reason this code works on iOS 7 in the simulator but not on a device running iOS 7, and I suspect that these deprecations are the cause. I've been reading through the docs and related questions on this site, and it appears that I need to use AVAudioSession instead. I've been trying to update the code for a long time now, and I'm unsure how to properly switch over to AVAudioSession. Does anyone know how the two methods above need to look?
Side note: I've managed to hunt down an article that outlines the transition:
https://github.com/software-mariodiana/AudioBufferPlayer/wiki/Replacing-C-functions-deprecated-in-iOS-7
But I can't seem to apply this to the code above.
The code I'm trying to update is a small frequency detection app from git:
https://github.com/jkells/sc_listener
Alternatively, if someone could point me to a sample demo app that can detect frequencies on iOS devices, that would be awesome.
As you have observed, pretty much all of the old Core Audio AudioSession functions have been deprecated in favour of AVAudioSession.
The AVAudioSession is a singleton object which will get initialised when you first call it:
[AVAudioSession sharedInstance]
There is no separate initialize method. But you will want to activate the audio session:
BOOL activated = [[AVAudioSession sharedInstance] setActive:YES error:&error];
As regards setting the hardware sample rate using AVAudioSession, please refer to my answer here:
How can I obtain the native (hardware-supported) audio sampling rates in order to avoid internal sample rate conversion?
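As a rough guide, the second snippet from the question could be rewritten along these lines. This is only a sketch: kSAMPLERATE comes from the original code, and the PlayAndRecord category is an assumption based on the app recording from the microphone.

#import <AVFoundation/AVFoundation.h>

- (id)init {
    if (!(self = [super init])) {
        return nil;
    }
    NSError *error = nil;
    AVAudioSession *session = [AVAudioSession sharedInstance];
    // Replaces AudioSessionInitialize and the kAudioSessionProperty_PreferredHardwareSampleRate property.
    [session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    [session setPreferredSampleRate:kSAMPLERATE error:&error];
    [session setActive:YES error:&error];
    if (error) {
        NSLog(@"Audio session setup failed: %@", error);
    }
    return self;
}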
For other comparisons between the Core Audio audio session functions and AVFoundation's AVAudioSession, here are some of my other answers on the same topic:
How Do I Route Audio to Speaker without using AudioSessionSetProperty?
use rear microphone of iphone 5
Play audio through upper (phone call) speaker
How to control hardware mic input gain/level on iPhone?
I wrote a short tutorial that discusses how to update to the new AVAudioSession objects. I posted it on GitHub: "Replacing C functions deprecated in iOS 7."

How to draw routes in maps, iOS 6?

I have been drawing routes in iOS 5, but now that the app has to be upgraded to iOS 6 it no longer allows drawing routes between coordinate locations. I have searched a lot, but ended up in vain. Can anyone tell me which API to use in iOS 6, since Google Maps is no longer supported?
EDIT: I previously used the Google iOS SDK to plot the routes, which automatically took care of the polylines. As far as I have searched, I can redirect to the browser to show routes and navigation, but I need to draw this in-app on iOS 6. Does iOS 6 do the routing automatically? If so, can you please share some code to get a gist of it. This is the code sample I used in iOS 5 with Google Maps:
// Overlay map to draw the route
routeOverlayView = [[UICRouteOverlayMapView alloc] initWithMapView:routeMapView];
directions = [UICGDirections sharedDirections];
UICGDirectionsOptions *options = [[UICGDirectionsOptions alloc] init];
// Setting travel mode to driving
options.travelMode = UICGTravelModeDriving;
[directions loadWithStartPoint:startPoint endPoint:endPoint options:options];
// Overlay polylines
UICGPolyline *polyline = [directions polyline];
NSArray *routePoints = [polyline routePoints];
NSLog(@"routePoints %@", routePoints);
[routeOverlayView setRoutes:routePoints];
I did routing in my application. I used the Google Geocoding API (for iOS 6 only): https://developers.google.com/maps/documentation/geocoding/
Something like this:
- (void)sendRequestForLat:(CGFloat)lat lon:(CGFloat)lon
{
    NSString *request;
    NSInteger zoom = 12;
    NSString *location = [NSString stringWithFormat:@"%f,%f", lat, lon];
    NSString *daddr = [NSString stringWithFormat:@"%f,%f", myLat, myLon];
    request = [NSString stringWithFormat:@"saddr=%@&daddr=%@&zoom=%i&directionsmode=walking", location, daddr, zoom];
    NSString *typeMapsApp = isGoogleMapsAppPresent ? @"comgoogle" : @"";
    NSString *urlString = [NSString stringWithFormat:@"%@maps://?%@", typeMapsApp, request];
    [[UIApplication sharedApplication] openURL:[NSURL URLWithString:urlString]];
}
You're using a third party piece of code to do your route drawing, rather than using MapKit itself. The library you're using (which I found on GitHub here) hasn't been updated for two years.
I'd recommend you move away from that library to something more current. If you want you could consider using Google's iOS Map SDK, which supports iOS 6 (link here). You could combine this with Google's Directions API to draw polylines directly onto the map. Or you could stick with MKMapView, and again draw directly onto the map with polylines.
It's not that iOS has changed how it draws polylines onto a map - it's that the third-party code you're using is out of date.
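If you stick with MKMapView, a minimal sketch of drawing a route as a polyline overlay on iOS 6 might look like the following (the two coordinates are placeholders for whatever points your directions source returns):

// Build an MKPolyline from an array of CLLocationCoordinate2D and add it to the map.
CLLocationCoordinate2D coords[2] = {
    CLLocationCoordinate2DMake(51.5074, -0.1278),
    CLLocationCoordinate2DMake(51.5155, -0.1419)
};
MKPolyline *routeLine = [MKPolyline polylineWithCoordinates:coords count:2];
[self.mapView addOverlay:routeLine];

// MKMapViewDelegate (iOS 6 uses viewForOverlay:; iOS 7+ prefers rendererForOverlay:).
- (MKOverlayView *)mapView:(MKMapView *)mapView viewForOverlay:(id<MKOverlay>)overlay {
    if ([overlay isKindOfClass:[MKPolyline class]]) {
        MKPolylineView *lineView = [[MKPolylineView alloc] initWithPolyline:overlay];
        lineView.strokeColor = [UIColor blueColor];
        lineView.lineWidth = 4.0f;
        return lineView;
    }
    return nil;
}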

How to specify the app name the user was posting from (via app name) using SDK 3.1

Using the new Facebook SDK 3.1 and iOS 6 there are 2 (actually 3) ways to post.
(Seems the new trend is to have more options to make it more simple??) OMG!!
Here is one:
SLComposeViewController *fbPost = [SLComposeViewController composeViewControllerForServiceType:SLServiceTypeFacebook];
[fbPost addURL:[NSURL URLWithString:href]];
[self presentViewController:fbPost animated:YES completion:nil];
And this is another way using native dialogs:
[FBNativeDialogs presentShareDialogModallyFrom:self
                                   initialText:nil
                                         image:nil
                                           url:[NSURL URLWithString:href]
                                       handler:^(FBNativeDialogResult result, NSError *error) {
    if (error) {
    } else {
        switch (result) {
            case FBNativeDialogResultSucceeded:
                break;
            case FBNativeDialogResultCancelled:
                break;
            case FBNativeDialogResultError:
                break;
        }
    }
}];
We developers think this is cool because we give nice functionality to the user, and also because our app name appears in the post, which can help promote the app.
The funny thing is that the latest implementations do not allow you to specify the app name the post was made from, the name that appears after 'via'.
I tried as well using SLRequest:
ACAccountStore *store = [[ACAccountStore alloc] init];
ACAccountType *fbType = [store accountTypeWithAccountTypeIdentifier:ACAccountTypeIdentifierFacebook];
NSMutableDictionary *options = [[NSMutableDictionary alloc] init];
options[@"ACFacebookAppIdKey"] = kFacebookAppID;
options[@"ACFacebookPermissionsKey"] = @[@"publish_stream"];
options[@"ACFacebookAudienceKey"] = ACFacebookAudienceFriends;
[store requestAccessToAccountsWithType:fbType options:options completion:^(BOOL granted, NSError *error) {
    if (granted) {
        // Get the list of Facebook accounts.
        NSArray *fbAccounts = [store accountsWithAccountType:fbType];
        NSMutableDictionary *params = [[NSMutableDictionary alloc] init];
        params[@"link"] = href;
        // params[@"picture"] = picture;
        // params[@"name"] = name;
        params[@"actions"] = @"{\"name\": \"Go Gabi\", \"link\": \"http://www.gogogabi.com\"}";
        // Set up the Facebook API call
        SLRequest *postRequest = [SLRequest requestForServiceType:SLServiceTypeFacebook
                                                     requestMethod:SLRequestMethodPOST
                                                               URL:[NSURL URLWithString:@"https://www.facebook.com/dialog/feed"]
                                                        parameters:params];
        // Set the account
        [postRequest setAccount:[fbAccounts lastObject]];
        [postRequest performRequestWithHandler:^(NSData *responseData, NSHTTPURLResponse *urlResponse, NSError *error) {
            if (error) {
                NSLog(@"%@", error.description);
            } else {
                NSLog(@"%@", [[NSString alloc] initWithData:responseData encoding:NSUTF8StringEncoding]);
            }
        }];
    } else {
    }
}];
Unfortunately, sharing that name is not so trivial anymore; I wonder why, and who designed the new implementation...
I would appreciate some help on that, thanks in advance.
I try to make my questions funny because it is so boring to spend time on such trivial topics...
When you use the SLComposeViewController, it's actually the system presenting its own controller to you, and it's the user who sends the post using the Post button. Therefore on Facebook it appears as "via iOS".
There's no way to change that.
Using the Facebook SDK 3.1, under the hood it is also using the iOS 6 native integration, so when you're calling the FBNativeDialogs, on iOS 6, it's using SLComposeViewController.
Facebook continued to develop their SDK because they provide a couple of nice modules to use "out of the box" - this includes a friends list selector etc... But I believe the biggest reason for Facebook to continue supporting their SDK is backward compatibility. Under the hood, if you're not on iOS 6 it falls back to its own library, and if you are on iOS 6 it uses the system integration.
Facebook is a big thing, and now that it's natively available a lot of developers will be using it, just like Twitter's integration last year. The problem of course is that at that point the developer has the option to drop older iOS support, or... have a lot of duplicate code, in the sense that they will check for SLComposeViewController and, if it's not available (iOS 5), use the old Facebook SDK... You can imagine how this would become very messy very quickly.
So, the Facebook SDK (3.1) uses the iOS system Facebook integration if available, or, if not, its own. In a nutshell, unless you really want the Facebook SDK goodies (the friend picker, to name one) or you're planning on supporting iOS < 6, you don't need to worry about their SDK; just use the Social framework.
So, back to your question: are there 3 ways to post to Facebook? Taking into consideration what I mentioned, there are 2 ways on iOS 6: SLComposeViewController or SLRequest. On older iOS versions, only 1: the Facebook SDK.
Since the SLComposeViewController is owned by the system, not your app, it will always share as "via iOS".
On the other hand, SLRequest will show your app's name. When you specify an account for your SLRequest, that account was acquired via the ACAccountStore by passing in options including ACFacebookAppIdKey, which is used to determine your Facebook app's name and attribute the post on the user's feed.
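For illustration, a hedged sketch of posting with SLRequest against the Graph API (rather than the dialog URL used in the question); the me/feed endpoint and the message/link parameters are standard Graph API, but verify the details against Facebook's documentation:

// Assumes fbAccount was obtained from ACAccountStore with ACFacebookAppIdKey, as in the question.
NSDictionary *params = @{ @"message" : @"Posted from my app", @"link" : href };
SLRequest *post = [SLRequest requestForServiceType:SLServiceTypeFacebook
                                     requestMethod:SLRequestMethodPOST
                                               URL:[NSURL URLWithString:@"https://graph.facebook.com/me/feed"]
                                        parameters:params];
post.account = fbAccount;
[post performRequestWithHandler:^(NSData *responseData, NSHTTPURLResponse *urlResponse, NSError *error) {
    // Because the access token is tied to ACFacebookAppIdKey, the post is attributed
    // to ("via") your Facebook app rather than "via iOS".
    NSLog(@"Post response: %@", urlResponse);
}];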
Hope this helps.

UIImageWriteToSavedPhotosAlbum and ALAssetsLibrary not saving an image, no error either

I am trying to save an image to the camera roll. This actually used to work wonderfully, but I had to work on other stuff, and now that I'm returning to the project to update it for iOS 6, poof, this feature no longer works at all on iOS 6.
I have tried two approaches; both fail silently without NSError objects. First, UIImageWriteToSavedPhotosAlbum:
UIImageWriteToSavedPhotosAlbum(img, self, @selector(image:didFinishSavingWithError:contextInfo:), nil);
// Callback
-(void)image:(UIImage *)image didFinishSavingWithError:(NSError *)error contextInfo:(void *)contextInfo
{
// error == nil
}
... and the ALAssetsLibrary approach:
ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:[img CGImage]
                          orientation:(ALAssetOrientation)[img imageOrientation]
                      completionBlock:^(NSURL *assetURL, NSError *error)
{
    // assetURL == nil
    // error == nil
}];
Also, [ALAssetsLibrary authorizationStatus] == ALAuthorizationStatusAuthorized evaluates to true
On the Simulator, the app never shows up in the Settings > Privacy > Photos section; however, an actual iPad does show that the app has permission to access photos. (Also, just to add: the first approach above was what I previously used - it worked on real devices and simulators alike, no problem.)
I have also tried running this on the main thread to see if that changed anything - no difference. I was running it in the background previously and it used to work fine (on both simulator and device).
Can anyone shed some light?
Figured it out... I was doing something stupid. UIImage cannot take raw pixel data, you have to first massage it into a form it can accept, with the proper metadata.
Part of the problem was that I was using Cocos2D to get a UIImage from a CCRenderTexture (getUIImageFromBuffer()), and when I switched to Cocos2D-x that function was no longer available. I was simply ignorant of the fact that UIImage objects cannot be constructed from raw pixel data; I figured it handled header information and formatting automatically.
This answer helped: iPhone - UIImage imageWithData returning nil
And this example was also helpful:
http://www.wmdeveloper.com/2010/09/create-bitmap-graphics-context-on.html?m=1
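For illustration, a minimal sketch of wrapping raw RGBA pixel data in a bitmap context before handing it to UIImage (rawPixels, width, and height are placeholders for whatever the render texture actually produces, and premultiplied alpha is an assumption):

// rawPixels is assumed to point to width * height * 4 bytes of RGBA data.
CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
CGContextRef context = CGBitmapContextCreate(rawPixels, width, height,
                                             8,          // bits per component
                                             width * 4,  // bytes per row
                                             colorSpace,
                                             kCGImageAlphaPremultipliedLast);
CGImageRef cgImage = CGBitmapContextCreateImage(context);
UIImage *image = [UIImage imageWithCGImage:cgImage];

// This UIImage now carries proper format information and can be saved as before.
UIImageWriteToSavedPhotosAlbum(image, self, @selector(image:didFinishSavingWithError:contextInfo:), NULL);

CGImageRelease(cgImage);
CGContextRelease(context);
CGColorSpaceRelease(colorSpace);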

iPhone SDK 4 SMS composer

Has anyone tried to use SDK 4's SMS composer?
If anyone has some reference or source code, please post it here.
Thanks
If you want to support 3.1 devices, you need to do a few things:
In your target's build settings:
set Base SDK to iPhone Device 4.0
set iPhone OS Deployment Target to iPhone OS 3.x (the lowest OS level you want to support)
In your target's general settings, under Linked Libraries, change the "Type" next to MessageUI.framework to Weak.
Don't import <MessageUI/MFMessageComposeViewController.h> or it will crash on launch on 3.1. Just import <MessageUI/MessageUI.h>
To make sure it doesn't crash on 3.1.x, you need to test for the availability of MFMessageComposeViewController:
Class smsClass = NSClassFromString(@"MFMessageComposeViewController");
if (smsClass != nil && [MFMessageComposeViewController canSendText]) {
    MFMessageComposeViewController *controller = [[MFMessageComposeViewController alloc] init];
    controller.body = text;
    controller.recipients = [NSArray arrayWithObjects:nil];
    controller.messageComposeDelegate = self;
    [self presentModalViewController:controller animated:YES];
    [controller release];
}
If you've got the 4.0 SDK already, check MFMessageComposeViewController. The usage is similar to MFMailComposeViewController.
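As with MFMailComposeViewController, you also need the delegate callback to dismiss the composer when the user finishes; a minimal sketch:

// MFMessageComposeViewControllerDelegate
- (void)messageComposeViewController:(MFMessageComposeViewController *)controller
                 didFinishWithResult:(MessageComposeResult)result {
    if (result == MessageComposeResultFailed) {
        NSLog(@"Failed to send the message.");
    }
    // Dismiss the composer regardless of the outcome (pre-iOS 5 API, matching the snippet above).
    [self dismissModalViewControllerAnimated:YES];
}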