MKDirectionsResponse always returns nil - iOS 7

I apologize if this question is a duplicate, but I'm having a hard time trying to display a route between two places using MKDirectionsRequest.
MKDirectionsRequest *request = [[MKDirectionsRequest alloc] init];
request.source = [MKMapItem mapItemForCurrentLocation];
request.transportType = MKDirectionsTransportTypeAny;
request.destination = _dstItem;
request.requestsAlternateRoutes = YES;

MKDirections *directions = [[MKDirections alloc] initWithRequest:request];
// __block typeof(self) weakSelf = self;
[directions calculateDirectionsWithCompletionHandler:
 ^(MKDirectionsResponse *response, NSError *error) {
     // stop loading animation here
     if (error) {
         NSLog(@"Error is %@", error);
     } else {
         // do something with the response, like draw it on the map
         MKRoute *route = [response.routes firstObject];
         [self.mapView addOverlay:route.polyline level:MKOverlayLevelAboveRoads];
     }
 }];
MKDirectionsResponse always returns nil, with this error description:
Error Domain=MKErrorDomain Code=5 "Directions Not Available"
UserInfo=0x1700f1c80 {NSLocalizedFailureReason=A route to the nearest road cannot be determined.,
MKErrorGEOError=-403, MKDirectionsErrorCode=6, NSLocalizedDescription=Directions Not Available}

MKDirectionsRequest is the right API for finding a route between two points. It seems Apple has not added the routing feature for India yet.

Error Domain=MKErrorDomain Code=5 "Directions Not Available"
You are most likely to get this error if the location doesn't belong to any of the countries in this list: http://www.apple.com/ios/feature-availability/#maps-directions
While on the iOS Simulator, you can easily customize your current location. There are two ways to do it:
iOS Simulator -> 'Debug' tab -> Location -> {Choose}
Xcode -> 'Debug' tab -> Simulate Location -> {Choose}
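If you want to handle the unsupported-region case gracefully at runtime, a minimal sketch (reusing the directions/overlay code from the question; the fallback NSLog is just a placeholder) is to match on the MKErrorDomain code before drawing:

[directions calculateDirectionsWithCompletionHandler:
 ^(MKDirectionsResponse *response, NSError *error) {
     if (error) {
         if ([error.domain isEqualToString:MKErrorDomain] &&
             error.code == MKErrorDirectionsNotFound) {
             // The Code=5 case quoted above: directions unsupported here.
             NSLog(@"Directions are not available for this region.");
         }
         return;
     }
     MKRoute *route = [response.routes firstObject];
     [self.mapView addOverlay:route.polyline level:MKOverlayLevelAboveRoads];
 }];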

I am getting the same error message. If you are providing a latitude/longitude in India it will not work, because Apple has not added the routing feature for India yet. Try different coordinates and it should work; I tried the same and it is working for me.

I was getting the same error. I still don't know the best solution, but I created a refresh button on the map and called the method from there instead of from viewDidLoad(), because as far as I can tell my method was returning immediately with the error, before the address had even been calculated, when called from viewDidLoad().
My guess is that viewDidLoad() has to finish within some specific predefined time.
So,
-(IBAction)refreshRoute:(id)sender {
    [self routeCalculate];
}
instead of
-(void)viewDidLoad {
    [super viewDidLoad];
    _manager = [[CLLocationManager alloc] init];
    [_manager requestWhenInUseAuthorization];
    [_manager requestAlwaysAuthorization];
    _manager.desiredAccuracy = kCLLocationAccuracyBest;
    _manager.distanceFilter = kCLDistanceFilterNone;
    [_manager startUpdatingLocation];
    _manager.delegate = self; // assign the delegate (the original `[_manager delegate];` only read it)
    self.pinMap.delegate = self;
    // [self routeCalculate]; ***** This Call ****
}
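A cleaner alternative to the manual refresh button may be to defer the route calculation until Core Location delivers its first fix. A sketch, assuming the same routeCalculate method plus a hypothetical _didRoute BOOL ivar, using the standard CLLocationManagerDelegate callback:

- (void)locationManager:(CLLocationManager *)manager
     didUpdateLocations:(NSArray *)locations {
    if (!_didRoute && locations.lastObject != nil) {
        _didRoute = YES; // calculate only once per launch
        [self routeCalculate];
    }
}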

Related

ABAddressBookRequestAccessWithCompletion crashes on iOS 10

I use the code below to access contacts in my iOS application. It was working fine on iOS < 10, but with Xcode 8 and iOS 10 it crashes:
- (void)btcContacts_tap {
    ABAddressBookRef addressBook = ABAddressBookCreateWithOptions(NULL, NULL);
    ABAddressBookRequestAccessWithCompletion(addressBook, ^(bool granted, CFErrorRef error) {
        if (granted) {
            _addressBookController = [[ABPeoplePickerNavigationController alloc] init];
            [[_addressBookController navigationBar] setBarStyle:UIBarStyleBlack];
            _addressBookController.delegate = self;
            [_addressBookController setPredicateForEnablingPerson:[NSPredicate predicateWithFormat:@"%K.@count > 0", ABPersonPhoneNumbersProperty]];
            [_addressBookController setPeoplePickerDelegate:self];
            [self presentViewController:_addressBookController animated:YES completion:nil];
        }
        else {
            dispatch_async(dispatch_get_main_queue(), ^{
                [self showMessage:NSLocalizedStringFromTable(@"PLEASE_GRANT_CONTACTS", LIApplicationLanguage(), nil) andAdvertise:@"" andService:nil andTransactionState:kTTTransactionStateInfo];
            });
        }
    });
}
I have set NSSetUncaughtExceptionHandler to a method for logging the crash report, but even the exception handler is not called...
Has anyone else faced this problem too?
iOS 10:
You need to put the NSContactsUsageDescription key in your plist, like:
<key>NSContactsUsageDescription</key>
<string>$(PRODUCT_NAME) uses contacts</string>
See the full list of usage description keys in Apple's documentation.
Use CNContactStore; ABAddressBookRequestAccessWithCompletion is deprecated.
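For reference, a minimal sketch of the CNContactStore/ContactsUI equivalent (iOS 9+). It assumes the usage description key above is already in the plist; the predicate mirrors the phone-number filter from the question:

#import <Contacts/Contacts.h>
#import <ContactsUI/ContactsUI.h>

CNContactStore *store = [[CNContactStore alloc] init];
[store requestAccessForEntityType:CNEntityTypeContacts
                completionHandler:^(BOOL granted, NSError *error) {
    dispatch_async(dispatch_get_main_queue(), ^{
        if (!granted) { return; }
        CNContactPickerViewController *picker = [[CNContactPickerViewController alloc] init];
        picker.delegate = self; // adopt CNContactPickerDelegate
        // only enable contacts that have at least one phone number
        picker.predicateForEnablingContact =
            [NSPredicate predicateWithFormat:@"phoneNumbers.@count > 0"];
        [self presentViewController:picker animated:YES completion:nil];
    });
}];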

Game Center turn-based match data is not saved and/or read

I am planning to develop a turn-based game and am trying to understand how to communicate with Game Center to send and receive match data. I have read about it and tested this for days now and just cannot get it to work as planned.
The only thing I try to do with the code below is save and then read the match data. I am using two sandbox Game Center accounts for the turns.
Each turn sends the same data by pressing the "endTurn" button. Every time I run, the current user is authenticated and the app is set up correctly (I believe).
This is a test app with no purpose other than testing what I stated. Below is the code I use for the match data processing.
I would really appreciate any ideas and tips on what I may be doing wrong. Before I started serious testing I posted a similar question, but that did not solve this problem: https://stackoverflow.com/questions/14447392/start-gamecenter-turn-based-match-and-initiate-match-data-for-the-very-first-tim.
I also try to fetch the participants, but with no success, which may mean the problem lies in how the completion handler is processed.
-(IBAction)endTurn:(id)sender {
    [_gameDictionary setObject:@"The Object" forKey:@"The Key"];
    NSLog(@"_gameDictionary: %@", _gameDictionary);
    NSData *data = [NSPropertyListSerialization dataFromPropertyList:_gameDictionary format:NSPropertyListXMLFormat_v1_0 errorDescription:nil];
    GKTurnBasedParticipant *nextPlayer;
    if (_match.currentParticipant == [_match.participants objectAtIndex:0]) {
        nextPlayer = [[_match participants] lastObject];
    } else {
        nextPlayer = [[_match participants] objectAtIndex:0];
    }
    NSLog(@"_match.currentParticipant: %@", _match.currentParticipant);
    [self.match endTurnWithNextParticipant:nextPlayer matchData:data completionHandler:^(NSError *error) {
        if (error) {
            NSLog(@"An error occured updating turn: %@", [error localizedDescription]);
        }
        [self.navigationController popViewControllerAnimated:YES];
    }];
}

- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil {
    self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
    if (self) {
        // Custom initialization
    }
    return self;
}

- (void)viewDidLoad {
    [super viewDidLoad];
    _gameDictionary = [[NSMutableDictionary alloc] init];
    [self.match loadMatchDataWithCompletionHandler:^(NSData *matchData, NSError *error) {
        NSDictionary *myDict = [NSPropertyListSerialization propertyListFromData:_match.matchData mutabilityOption:NSPropertyListImmutable format:nil errorDescription:nil];
        [_gameDictionary addEntriesFromDictionary:myDict];
        if (error) {
            NSLog(@"loadMatchData - %@", [error localizedDescription]);
        }
    }];
    NSLog(@"_gameDictionary: %@", _gameDictionary);
}
Output:
"gk-cdx" = "17.173.254.218:4398";
"gk-commnat-cohort" = "17.173.254.220:16386";
"gk-commnat-main0" = "17.173.254.219:16384";
"gk-commnat-main1" = "17.173.254.219:16385";
}
2013-02-11 22:44:11.707 GC_test1[8791:14f03] _gameDictionary: {
}
2013-02-11 22:44:13.894 GC_test1[8791:14f03] _gameDictionary: {
The Object = The Key;
}
2013-02-11 22:44:13.894 GC_test1[8791:14f03] _match.currentParticipant: (null)
The fact that _match.currentParticipant evaluates to nil is troubling. I suspect that _match was never initialized, or is nil, or that it was not obtained from a Game Center facility such as loadMatchesWithCompletionHandler:, the GKTurnBasedMatchmakerViewController, or using findMatchForRequest:withCompletionHandler:.
For a new match, if created through any of these facilities, currentParticipant would be guaranteed to represent the local player. You are not allowed to instantiate a GKTurnBasedMatch yourself.
To resolve this issue at least for testing, you could assign a new _match from within the completion handler of findMatchForRequest:withCompletionHandler:. Only then should you be allowed to press your test button.
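A minimal sketch of that approach (endTurnButton is a hypothetical outlet; the request parameters are illustrative):

GKMatchRequest *request = [[GKMatchRequest alloc] init];
request.minPlayers = 2;
request.maxPlayers = 2;
[GKTurnBasedMatch findMatchForRequest:request
                withCompletionHandler:^(GKTurnBasedMatch *match, NSError *error) {
    if (error) {
        NSLog(@"findMatch failed: %@", [error localizedDescription]);
        return;
    }
    self.match = match; // currentParticipant is now the local player
    self.endTurnButton.enabled = YES;
}];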

Why does this code work sometimes, but not others?

I created a 'mirror'-like view in my app that uses the front camera to show a 'mirror' to the user. The problem I'm having is that I have not touched this code in weeks (and it did work then) but now I'm testing it again and it's not working. The code is the same as before, there are no errors coming up, and the view in the storyboard is exactly the same as before. I have no idea what is going on, so I was hoping that this website would help.
Here is my code:
if ([UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceFront]) {
    // If the front camera is available, show the camera
    AVCaptureSession *session = [[AVCaptureSession alloc] init];
    AVCaptureOutput *output = [[AVCaptureStillImageOutput alloc] init];
    [session addOutput:output];

    // Setup camera input
    NSArray *possibleDevices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    // You could check for front or back camera here, but for simplicity just grab the first device
    AVCaptureDevice *device = [possibleDevices objectAtIndex:1];
    NSError *error = nil;
    // create an input and add it to the session
    AVCaptureDeviceInput *input = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error]; // Handle errors

    // set the session preset
    session.sessionPreset = AVCaptureSessionPresetHigh; // Or other preset supported by the input device
    [session addInput:input];
    AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:session];
    // Now you can add this layer to a view of your view controller
    [cameraView.layer addSublayer:previewLayer];
    previewLayer.frame = self.cameraView.bounds;
    [session startRunning];

    if ([session isRunning]) {
        NSLog(@"The session is running");
    }
    if ([session isInterrupted]) {
        NSLog(@"The session has been interrupted");
    }
} else {
    // Tell the user they don't have a front facing camera
}
Thank you in advance.
Not sure if this is the problem but there is an inconsistency between your code and the comments. The inconsistency is with the following line of code:
AVCaptureDevice *device = [possibleDevices objectAtIndex:1];
In the comment above it says: "...for simplicity just grab the first device". However, the code is grabbing the second device, NSArray is indexed from 0. I believe the comment should be corrected as I think you are assuming the front camera will be the second device in the array.
Working on the assumption that the first device is the back camera and the second device is the front camera is dangerous. It would be much safer and more future-proof to search possibleDevices for the device that actually is the front camera.
The following code will enumerate the list of possibleDevices and create the input using the front camera.
// Find the front camera and create an input and add it to the session
AVCaptureDeviceInput *input = nil;
for (AVCaptureDevice *device in possibleDevices) {
    if ([device position] == AVCaptureDevicePositionFront) {
        NSError *error = nil;
        input = [AVCaptureDeviceInput deviceInputWithDevice:device
                                                      error:&error]; // Handle errors
        break;
    }
}
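Note that input can still be nil after this loop if no front camera was found, so it is worth guarding before adding it to the session:

// skip if no front camera was found or the input can't be attached
if (input && [session canAddInput:input]) {
    [session addInput:input];
}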
Update: I have just cut and pasted the code exactly as it is in the question into a simple project and it is working fine for me. I am seeing the video from the front camera. You should probably look elsewhere for the issue. First, I would be inclined to check the cameraView and associated layers.

Play a paused AVAudioRecorder file

In my program I want the user to be able to:
record his voice,
pause the recording process,
listen to what he recorded
and then continue recording.
I have managed to get to the point where I can record and play recordings with AVAudioRecorder and AVAudioPlayer. But whenever I try to record, pause recording, and then play, the playing part fails with no error.
My guess is that the reason it's not playing is that the audio file hasn't been saved yet and is still in memory or something.
Is there a way I can play paused recordings? If there is, please tell me how.
I'm using Xcode 4.3.2.
If you want to play the recording, then yes you have to stop recording before you can load the file into the AVAudioPlayer instance.
If you want to be able to play back some of the recording, then add more to it after listening, or, say, record in the middle... then you're in for some trouble.
You have to create a new audio file and then combine them together.
This was my solution:
// Generate a composition of the two audio assets that will be combined into
// a single track
AVMutableComposition *composition = [AVMutableComposition composition];
AVMutableCompositionTrack *audioTrack = [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                  preferredTrackID:kCMPersistentTrackID_Invalid];

// grab the two audio assets as AVURLAssets according to the file paths
AVURLAsset *masterAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:self.masterFile] options:nil];
AVURLAsset *activeAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:self.newRecording] options:nil];

NSError *error = nil;

// grab the portion of interest from the master asset
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, masterAsset.duration)
                    ofTrack:[[masterAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                     atTime:kCMTimeZero
                      error:&error];
if (error)
{
    // report the error
    return;
}

// append the entirety of the active recording
[audioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, activeAsset.duration)
                    ofTrack:[[activeAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                     atTime:masterAsset.duration
                      error:&error];
if (error)
{
    // report the error
    return;
}

// now export the two files
// create the export session
// no need for a retain here, the session will be retained by the
// completion handler since it is referenced there
AVAssetExportSession *exportSession = [AVAssetExportSession
                                       exportSessionWithAsset:composition
                                       presetName:AVAssetExportPresetAppleM4A];
if (nil == exportSession)
{
    // report the error
    return;
}

NSString *combined = @"combined file path"; // create a new file for the combined file

// configure export session output with all our parameters
exportSession.outputURL = [NSURL fileURLWithPath:combined]; // output path
exportSession.outputFileType = AVFileTypeAppleM4A;          // output file type

[exportSession exportAsynchronouslyWithCompletionHandler:^{
    // export status changed, check to see if it's done, errored, waiting, etc
    switch (exportSession.status)
    {
        case AVAssetExportSessionStatusFailed:
            break;
        case AVAssetExportSessionStatusCompleted:
            break;
        case AVAssetExportSessionStatusWaiting:
            break;
        default:
            break;
    }
    NSError *error = nil;
    // your code for dealing with the now combined file
}];
I can't take full credit for this work, but it was pieced together from the input of a couple of others:
AVAudioRecorder / AVAudioPlayer - append recording to file
(I can't find the other link at the moment)
We had the same requirements for our app as the OP described, and ran into the same issue: the recording has to be stopped, instead of paused, if the user wants to listen to what she has recorded up to that point. Our app (project's Github repo) uses AVQueuePlayer for playback and a method similar to kermitology's answer to concatenate the partial recordings, with some notable differences:
implemented in Swift
concatenates multiple recordings into one
no messing with tracks
The rationale behind the last item is that simple recordings with AVAudioRecorder will have one track, and the main reason for this whole workaround is to concatenate those single tracks in the assets (see Addendum 3). So why not use AVMutableComposition's insertTimeRange method instead, which takes an AVAsset rather than an AVAssetTrack?
Relevant parts: (full code)
import UIKit
import AVFoundation

class RecordViewController: UIViewController {

    /* App allows volunteers to record newspaper articles for the
       blind and print-impaired, hence the name.
    */
    var articleChunks = [AVURLAsset]()

    func concatChunks() {
        let composition = AVMutableComposition()

        /* `CMTimeRange` to store total duration and know when to
           insert subsequent assets.
        */
        var insertAt = CMTimeRange(start: kCMTimeZero, end: kCMTimeZero)

        repeat {
            let asset = self.articleChunks.removeFirst()

            let assetTimeRange =
                CMTimeRange(start: kCMTimeZero, end: asset.duration)

            do {
                try composition.insertTimeRange(assetTimeRange,
                                                of: asset,
                                                at: insertAt.end)
            } catch {
                NSLog("Unable to compose asset track.")
            }

            let nextDuration = insertAt.duration + assetTimeRange.duration
            insertAt = CMTimeRange(start: kCMTimeZero, duration: nextDuration)
        } while self.articleChunks.count != 0

        let exportSession =
            AVAssetExportSession(
                asset: composition,
                presetName: AVAssetExportPresetAppleM4A)

        exportSession?.outputFileType = AVFileType.m4a
        exportSession?.outputURL = /* create URL for output */
        // exportSession?.metadata = ...

        exportSession?.exportAsynchronously {
            switch exportSession?.status {
            case .unknown?: break
            case .waiting?: break
            case .exporting?: break
            case .completed?: break
            case .failed?: break
            case .cancelled?: break
            case .none: break
            }
        }

        /* Clean up (delete partial recordings, etc.) */
    }
This diagram helped me get my head around what expects what and what inherits from where. (NSObject is implicitly implied as superclass where there is no inheritance arrow.)
Addendum 1: I had my reservations regarding the switch part instead of using KVO on AVAssetExportSessionStatus, but the docs are clear that exportAsynchronously's callback block "is invoked when writing is complete or in the event of writing failure".
Addendum 2: Just in case if someone has issues with AVQueuePlayer: 'An AVPlayerItem cannot be associated with more than one instance of AVPlayer'
Addendum 3: Unless you are recording in stereo, but mobile devices have one input as far as I know. Also, using fancy audio mixing would also require the use of AVCompositionTrack. A good SO thread: Proper AVAudioRecorder Settings for Recording Voice?
RecordAudioViewController.h
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <CoreAudio/CoreAudioTypes.h>
@interface record_audio_testViewController : UIViewController <AVAudioRecorderDelegate> {
    IBOutlet UIButton *btnStart;
    IBOutlet UIButton *btnPlay;
    IBOutlet UIActivityIndicatorView *actSpinner;
    BOOL toggle;

    // Variables setup for access in the class:
    NSURL *recordedTmpFile;
    AVAudioRecorder *recorder;
    NSError *error;
}

@property (nonatomic, retain) IBOutlet UIActivityIndicatorView *actSpinner;
@property (nonatomic, retain) IBOutlet UIButton *btnStart;
@property (nonatomic, retain) IBOutlet UIButton *btnPlay;

- (IBAction)start_button_pressed;
- (IBAction)play_button_pressed;
@end
RecordAudioViewController.m
@synthesize actSpinner, btnStart, btnPlay;

- (void)viewDidLoad {
    [super viewDidLoad];

    // Start the toggle in true mode.
    toggle = YES;
    btnPlay.hidden = YES;

    // Instantiate an instance of the AVAudioSession object.
    AVAudioSession *audioSession = [AVAudioSession sharedInstance];
    // Setup the audioSession for playback and record.
    // We could just use record and then switch it to playback later, but
    // since we are going to do both, let's set it up once.
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    // Activate the session
    [audioSession setActive:YES error:&error];
}

- (IBAction)start_button_pressed {
    if (toggle)
    {
        toggle = NO;
        [actSpinner startAnimating];
        [btnStart setTitle:@"Stop Recording" forState:UIControlStateNormal];
        btnPlay.enabled = toggle;
        btnPlay.hidden = !toggle;

        // Begin the recording session.
        // Error handling removed. Please add to your own code.

        // Setup the dictionary object with all the recording settings that this
        // recording session will use.
        // It's not clear to me which of these are required and which are the bare minimum.
        // This is a good resource: http://www.totodotnet.net/tag/avaudiorecorder/
        NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];
        [recordSetting setValue:[NSNumber numberWithInt:kAudioFormatAppleIMA4] forKey:AVFormatIDKey];
        [recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];
        [recordSetting setValue:[NSNumber numberWithInt:2] forKey:AVNumberOfChannelsKey];

        // Now that we have our settings, we are going to instantiate an instance of our recorder.
        // Generate a temp file for use by the recording.
        // This sample was one I found online and seems to be a good choice for making a tmp file that
        // will not overwrite an existing one.
        // I know this is a mess of collapsed things into 1 call. I can break it out if need be.
        recordedTmpFile = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"%.0f.%@", [NSDate timeIntervalSinceReferenceDate] * 1000.0, @"caf"]]];
        NSLog(@"Using File called: %@", recordedTmpFile);

        // Setup the recorder to use this file and record to it.
        recorder = [[AVAudioRecorder alloc] initWithURL:recordedTmpFile settings:recordSetting error:&error];
        // Use the recorder to start the recording.
        // I'm not sure why we set the delegate to self yet.
        // Found this in another example, but I'm fuzzy on this still.
        [recorder setDelegate:self];

        // We call this to start the recording process and initialize
        // the subsystems so that when we actually say "record" it starts right away.
        [recorder prepareToRecord];

        // Start the actual recording.
        [recorder record];

        // There is an optional method for doing the recording for a limited time, see
        // [recorder recordForDuration:(NSTimeInterval) 10]
    }
    else
    {
        toggle = YES;
        [actSpinner stopAnimating];
        [btnStart setTitle:@"Start Recording" forState:UIControlStateNormal];
        btnPlay.enabled = toggle;
        btnPlay.hidden = !toggle;

        NSLog(@"Using File called: %@", recordedTmpFile);
        // Stop the recorder.
        [recorder stop];
    }
}

- (void)didReceiveMemoryWarning {
    // Releases the view if it doesn't have a superview.
    [super didReceiveMemoryWarning];
    // Release any cached data, images, etc that aren't in use.
}

- (IBAction)play_button_pressed {
    // The play button was pressed...
    // Setup the AVAudioPlayer to play the file that we just recorded.
    AVAudioPlayer *avPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:recordedTmpFile error:&error];
    [avPlayer prepareToPlay];
    [avPlayer play];
}

- (void)viewDidUnload {
    // Release any retained subviews of the main view.
    // e.g. self.myOutlet = nil;

    // Clean up the temp file.
    NSFileManager *fm = [NSFileManager defaultManager];
    [fm removeItemAtPath:[recordedTmpFile path] error:&error];
    // Release the remaining objects (never call dealloc directly; was `[recorder dealloc];`).
    [recorder release];
    recorder = nil;
    recordedTmpFile = nil;
}

- (void)dealloc {
    [super dealloc];
}
@end
RecordAudioViewController.xib
Add two buttons: one to begin recording and another to play the recording.

Using MKReverseGeocoder in a singleton class (ARC)

I'm in the process of trying to create a singleton class for weather functions (all encompassed) so that I can change/update/recall weather data throughout my entire app in a single allocated object.
I have it working except for one little weird bug. I'm using MKReverseGeocoder on devices running iOS < 5. CLGeocoder works fine on iOS 5+. Basically, this is what happens in the app:
In the app delegate, at launch, I create the shared instance of my weather singleton. This instance doesn't do anything unless there is a UIView that expects to report weather results. When a view that reports weather is loaded, if the location isn't already statically set in the preferences, the app attempts to pull your current location.
This works, and I can log the coordinates of the device's current location. After that is done, I immediately go into trying to reverse-geocode those coordinates. The method MKReverseGeocoder start does get executed, and I can log that the instance's isQuerying property is true, so I know it's attempting to geocode the coordinates. (Yes, I set the delegate to my shared instance, and the instance conforms to MKReverseGeocoderDelegate.)
Now this is the weird part. If I am launching the app for the first time, and I add a weather UIView to the screen for the first time, MKReverseGeocoder starts but never calls the delegate methods. If I then close the app and open it again (second time), the current location gets looked up, MKReverseGeocoder does call the delegate methods, and everything works. It just doesn't want to work on the first launch, no matter how many times I call it (I have a button that can initiate the lookup).
It's totally baffling. The iOS 5 CLGeocoder works fine on the initial launch and every subsequent launch. MKReverseGeocoder does not work on the initial launch (because it doesn't call the delegate methods), but does on subsequent launches.
Below is the relevant code:
-(void)reverseGeocoder:(MKReverseGeocoder *)geocoder didFailWithError:(NSError *)error {
    NSLog(@"error getting location:%@", error);
    self.reverseGeocoder = nil;
    [[NSNotificationCenter defaultCenter] postNotificationName:@"errorGettingLocation" object:nil userInfo:[NSDictionary dictionaryWithObject:error forKey:@"error"]];
}

- (void)reverseGeocoder:(MKReverseGeocoder *)geocoder didFindPlacemark:(MKPlacemark *)pm
{
    // update placemark and get weather from correct placemark
    placemark = pm;
    NSLog(@"singleton placemark: %@", placemark);
    [self getLocationFromPlacemark];
    self.reverseGeocoder = nil;
}

- (void)getReverseGeoCode:(CLLocation *)newLocation
{
    NSLog(@"reverseGeocode starting for location: %@", newLocation);
    NSString *ver = [[UIDevice currentDevice] systemVersion];
    float ver_float = [ver floatValue];
    if (ver_float < 5.0) {
        //reverseGeocoder = nil;
        self.reverseGeocoder = [[MKReverseGeocoder alloc] initWithCoordinate:newLocation.coordinate];
        self.reverseGeocoder.delegate = self;
        [self.reverseGeocoder start];
        if (self.reverseGeocoder.isQuerying) {
            NSLog(@"self.reverseGeocoder querying");
            NSLog(@"self.reverseGeocoder delegate %@", self.reverseGeocoder.delegate);
            NSLog(@"self %@", self);
        } else {
            NSLog(@"geocoder not querying");
        }
    }
    else {
        [reverseGeocoder5 reverseGeocodeLocation:newLocation completionHandler:^(NSArray *placemarks, NSError *error) {
            if ([placemarks count] > 0) {
                placemark = [placemarks objectAtIndex:0];
                [self getLocationFromPlacemark];
            }
            else {
                if (error) {
                    NSLog(@"error reverseGeocode: %@", [error localizedDescription]);
                }
            }
        }];
    }
}
Also, I am declaring reverseGeocoder as a (nonatomic, strong) property and synthesizing it. Remember, this works fine after a clean launch the second time (when there is a weather UIView already loaded). The calls to NSLog aren't even getting hit (which is why I'm assuming the delegate methods aren't getting called).
Any input would be GREATLY appreciated!
Thanks!
I ended up figuring this out... well, kind of.
Instead of using MKReverseGeocoder, I'm just doing a static lookup against Google Maps, as in How to deal with MKReverseGeocoder / PBHTTPStatusCode=503 errors in iOS 4.3?
Works perfectly.
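For illustration, a sketch of what such a static lookup can look like. The endpoint and JSON shape follow the Google Geocoding web service of that era and may have changed, so treat the URL and keys as assumptions rather than the exact code from the linked answer:

- (void)reverseGeocodeViaHTTP:(CLLocation *)loc {
    NSString *urlString = [NSString stringWithFormat:
        @"https://maps.googleapis.com/maps/api/geocode/json?latlng=%f,%f&sensor=true",
        loc.coordinate.latitude, loc.coordinate.longitude];
    NSURL *url = [NSURL URLWithString:urlString];
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        NSData *data = [NSData dataWithContentsOfURL:url];
        if (!data) { return; }
        NSDictionary *json = [NSJSONSerialization JSONObjectWithData:data options:0 error:NULL];
        // The first element of "results" holds the best match, per the era's API docs.
        NSLog(@"first result: %@", [[json objectForKey:@"results"] firstObject]);
    });
}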
MKReverseGeocoder doesn't exist anymore... in iOS 5
You have to import CoreLocation and use CLGeocoder. Below is some example code:
CLGeocoder *geoCoder = [[CLGeocoder alloc] init];
CLLocation *location = [[CLLocation alloc] initWithLatitude:37.33188 longitude:-122.029497];
[geoCoder reverseGeocodeLocation:location completionHandler:^(NSArray *placemarks, NSError *error) {
    if (error) {
        NSLog(@"fail %@", error);
    } else {
        // Index into placemarks only after ruling out an error.
        CLPlacemark *place = [placemarks objectAtIndex:0];
        NSLog(@"return %@", place.addressDictionary);
    }
}];